Last issue took us historically from before the invention of the triode up through the introduction of the integrated circuit. What's left of the story of audio amplification, then? We saw that Nyquist and Bode brought a degree of mathematical certainty to modeling during the 1930s and '40s. Modeling got more sophisticated by the 1970s, culminating with the introduction of SPICE (Simulation Program with Integrated Circuit Emphasis). A UC Berkeley program developed for internal use at the university, it came into wider use on Tymshare by 1973, when I started using it on a TI Silent 700 dial-up terminal. With a 300-baud acoustic coupler modem, you would place the handset of your telephone into soft rubber cups and slowly, very slowly, communicate with a mainframe running the FORTRAN SPICE program.
SPICE takes in circuit diagrams by way of a node list and device models and crunches numbers to produce DC operating conditions, AC response, and even distortion and noise predictions. For the first time, those pesky equations determining a circuit's response succumbed to computer analysis. I remember spending a whole weekend writing the equations to simulate the record amplifier frequency response of the Advent 201 cassette recorder, which was pretty hairy, then going in on Monday and just fiddling the parts around to get the response flat (we had changed the head for a new model, which had a different set of losses, thus changing the record EQ requirement). My fiddling of the part values was informed by all that hand equation crunching, but when computer modeling became available, life got easier.
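For a sense of what an AC analysis actually involves, here is a minimal sketch in Python of the sort of sweep SPICE automates, done by hand for a single RC low-pass section. The component values are purely illustrative; they have nothing to do with the Advent 201's actual EQ network.

import cmath, math

R = 10_000.0   # ohms
C = 15e-9      # farads; pole near 1 kHz

for f in (20, 100, 1_000, 10_000, 20_000):   # hertz
    zc = 1.0 / (1j * 2 * math.pi * f * C)    # capacitor impedance at f
    h = zc / (R + zc)                        # divider transfer function
    print(f"{f:6d} Hz: {20 * math.log10(abs(h)):7.2f} dB, "
          f"{math.degrees(cmath.phase(h)):7.1f} deg")

SPICE does that same bookkeeping for every node in the circuit, with real device models standing in for the ideal parts, which is exactly the drudgery I had been doing by hand.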
You still had to know what you were doing, since SPICE takes in a model and tells you what it does; it doesn't generate topologies or values. But you could design things more quickly (usually!) on the computer than you could with a soldering iron. At Lucasfilm I got a 9-track IBM-format tape of SPICE from a grad student at Berkeley in the early '80s, and today, I'm told, free copies are available on the Web, part of the great software diaspora of the last few years.
The transition from tubes to transistors was not an easy one, as described last time. With transistors basically substituted into tube circuits, things didn't always work right. Once the problems were solved, basically by using more transistors, preamps came out that fixed, for instance, the problem of input impedance changing with frequency, first from Advent and then from my company, Apt. They got excellent reviews and sounded as good as tube preamps in blind A/Bs. So I addressed the New York Audio Society at the New School, proud of my invention, and the opening question was, "Well, what about tubes?" I went carefully through the entire design process: how I had discovered the input impedance interaction (among other differences), and then made a solid-state unit emulate the good properties of tube ones, with the greater long-term stability of solid state. And then came the closing questions. The same questioner as before asked, "Well, what about tubes then, eh?" I thought the whole talk had answered his question, but some people are never happy. I presume he's still using tubes.
Then came TIM, transient intermodulation distortion, or, as I like to call it, transcendental intermodulation distortion. We'll see why in a minute. Remember from last time that there was a warning in the literature as early as 1951 that an amplifier can slew-rate limit and have problems because of the relative speed of its stages. With a fast first stage and a sluggish second one (the dominant pole usually being there, controlling the open-loop response of the amplifier), the first stage could slope-overload under transient conditions before feedback could become effective, and the solution proposed at the time was apportioning the relative open-loop gain and speed of the stages. The problem was rediscovered, and found its way into wider print by the early 1970s. A few engineers led by Matti Otala proposed reducing open-loop gain, using more open-loop bandwidth, and thus using less negative feedback as a solution to this problem. Feedback became a bad thing. This new, mysterious TIM was handed the marketing role of explaining why transistor circuits had not sounded as good as tube ones at first, and the fixed transistor amps were thus proclaimed good.
It started in the American high end and within several years swept the marketplace; once it was reduced to a numbers race, Sansui won in the end with, as I recall, a 3000 V/µs power amplifier. Well, the whole thing was just silly, out of proportion to the problem, which is what caused me to call it a transcendental problem rather than a transient one. The problem was potentially real, but it applied strongly only to the very worst circuits, and the solutions proposed to tame it, while they may have worked, traded away the advantages of negative feedback (low static distortion, among others) for reduced transient distortion, thus throwing out the baby with the bath water.
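To put that figure in perspective, the slew rate a signal actually demands is just 2πf times its peak voltage. Here is a back-of-the-envelope check in Python; the 40 V peak is my own convenient round number, corresponding to a nominal 100 watts into 8 ohms:

import math

f = 20_000.0     # hertz, top of the audio band
Vpeak = 40.0     # volts peak, about 100 W into 8 ohms (illustrative)

sr = 2 * math.pi * f * Vpeak        # volts per second
print(f"required slew rate: {sr / 1e6:.1f} V/us")

That comes to about 5 V/µs. Even with a generous margin for headroom and ultrasonic content, the numbers race had left the actual requirement hundreds of times behind.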
Many authors subsequently found this to be so, including W.G. Jung, M.L. Stephens, and C.C. Todd, "Slewing Induced Distortion in Audio Amplifiers," a four-part series in The Audio Amateur starting Feb. 1977; R.R. Cordell, "Another View of TIM," Audio, Feb. 1980; and P. Garde, "Transient Distortion in Feedback Amplifiers," J. Audio Eng. Soc., vol. 26, pp. 314-322, May 1978.
Today we still see a residue of that time, with low-feedback designs being thought good in some circles, long after the problem has been studied in detail and solved. But there are still those using tubes.
Another factor came along in the early 1970s with solid-state power amplifiers: safe area protection. These are circuits that watch the output stage's voltage, current, and time conditions, and attempt to shut the amplifier down if its capabilities are exceeded. Today this is probably one of the most important differences among amplifiers rated at a nominal 100 watts into 8 ohms, for instance: what they will actually deliver to a reactive load varies dramatically with the map of voltage/current limitations that safe area protection imposes. Strangely enough, the single data-sheet factor that correlates best with this capability, among conventional amplifiers at least, is their weight! That shows how much power the amplifier is capable of dissipating internally when faced with a reactive load. After all, with a fully reactive load like a pure inductor or capacitor, all of the power is dissipated in the amplifier and none in the load. While real-world loudspeakers aren't all that bad, they are certainly more difficult to drive than the resistors that amplifiers are usually designed for. Real work has been done in more recent years on shaping the open-loop gain function to provide more feedback than simple dominant single-pole designs do. Ed Cherry's nested differentiators come to mind as a valuable improvement, although there are many other schemes in the literature. They are all working with the same parameters that Nyquist and Bode would well understand.
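To make the reactive-load point concrete, here is a quick numerical sketch in Python of an idealized class-B output stage at full swing. The ±40 V rails and 8-ohm load magnitude are illustrative figures only, chosen to match a nominal 100-watt amplifier:

import numpy as np

Vs = 40.0       # supply rail, volts (illustrative)
Vp = 40.0       # peak output voltage, full swing
Ip = Vp / 8.0   # peak load current into an 8-ohm impedance magnitude

t = np.linspace(0.0, 1.0, 100_000, endpoint=False)   # one signal cycle
v = Vp * np.sin(2 * np.pi * t)                        # output voltage

for phi in (0.0, 90.0):   # resistive load, then purely reactive
    i = Ip * np.sin(2 * np.pi * t - np.radians(phi))  # load current
    p_load = np.mean(v * i)                           # average power in the load
    # the conducting device drops (rail minus output) while carrying the load current
    p_dev = np.mean(np.where(i > 0, (Vs - v) * i, (v + Vs) * (-i)))
    print(f"phase {phi:4.0f} deg: load {p_load:6.1f} W, output stage {p_dev:6.1f} W")

With the resistive load, the output stage dissipates only a modest fraction of what it delivers; at 90 degrees the load averages zero watts and the output devices absorb everything the supply provides, which is why heatsink mass, and hence weight, tracks real-world drive capability.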
Finally, remember that in the first part I brought up the 1937 invention of linear Pulse Code Modulation, now the dominant coding method for digital audio. While this seemed tangential to a history of audio amplification, my thesis here is that it is not tangential at all, because today we see systems where the only analog in sight is the very input stage and the very output stage. With digital microphones being introduced, and directly driven Class D switching power amplifiers too, it won't be long before the entire chain is digital, and the history of analog amplification becomes just that: history.
Still, there are lots of problems to deal with in digital audio; see this month's feature article!