In October 1961, I drove up the entrance drive of the sprawling Bell Labs complex in Murray Hill, NJ, to begin my career as an electrical engineer. I can still remember how frightened I was, and at the same time how thrilled. It was an awesome place I was entering, the Mecca of my world, peopled by the legends of my imagination—Shannon, Pierce, Shockley, Bode, Rice, Tukey, and Darlington among many others.
I thought this famed institution would last forever. Of all the revolutionary events that have happened in the many years since that first day at Bell Labs—events that I shall speak of presently—the two I could least have imagined are that this building now bears a different company's name and that the Bell System no longer exists.
In 1962, shortly after I began work at Bell Labs, the PROCEEDINGS OF THE IEEE featured a series of future predictions authored by IEEE Fellows. I do not remember if I had read the predictions when they had been first published, but 25 years later I reviewed those predictions in one of the regular columns I was writing for the IEEE SPECTRUM magazine. Those thoughtful predictions from our most famous engineers had by then become the perfect subjects for humor and derision. But in my opinion the most serious failing was the authors' failure to recognize and appreciate the revolution in electronics that would come about because of the integrated circuit.
My favorite prediction, written as if an engineer in the year 2012 were looking back at the field in 1962, was as follows:
“After a competitive race in the 1960s to produce the smallest units, reason had prevailed. While components were small by earlier standards, the ultimate sizes were such that costs were reasonable and servicing practicable. For example, whole receivers were the size of pound candy boxes, rather than cigarette packs.”
Jack Kilby had made the first integrated circuit in 1958, so it existed when the predictions were published in 1962, but it was not until 1965 that Gordon Moore made his now famous prophecy about the exponential progress that would occur in the density of integrated circuitry. Moore's prediction was the one that withstood the test of time, and the one that drove our profession into relentless acceleration.
It was amazing to me that our most famous engineers in 1962 had had such an impoverished idea of future happenings. But in truth I would have done no better, and probably even worse. Of course, I was a callow, inexperienced engineer then. But now I have no such excuse, and even so there is a high degree of probability that any predictions I would make now would be subject to equal derision by engineers in the future.
For some years now I have been intrigued by the idea of exponential progress. Moore's law has been a quantitative measure of it in semiconductor fabrication, but perhaps it extends to all of technology, and the whole concept of exponentials is somehow incompatible with the way we reason about the future. Exponentials imply breaking points and future shocks—a murky, unknown future driven by forces seemingly out of our control. So we revert to our usual behavior of linearly extrapolating the present, and this works for a while—and then it does not.
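The mismatch between exponential growth and linear intuition is easy to make concrete. The small sketch below (my own illustration, not anything from 1962; the doubling period and observation window are assumptions chosen for clarity) extrapolates linearly from a few early years of a Moore's-law trend and compares the guess with the actual exponential:

```python
# Toy illustration: why linear extrapolation of an exponential trend fails.
# Assumes a fixed two-year doubling period; all figures are illustrative.
def moore(years, doubling_period=2.0):
    """Growth factor after `years` under a fixed doubling period."""
    return 2.0 ** (years / doubling_period)

def linear_guess(years, observed_window=4.0):
    """Extrapolate linearly from the growth seen in the first few years."""
    slope = (moore(observed_window) - 1.0) / observed_window
    return 1.0 + slope * years

for y in (4, 10, 25, 50):
    print(f"{y:2d} yr: exponential x{moore(y):,.0f}   linear guess x{linear_guess(y):,.1f}")
```

Over four years the two curves agree, which is precisely what makes the linear habit feel safe. Over fifty years the exponential reaches a factor of more than 33 million while the linear guess is still under 40.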
When I look back on my frame of reference in 1962, I realize that I had no concept of the inevitability of constant change, let alone exponential change. It was as if the world were static, frozen in time at my graduation. I remember how I had saved all of my college texts in the basement, sure that I would be referring to them for the rest of my career. Alas, I do not think I ever looked at them even once afterwards, and I do not even know what happened to those boxes of books rotting in the basement of my parents' house.
Why was I unprepared for change? Was it my fault or the fault of the education system? There was a kind of finality associated in my mind about what I learned in college. I felt a timelessness and constancy about mathematics, and even in the technology courses I had the idea that if I learned the material being taught, I would be able to apply it for the rest of my career. No one ever told me—or if they did I did not listen—that all of what I was learning would soon become obsolete and it would be necessary to, in effect, be a college student the rest of my career. But who can forget the feeling of graduation, throwing back the tassel on the funny hat and feeling the exhilarating freedom of being finished with formal education?
In 1962 I was attracted and influenced by two almost opposing poles—Heathkit and Shannon. Heathkit both symbolized and embodied the physical world of electronics. I can still remember the thrill, and even the smell, of opening a new box with a kit from that company in Benton Harbor, MI. I have since on occasion looked around my present accumulation of possessions and asked myself which I have had the longest? Perhaps it is the Heathkit oscilloscope that I built in high school about 60 years ago. I still have it, and I think it still works. Alas, there is no longer any use for it.
Building a Heathkit was mindless and you did not need to be an engineer to do it. It was like paint-by-the-numbers. Nonetheless, there was a feeling of pride and accomplishment when you turned on the final product and it worked. I thought the world of electronics would always be like that Heathkit—different assemblages of tubes, transistors, capacitors, and resistors. There were a myriad of ways these fundamental components could be interconnected, so the future was rich in one sense, and yet impoverished in another sense that I had yet to realize.
Some years later, when I wrote a column in the IEEE SPECTRUM magazine about the death of Heathkit, I got more mail about that essay than about any I had written before or would write afterwards. For older engineers it was as if their childhood had been stolen from them. The smell of solder rosin was gone forever. The physical world had receded and had been replaced by an intangible world of software and of impenetrable and unfathomable chips. The intricate, paint-by-the-numbers wiring of a Heathkit had become one big featureless, colorless blob.
If I had imagined the future then, and I doubt that I really did, it would have been a world increasingly filled by a stream of incrementally improving Heathkit-like appliances—better televisions, high-fidelity amplifiers, and so forth. And perhaps in some sense this is what happened after all. Fifty years later television sets still have the same function. They got bigger, as we anticipated, and like the popular science magazines envisioned, we got flat panel displays that you could hang on the wall. But they are still recognizable as TVs. The revolutionary changes lay elsewhere.
Shannon's theory of information was the other pole influential in my life at that time. While Heathkit symbolized the physical world of electronics, Shannon symbolized the virtual, theoretical world of mathematical analysis. His paper on information theory published in the (now defunct) Bell System Technical Journal in 1948 was to me an object of beauty comparable to a great painting or symphony. Like Moore's law, it too has withstood the test of time, and like great works of art and music, its beauty still shines forth from those yellowed pages of a journal filled otherwise with forgotten trivia.
Shannon's paper exemplified the idea that it was possible to sit in an office and dream on paper with mathematics about things that would change the world. It did not need to be changed with a soldering iron in a laboratory; it could be done with imagination, pencil, and paper. It inspired me, but at the same time intimidated me. This level of work was denied to us mortals, and as subsequent years went by, I saw that there were few jobs in our profession that were supported by pure mathematical thought.
In college, those two realms—the physical and the virtual—were interleaved. In one course, I would learn about rotating machinery, and in another, differential equations. Perhaps the engineering profession at that time was ecumenical in this regard, but as the decades went by we were cleaved into distinct disciplines. We had engineers who were primarily physicists, who worried about such things as semiconductors and photonics, and we had systems engineers whose primary tools were mathematical analysis and computer simulation. Another category grew from almost nonexistence in 1962 to a dominant category today—that of software and computer science. While there is some passage through these boundaries, almost akin to quantum tunneling, they have become ever more distinct in education and practice.
There were two big projects in Bell Labs in 1962—the Picturephone and the millimeter waveguide. It seems incredible to me now in retrospect that we were all so certain of this vision of the future. Of course, the network and its future at that time were controlled by the Bell System, so AT&T and Bell Labs had the liberty to plan the long-term evolution—and this was it. But if the AT&T planners' vision was flawed, they had a lot of company. Just about every futurist and science fiction writer believed that after voice communication, the next frontier was video. After all, what else was there?
The Bell System demonstrated the Picturephone with considerable fanfare at the 1964 World's Fair in New York, and Arthur C. Clarke, having been enchanted by the Picturephone on a visit to Bell Labs, featured it in the famous movie 2001: A Space Odyssey. But by 1982 the Picturephone on my desk at Bell Labs lay unused; there was no one left to call.
The 6-MHz analog signal of the Picturephone demanded a broadband transmission channel. The plan was to provide that bandwidth with hollow pipes about 5 cm in diameter, with millimeter-wavelength signals propagating in a circular mode whose electric field goes to zero at the inside wall of the pipe, minimizing transmission loss.
A lot of excellent engineering went into the project, but it was never deployed. The market failure of Picturephone removed any urgency, while the dramatic progress in optical fiber transparency rendered the millimeter waveguide obsolete before it was even fully born.
In retrospect, these projects are seen as missteps—the Picturephone because of the difficulty in predicting market acceptance, and the waveguide because of technological upheaval. However, when I consider the technological landscape in 1962 it looks now like a time of great fertility. We did not know it then, but we were on the cusp of so much dramatic potential. Consider: the integrated circuit was just beginning its evolution, the laser had recently been invented, the digital computer was just coming into being, and the first modems were just being marketed. Moreover, in fields like communication we were far from the known limits of optimization. So much was just sitting there waiting for us.
Sometime in the mid-1970s I was visiting my parents' home, and I remarked on the ancient Bakelite, rotary-dial telephone that had occupied a nook in the hallway since I was a child.
“Why don't you get a new phone, Dad?” I asked. “They're now relatively inexpensive.”
“What for?” he replied. “They do the same thing as this one does.”
That shut me up, and I knew better than to pursue the conversation. I lamented the fact that he was mostly right—a phone was still a phone. But in 2011, as I write this essay, my wireless smartphone lies sleeping in front of my laptop computer. I think my father would have agreed that it no longer resembles his old telephone. To me, this smartphone exemplifies all of the unexpected and dramatic progress that has occurred over these recent decades.
My father kept his old phone for more than 60 years. Possibly it still exists somewhere even now. But the cell phone represents an entirely different understanding of technology and the pace of change. In the United States alone, about a half million cell phones are thrown away every day. It has become the most ubiquitous device in the history of the planet, and in a number of countries there are more cell phones than people.
If I were to open up this phone (which I am not about to do!), I would first observe how little is really inside the small outer case. The incredible function it empowers is seemingly built upon almost nothing. All of the processing is handled by a few integrated circuit chips containing millions of transistors. Much of the space is taken up by a lithium battery. There is a brilliant LCD display so detailed that I cannot make out individual pixels, and an internal camera whose CMOS sensor array has five million distinct pixels. None of this was known, or even considered remotely feasible, in 1962.
But this engine under the hood is only the beginning of the story. There is a vast virtual and physical infrastructure beyond in support of this little phone. In the phone itself there are very sophisticated processing algorithms to implement orthogonal frequency division multiplexing (OFDM). These adaptive algorithms are the culmination of decades of research into communication theory, and they approach closely the theoretical limits that we now know for the wireless channels they inhabit.
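The core idea behind OFDM can be sketched in a few lines. In this toy example (my own, not any phone's actual implementation; the subcarrier count and QPSK symbols are arbitrary), data symbols are placed on orthogonal subcarriers with an inverse DFT at the transmitter and recovered with a forward DFT at the receiver:

```python
# Minimal OFDM sketch: orthogonal subcarriers via inverse/forward DFT.
# Real systems add cyclic prefixes, equalization, and coding on top.
import cmath

def idft(X):
    """Inverse DFT: map per-subcarrier symbols to a time-domain block."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def dft(x):
    """Forward DFT: recover the per-subcarrier symbols at the receiver."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

# Eight QPSK symbols, one per subcarrier.
qpsk = [1+1j, 1-1j, -1+1j, -1-1j, 1+1j, -1-1j, 1-1j, -1+1j]

time_signal = idft(qpsk)      # transmitted OFDM block
recovered = dft(time_signal)  # receiver undoes the transform

# Because the subcarriers are orthogonal, every symbol comes back intact.
assert all(abs(a - b) < 1e-9 for a, b in zip(qpsk, recovered))
```

The elegance is that orthogonality lets many narrowband signals share a channel without interfering, and the transform pair is cheap to compute with the FFT, which is what made the scheme practical on a phone's modest hardware.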
The phone has speech recognition capability, again only made feasible by voluminous, cheap processing and much research on speech recognition. It does not even have to be trained, and it seems to do a good job.
A small chip in the phone implements GPS, and in the skies above more than two dozen satellites circle Earth, streaming down time and position. The clock in the little cell phone is corrected to extreme accuracy by rather complicated algorithms. Location and altitude are determined through a number of sophisticated calculations, even correcting for relativistic effects, both special and general.
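The size of those relativistic corrections can be checked with back-of-envelope arithmetic (my own sketch, using standard physical constants and the nominal GPS orbit radius):

```python
# Back-of-envelope relativistic clock drift for a GPS satellite.
import math

C = 2.998e8          # speed of light, m/s
GM = 3.986e14        # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6.371e6    # mean Earth radius, m
R_ORBIT = 2.656e7    # nominal GPS orbit radius, m
DAY = 86400.0        # seconds per day

v = math.sqrt(GM / R_ORBIT)                         # orbital speed, ~3.9 km/s

# Special relativity: the moving satellite clock runs slow.
sr_us = -(v**2 / (2 * C**2)) * DAY * 1e6            # about -7 microseconds/day

# General relativity: weaker gravity in orbit makes the clock run fast.
gr_us = (GM / C**2) * (1/R_EARTH - 1/R_ORBIT) * DAY * 1e6   # about +46 us/day

net_us = sr_us + gr_us                              # about +38 microseconds/day
print(f"net drift = {net_us:.0f} microseconds per day")
```

A net drift near 38 microseconds per day sounds negligible until you recall that light travels about 300 meters per microsecond: uncorrected, the position error would grow by kilometers each day.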
Voice conversations are digitized and organized into packets for wireless transmission to any of the ubiquitous base station receivers that now blanket Earth. The packets are constructed and transmitted according to the protocols of an entirely new kind of digital network—the Internet. From the base stations the bits flow over optical fibers carrying terabits per second of capacity.
Many of the packets are encrypted using public key cryptography dependent upon the intrinsic difficulty of factoring large numbers, while pictures, music, and videos are compressed using clever algorithms conceived by information theorists in these last few decades.
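The factoring-based idea can be shown with a toy RSA example (numbers absurdly small by real standards, chosen only for illustration): anyone may encrypt with the public modulus and exponent, but decrypting requires knowing the secret factors.

```python
# Toy RSA: security rests on the difficulty of factoring n = p * q.
p, q = 61, 53                 # secret primes (real systems use hundreds of digits)
n = p * q                     # public modulus, 3233
phi = (p - 1) * (q - 1)       # 3120; computable only if you can factor n
e = 17                        # public exponent, coprime to phi
d = pow(e, -1, phi)           # private exponent: modular inverse (Python 3.8+)

message = 1962                # must be smaller than n
ciphertext = pow(message, e, n)        # anyone can do this with (n, e)
decrypted = pow(ciphertext, d, n)      # only the holder of d can undo it
assert decrypted == message
```

With 61 and 53 the modulus is factored by inspection; with primes of several hundred digits, no known classical algorithm factors it in any feasible time, and that asymmetry is the entire trick.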
Using the programmable keypad on the phone's display, the world's information can be searched and accessed. Friends can be found and connected through social networks. Products can be bought and sold. And there are so many other things that we have become accustomed to expecting from that little phone that the magic has somehow become ordinary.
And, by the way, you can also make voice calls using a keypad for dialing. Perhaps my father would say that it is still a phone after all.
Just thinking and writing about all the technology exemplified by this little phone fills me with pride for our profession. Most of these achievements were created by people that I have known through my career these last 50 years. I take a vicarious pride in our joint accomplishments. I never expected it, but we engineers did indeed change the world.
Earlier I observed how technology in 1962 was fertile with potential. Could we say the same thing about technology today? Frankly, I do not sense that same potential, but perhaps such fertility is only evident in retrospect. Moreover, in some fields—perhaps in conventional electronic circuitry and communications—we are approaching known limits of technology. The recent decades have been characterized by the explosion of information technology, but perhaps diminishing returns have set in and the torch of progress will pass to another field. Conventional wisdom today believes that new field will be biology.
Nonetheless, I do believe in exponential progress as a rule driven by economics and, possibly, simply by the increasing number of engineers and scientists in the world. In many cases, the progress is one of functionality itself, regardless of how that functionality is achieved, rather than necessarily being achieved through increased transistor density according to Moore's law. For example, today much of the perceived functionality of the smartphone has been through social and business invention. Companies like Google, Facebook, eBay, and many others would be in that category, in spite of the fact that they are built upon the underlying technology platform. Even the Internet itself, the browser, and the Web seem more due to social invention than to the technology on which they reside.
In order to ride the curve of exponentially increasing functionality as we “use up” the capabilities of current technology, we will have to open up new fields of exploration, just as optics opened up new capabilities for the engineers of 1961. I really have no idea what these new fields will be, but to give the engineers of the future something to laugh about, I will suggest two possibilities. The first is based mostly on wishful thinking, and it is significant breakthroughs in batteries and energy storage—both in the small for portable devices and in the large for the electrical grid. The second possibility is based on a hope for something magical, and it is that we find some fantastic way to use quantum physics in processing and communication. I do not particularly mean quantum computing, which is now going through the traditional hype cycle, but something else. Exactly what this is, I could not guess, but whatever it is, it will be seen as magic.
Following some variety of the second law of thermodynamics, the world gets continually more complicated as time passes. A young engineer starting his or her career today faces a world vastly more complex than I did in 1961. At first look that would seem to make great achievements much more difficult to attain, but perhaps overwhelming complexity also portends rich unrealized potential. In any event, I feel both sorry for and jealous of those young engineers. We took the low-hanging fruit. I have no idea what is growing further up the tree.