The solid-state age at 50: what's next?

Dec. 1, 1997

by John Rhea

WASHINGTON - Try to imagine a world without solid-state electronics. It's impossible. Electronics has become so pervasive that it is almost like the air we breathe. Our silicon-based technology has made it so.

This month, as we mark the 50th anniversary of the birth of the solid-state age, it is useful to reflect on how an invention that was considered so insignificant at the time has turned our world upside down - and how the world might be turned upside down again.

The transistor is one of the few inventions that can be traced to a specific date: Dec. 23, 1947. We also know the names of the inventors: William Shockley, John Bardeen, and Walter Brattain, all working at the Bell Telephone Laboratories in Murray Hill, N.J.

On that Tuesday morning, Bardeen and Brattain demonstrated for their supervisor, Shockley, the tiny germanium device hooked up to a kluge of copper wires that first performed the switching function previously confined to fragile, bulky vacuum tubes. The three were all subsequently lionized with the Nobel Prize and other honors.

The transistor (so called because it transfers an electrical signal across a resistor) is a genuine example of the overused cliché "paradigm shift." Everything that has proceeded from it - integrated circuits, semiconductor memories, microprocessors, solid-state displays, and the rest - is an extrapolation of that basic invention.

The sub-micron barrier of semiconductor feature sizes has long since been breached, thanks in part to the U.S. Department of Defense's Very High Speed Integrated Circuit (VHSIC) program. 256-megabit dynamic random access memories (DRAMs) are due on the market before the end of the century, and NEC officials have begun building a pilot line for 1-gigabit DRAMs with 0.18-micron geometry on 12-inch wafers.

Slow beginnings

It is hard to understand the nonchalance with which the accomplishment of Shockley, Bardeen, and Brattain was treated. AT&T's lawyers tidied up the legal details before company leaders got around to announcing their accomplishment at a press conference on June 30, 1948, and the New York Times buried the news in its weekly column on radio programs the following day. Not a particularly auspicious beginning.

Twenty-five years ago, as the San Francisco bureau chief for Electronic News, I met Shockley for the only time. I visited him and his wife at their home in Palo Alto, Calif., to interview him and try to find out why everybody involved had been so calm about such a momentous event. He had obviously been asked that question many times, and his answer was always the same. It was his job, like any other job. He said he didn't discuss the event in his "riding group" (car pool) on the way home or even mention it to his wife. She agreed. So did Bardeen and Brattain when I talked to them on the phone. I've wondered ever since how we'll all react the next time something like this happens.

Within a decade after its invention, the single-function transistor was followed by the invention of the multi-function integrated circuit (officially attributed to Jack Kilby at Texas Instruments in Dallas, whose work overlapped that of Intel co-founder Bob Noyce, then at Fairchild Semiconductor in Mountain View, Calif.), which laid the groundwork for today's age of ubiquitous electronics.

The technology then proceeded to large-scale integration (LSI, sometimes referred to as "large-scale insanity," which involved at least 100 gates per chip) in the late 1960s and then on to VHSIC and beyond. Today you have more digital processing power in your home, in fact in your car, than existed in the entire world in the pre-transistor age.

There is a well-known industry folk tale about how IBM was considering entering the computer business in the late 1940s and challenging the industry leader, Univac. According to the legend, IBM commissioned a market research study which concluded that the worldwide market for computers would be just 50 machines.

It's easy to scoff at such naiveté now, but put yourself in the position of the IBM market researchers. The bulky vacuum-tube machines could only crunch numbers, and they were so unreliable that technicians had to stand by constantly to replace burned-out tubes. The Army used them for calculating ballistic trajectories, and the cost was exorbitant.

Of course there was a market for no more than 50 machines like that. But if you knew that solid-state electronics was waiting in the wings, you'd have come up with an entirely different answer. I don't think there are many office buildings today that have fewer than 50 computers.

The dynamic commercial electronics markets have now so far outstripped the declining military and aerospace applications that custom integrated circuits have become an anachronism. Standard products, once referred to caustically around Silicon Valley as "jelly beans," are now commodity items traded like soybeans or pork bellies. They make a lot of money for the companies that stay ahead of the technology and proceed expeditiously down the experience curve.

Military lead

It wasn't always so. The military services launched the modern semiconductor industry with their sponsorship and procurement of transistors and integrated circuits for the Atlas, Titan, and Minuteman missiles in the 1960s. The industry benefited not only from the government's development dollars, but also from large production runs and resulting economies of scale made possible by the guaranteed military markets.

In the turbulent days following World War II, electronics had to play a key role, and an embryonic industry needed this outside help to do it. Two examples, one a potential failure and the other a resounding success, illustrate this role:

Immediately following the war, Vannevar Bush, who had directed the nation's wartime research effort, delivered to President Truman an assessment of the technologies necessary to maintain an uneasy peace. It was obvious that the Soviets would not be content with their domination of Eastern Europe and that further conflict was likely. In his report, "Science, the Endless Frontier," Bush concluded that a nuclear strategy, later to become known as mutual assured destruction (MAD), could act as a deterrent to conventional warfare, but he opposed using ballistic missiles for that purpose on the grounds that they could never be large enough or accurate enough to deliver the nuclear warheads.

Pentagon leaders didn't buy the whole idea and proceeded with what they called a "strategic triad" of bombers, intercontinental ballistic missiles, and submarine-launched ballistic missiles - all made possible by electronics advances - and the strategy was vindicated by the fall of the Berlin Wall in 1989.

In fairness to Bush, it is not reasonable to expect even so distinguished a scientist to have been aware of the advances in electronics that lay just around the corner. Yet consider how tragically different the world would be today if an effective deterrent had not been fielded in time and conventional warfare had been allowed to rage on.

The death toll in World War II has been estimated at around 60 million people, greater than the total of all other wars in human history. It could have been much worse. Albert Einstein was once asked how he thought World War III would be fought. He said he didn't know, but he was sure World War IV would be fought with spears and bows and arrows.

Later, when the Soviet Union electrified the world with the Sputnik satellite on Oct. 4, 1957, the United States was literally stuck on the launch pad. The Russians had the powerful rockets, but we had the sophisticated electronics that would make our puny launchers the equal of theirs in getting useful payloads into space.

Thanks to electronics, we got into the race, beat the Russians to the moon, explored the planets of our Solar System, and created a dynamic commercial space industry (communications, meteorology, earth observation, navigation, all examples of dual-use military and civilian technology). Electronics did for space what the Colt .45 did for the settling of the American West; it put the small man on an even footing with the big man.

I first met Noyce in his office at 7 o'clock one morning. He had a reputation for being something of a party animal, and I don't think I've ever seen a man so hung over - or so coherent under the circumstances. Intel was in the process of bringing out its 1K RAM, but his vision a quarter-century ago already was for devices with feature sizes of 0.002 micron - that's 20 Angstroms - and functional densities of 10 to the 14th power (100 trillion) equivalent gates per device.

Noyce was fascinated by that vista. As he later told an IEEE conference in 1983, "Turning to a biological system, that number, 10 to the 14th, is equal to the number of synapses in the human brain ... In other words, if all of the devices that this industry produced in a year [1983] were assembled in a single entity, that entity's complexity in terms of number of fundamental elements would be equal to that of the human brain ... If we were to continue to approximately double the number of elements produced each year by the semiconductor industry we would have produced the number of elements in all human brains by about the year 2020."
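
Noyce's arithmetic holds up. Here is a quick back-of-the-envelope check - a sketch, not Noyce's own calculation, assuming his figures of 10 to the 14th elements produced in 1983 and 10 to the 14th synapses per brain, plus a world population on the order of five billion:

    import math

    # Assumed inputs: Noyce's figures plus a rough world population.
    elements_1983 = 1e14       # elements produced by the industry in 1983
    synapses_per_brain = 1e14  # Noyce's figure for the human brain
    world_population = 5e9     # order-of-magnitude estimate

    # Elements needed to match all human brains combined.
    target = synapses_per_brain * world_population  # about 5e23

    # With output doubling every year, the cumulative total is dominated
    # by the final year, so solve elements_1983 * 2**n >= target for n.
    years = math.log2(target / elements_1983)

    print(f"doublings needed: {years:.1f}")              # about 32.2
    print(f"crossover year: {1983 + math.ceil(years)}")  # 2016

With those assumptions the crossover lands in the mid-2010s, comfortably consistent with Noyce's "about the year 2020."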

Why bother? Another ancient joke from the computer industry has it that humans are quite capable of producing human brains themselves at considerably less cost. Best of all, they do it with unskilled workers who enjoy their work.

Beyond electronics

As difficult as it was for the Bell Labs team to look beyond the vacuum tube electronics of their time, it seems almost impossible now to envision an era beyond electronics itself. One possibility that continues to receive attention, notably at the Air Force's Rome Laboratory in Rome, N.Y., is integrated optics, in which data are processed with photons rather than electrons. Data are already transmitted as photons over high-bandwidth fiber optic links. Further progress in hybrid optoelectronic systems could shift more of the processing to the non-electronic realm.

In this context, the medium could dictate the method. Our present era of digital electronics had its origin in 1844, when Samuel F.B. Morse (an artist, by the way, not an engineer) demonstrated his telegraph - the first medium to carry communications as something other than freight and also the first, via the dots and dashes of the Morse Code, to introduce the concept of a digital format.
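
That digital lineage is easy to see in modern terms: Morse Code maps each character onto a string drawn from a two-symbol alphabet, the same idea underlying every binary encoding since. A toy illustration (the lookup table is a small excerpt of standard International Morse):

    # Morse Code as a digital format: two symbols, dot and dash,
    # are enough to encode an arbitrary message.
    MORSE = {
        "A": ".-", "I": "..", "N": "-.", "O": "---", "R": ".-.",
        "S": "...", "T": "-",
    }

    def encode(text: str) -> str:
        # Letters are separated by spaces, as in hand-keyed Morse.
        return " ".join(MORSE[c] for c in text.upper() if c in MORSE)

    print(encode("transistor"))  # - .-. .- -. ... .. ... - --- .-.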

Another idea that I've heard bandied about since I was a rookie reporter for Electronic News in 1961 is the concept of organic semiconductors. It was a hoax then, and we were the gullible, unwitting perpetrators of the hoax, but it is an idea that refuses to die. The idea even had a sponsor a decade ago: the Strategic Defense Initiative (SDI), the focal point of what were considered other nutty ideas.

"With genetic engineering coming on line, in the coming decades we may grow what we will think of as ICs in the organic domain," James Ionson, who headed the innovative science and technology projects for the now-defunct SDI, once said. "Organic devices are highly nonlinear and respond to even the smallest electrical stimulus."

Among the approaches that have been suggested is the use of amino acids to create three-dimensional electron transfer protein structures with complexities at that magic number of 10 to the 14th elements per device, something the SDI people thought would be possible in the 2020 to 2030 period.

Whatever technology follows electronics, if there is any technology to follow electronics, I'd like to think we can do better than the New York Times and assure you that you can read about the next paradigm shift on the front page of Military & Aerospace Electronics. At least that's what I'd like to think.
