Arno Penzias: his knowledge runs from the big bang to the barbershop

Oct 1st, 1997

by John Rhea

WASHINGTON - Arno Penzias lives - intellectually - in the year 2020. He returns to our time periodically to report on what he sees there.

As vice president and chief scientist at Lucent Technologies (formerly Bell Labs), he is a sort of minister without portfolio who can poke around anywhere he likes to extrapolate not only existing trends and determine how technology can improve human life in the 21st century, but also what we have to do now to make that happen. He splits his time between one home in Bell Labs country in New Jersey and another in Silicon Valley that he uses as his observation post.

Penzias is, predictably, optimistic. His vision is a world with more choices, more complexity, more opportunities, greater economies of scale, greater durability of knowledge, and less uniformity. The purpose of technology, particularly information technology, is to enhance human beings, not replace them, he maintains.

But he also warns of a dark side, a world in which privacy as we understand it today disappears. Exponential increases in bandwidth and processing power may breach the technological barriers to the world of constant individual surveillance postulated by George Orwell in his 1949 classic, 1984. That was impossible in 1949. Who would watch the watchers? It's impossible today, but there are no guarantees it will always be impossible.

If you want to put together at least a fuzzy map of the future, it is essential to understand the significance of past milestones. Penzias has an excellent benchmark that he uses to extrapolate computing power: in 1954 IBM developed the first mainframe computer capable of true data processing. Unlike its predecessors, the IBM 704 could process alphabetic as well as numeric data. It was no longer merely an electronic calculator to crunch numbers ("calculator" was the original name for such machines; the word "computer" at first referred to their human operators).

The IBM 704 had 128K of memory - that's not a misprint - at a cost of one dollar per bit. In 1971 Intel introduced its revolutionary DRAM, the 1,024-bit 1103, at sample prices of $100 per chip; the price rapidly fell to $10 in production quantities. By then the U.S. dollar (which was worth a lot more in 1954) bought about 100 bits. Today's dollars (which are worth even less) buy a million bits.
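Taken at face value, those milestones trace a simple price curve. Here is a sketch of the arithmetic, using only the round figures quoted in this article (the variable names are mine):

```python
# Bits of memory one U.S. dollar bought at each milestone,
# using only the figures quoted in the article (nominal dollars).

bits_1954 = 1            # IBM 704: one dollar per bit
bits_1971 = 1024 / 10    # Intel 1103: a 1,024-bit chip at roughly $10
bits_1997 = 1_000_000    # "Today's dollars ... buy a million bits"

print(bits_1971)                # roughly 100 bits per dollar
print(bits_1997 / bits_1954)    # a millionfold gain since 1954
```

Penzias's projection of "another factor of a million" simply extends this same curve forward from 1997.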

What is past is prologue, as it says on the facade of the National Archives in Washington. Penzias expects current silicon-based technology to drive up performance by another factor of a million, including a 100-fold decrease in feature size at the chip level to a resolution equivalent to 500 atoms. "We can get 10 times more speed without even trying," he says, but the next big frontier will be improvements in the algorithms needed for speech recognition and pattern recognition.

Spending a couple of hours with Arno Penzias, as I did before a recent AFCEA technology conference, must be like being invited to the house of Socrates for dinner and stimulating conversation afterward. The man who received the Nobel prize in physics in 1978 for accidentally discovering evidence of the "big bang" which created the universe (when all he really wanted to find out was why his radio astronomy antennas were plagued by static) knows no limits except the limits of the human imagination - which means no limits at all.

In this, the 50th anniversary year of the transistor, which created the modern electronics industry, Penzias recounts some of the lessons we have learned in the past half century. For example, as World War II was coming to an end, it was becoming increasingly obvious that the American economy would soar and that this would put a burden on the manually operated telephone system. Within Bell Labs there were predictions that, unless improvements to the system were made quickly, every woman in the United States would have to be a telephone operator.

If you will overlook for the moment the sexism of those days, in which being a telephone operator was considered a "woman's job," consider what actually happened. Bell Labs was half right. Today, with our complex of satellites and computers and fiber optic cables, everybody is a telephone operator. When was the last time you talked to a flesh-and-blood telephone operator? Some may feel nostalgia for those less hectic days, and I do sometimes, but not the people who depend on instantaneous global communications.

Ironically, Bell Labs, whose scientists pioneered the transistor mostly for the military command and control systems of the early days of the Cold War, lagged in implementing the invention in its own telephone network. The Beltone hearing aid company grabbed the ball and ran with it while AT&T operators (yes, mostly women) were still answering the phone. Penzias struck me as a little defensive about this lapse, claiming that the reliability of transistors had yet to be established then and that it wasn't as serious for a hearing aid not to work every time. But one doesn't tell a Nobel laureate to stuff a sock in it.

There is a lesson here. The same kind of serendipity that got Penzias his Nobel prize for finding something he wasn't looking for got the world into the age of solid-state electronics through the mechanism of the lowly hearing aid. Where are the next equivalents of hearing aids between now and the year 2020?

The most essential commodity in getting there from here isn't the sub-sub-micron chips or the robust algorithms. It's the most precious commodity of all: imagination. Here is where Arno Penzias really shines, and I'll let him finish this column for me with a passage from his book Ideas and Information* (which I, like a techno-groupie, asked him to autograph for me):

"When my son, David, was looking for his first job, one interviewer asked him, 'How many barbers are there in the United States?' Not every engineering graduate would welcome that kind of question, but it was a good way of finding out whether or not a prospective co-worker could deal with the kinds of things not taught explicitly in engineering school."

(Now put the magazine down for a moment and try to do this in your head. OK, go ahead and use a calculator if you like. Nobody's watching.)

"David remembered that there were four barber shops on the main street of our town with a total of about ten barbers. Since our town has something over 10,000 people in it, that worked out to one barber per thousand, or about 200,000 barbers in a nation of 200 million people. The interviewer used a different method: one haircut per month for each of the 100 million people who get haircuts, and 400 haircuts per month per barber, which works out to 250,000.

"When David presented me with the problem, I used money instead. [So did I, and I got about the same answer as Arno Penzias.] I figured that each of the 100 million men who patronized barbershops spends $100 a year on haircuts, or $10 billion altogether. I guessed that each barber must take in about $30,000 a year, in order to make a living and pay for a share of the shop, which gave me about 300,000 barbers. The actual number is just under 100,000, according to the U.S. Department of Labor."
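The three back-of-the-envelope methods in Penzias's anecdote can be sketched in a few lines, using only the round numbers quoted above (the variable names are mine):

```python
# Three Fermi estimates of the number of U.S. barbers, circa the late 1980s.

# Method 1 (David): scale up a local barbers-per-capita ratio.
local_barbers = 10              # about ten barbers on the town's main street
local_population = 10_000       # "something over 10,000 people"
us_population = 200_000_000
estimate_david = us_population * local_barbers / local_population

# Method 2 (the interviewer): divide demand by a barber's capacity.
customers = 100_000_000         # people who get haircuts
haircuts_per_month = customers  # one haircut per customer per month
capacity = 400                  # haircuts per barber per month
estimate_interviewer = haircuts_per_month / capacity

# Method 3 (Penzias): divide total spending by revenue per barber.
spending = customers * 100      # $100 per customer per year = $10 billion
revenue_per_barber = 30_000     # rough annual take per barber
estimate_penzias = spending / revenue_per_barber

print(int(estimate_david), int(estimate_interviewer), int(estimate_penzias))
# → 200000 250000 333333
```

All three land within a factor of two of each other - and, as Penzias notes, within a factor of about three of the Labor Department's actual count, which is the point of the exercise.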

* Arno Penzias, Ideas and Information: Managing in a High-Tech World. New York: Simon & Schuster, 1989. 224 pages.
