
How small is too small? Transistor may find out soon

A replica of the first transistor (left), built 60 years ago. Paul Otellini, Intel Corp. CEO, holds a wafer of chips with billions of transistors.

SAN JOSE, Calif. - Sixty years after transistors were invented and nearly five decades since they were first integrated into silicon chips, the tiny on-off switches dubbed the "nerve cells" of the information age are starting to show their age.

The devices - whose miniaturization over time set in motion the race for faster, smaller and cheaper electronics - have been shrunk so much that the day is approaching when it will be physically impossible to make them tinier.

Once chip makers can't squeeze any more transistors into the same-sized slice of silicon, the dramatic performance gains and cost reductions that computing has delivered for decades could suddenly slow. And the engine that has driven the digital revolution, and the modern economy, could grind to a halt.

Even Gordon Moore, the Intel Corp. cofounder who famously predicted in 1965 that the number of transistors on a chip would double roughly every two years, sees the end fast approaching - an outcome the chip industry is scrambling to avoid.

Preparing for the day they can't add more transistors, chip companies are pouring billions of dollars into finding new ways to use the transistors they already have, making them behave in different and more powerful ways.

Intel, the world's largest semiconductor company, predicts that a number of "highly speculative" alternative technologies, such as quantum computing and optical switches, will be needed to continue Moore's Law beyond 2020.

Transistors work something like light switches, flipping on and off inside a chip to generate the ones and zeros that store and process information inside a computer.
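
To make the switch analogy concrete, here is a minimal illustrative sketch in Python - the Transistor class and nand_gate function are invented for this example, not part of any real chip-design tool - that treats each transistor as an on-off switch read as a 1 or 0, and wires two of them in series to form a NAND gate, one of the basic building blocks from which processors are assembled.

    # Illustrative sketch only: a transistor modeled as an on-off switch.
    class Transistor:
        def __init__(self):
            self.on = False                # conducting or not

        def set_gate(self, voltage_high):
            self.on = bool(voltage_high)   # a high gate voltage turns the switch on

        def bit(self):
            return 1 if self.on else 0     # the switch state read as a binary digit

    def nand_gate(a, b):
        # Two switches in series pull the output low only when both are on -
        # roughly how CMOS chips build NAND, a universal logic building block.
        t1, t2 = Transistor(), Transistor()
        t1.set_gate(a)
        t2.set_gate(b)
        return 0 if (t1.bit() and t2.bit()) else 1

    print(nand_gate(1, 1))   # prints 0
    print(nand_gate(1, 0))   # prints 1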

The transistor was invented at Bell Labs by scientists William Shockley, John Bardeen and Walter Brattain, who set out to amplify voices in telephones, an effort for which they shared the 1956 Nobel Prize in physics.

On Dec. 16, 1947, Bardeen and Brattain created the first transistor, a point-contact device. The next month, on Jan. 23, 1948, Shockley, a member of the same research group, invented another type, the junction transistor, which went on to become the preferred design because it was easier to manufacture.

Transistors' ever-decreasing size and low power consumption made them ideal candidates to replace the bulky vacuum tubes then used to amplify electrical signals and switch electrical currents. AT&T saw them as a replacement for clattering telephone switches.

Transistors eventually found their way into portable radios and other electronic devices, and are most prominently used today as the building blocks of integrated circuits, another Nobel Prize-winning invention that is the foundation of microprocessors, memory chips, and other kinds of semiconductor devices.

Since the invention of the integrated circuit in the late 1950s - separately by Texas Instruments Inc.'s Jack Kilby and future Intel cofounder Robert Noyce - the pace of innovation has been scorching.

The number of transistors on microprocessors - the brains of computers - has leaped from several thousand in the 1970s to nearly a billion today, a feat that has unleashed previously unimaginable computing power.
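
That leap tracks the doubling pace Moore described. As a rough, back-of-the-envelope illustration (the starting figure of about 2,300 transistors is Intel's 4004, the first microprocessor, released in 1971; the Python lines below are just arithmetic, not a claim about any particular chip today):

    # Back-of-the-envelope check of Moore's Law doubling (illustrative only).
    start_year, start_transistors = 1971, 2_300   # Intel 4004, the first microprocessor
    doublings = (2007 - start_year) / 2           # one doubling every two years
    estimate = start_transistors * 2 ** doublings
    print(f"{estimate:,.0f} transistors")         # about 600 million - approaching a billion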

"I think [the transistor] is going to be around for a long time," Moore said. "There have been ideas about how people are going to replace it, and it's always dangerous to predict something won't happen, but I don't see anything coming along that would really replace the transistor."