The tech industry thrives on legends. Here’s a good one.
It’s 1958, and Texas Instruments stands deserted. Everyone at the Dallas-based company traditionally takes two weeks of vacation in July, leaving the plant empty.
Well, almost everyone. Hidden away, quietly toiling within the cavernous Semiconductor Building, is one Jack Kilby. Kilby’s the new guy at Texas Instruments—so new he’s not entitled to any vacation time yet.
Kilby’s a Midwesterner. Born in Missouri, he spent most of his youth in the oil-rich city of Great Bend, Kansas, named because it nestles up against a curve in the Arkansas River.
He’s a lover of amateur radio and Big Band music, a veteran of World War II’s Office of Strategic Services (the predecessor of the CIA), and an electrical engineer.
Kilby also was seemingly born into technology—his dad, also an engineer, ran a small electric company.
Left to his own devices at work, Kilby decides he’ll try to solve the “tyranny-of-numbers” issue facing the industry.
As electronics grew progressively more complicated, they required an increasing number of components. The invention of the transistor in 1947 rendered the cumbersome vacuum tube obsolete, but now you had hundreds—even thousands—of minuscule components to wire together.
It was labor-intensive, expensive, and (worst of all) unreliable. Every soldered connection formed a potential point of failure in the end product. With thousands of soldered wires, circuits became as fragile as the old vacuum tubes.
When Kilby joined Texas Instruments in 1958, the company already had a potential solution, known as the Micro-Module program. The idea was to make all components the same size and shape, so they would snap together like puzzle pieces to form circuits.
However, people still needed to assemble each circuit by hand. It solved the soldering problem, but labor remained an issue.
“Further thought led me to the conclusion that semiconductors were all that were really required—that resistors and capacitors, in particular, could be made from the same material as the active devices,” wrote Kilby in 1976.
Sitting there in the abandoned abode of Texas Instruments, Kilby grabbed his lab notebook and described what came to be known as “The Monolithic Idea”—that resistors, capacitors, and transistors could be manufactured from the same block of material and included in a single chip.
Then he sketched out a quick design for a flip-flop circuit using components made entirely of silicon.
Semiconductors, such as silicon and germanium, are physically unique. In their purest forms, they’re just poor electrical conductors—better than an insulator (like glass) but nowhere near as efficient as metal.
You can shape how a semiconductor conducts electricity, however, by modifying the base material with impurities—a process called doping. Kilby realized each component of a circuit could be built from the same material. The individual pieces wouldn’t be as efficient as those made from specialized materials—Teflon, for instance, made a better capacitor dielectric than doped silicon—but it could be done.
Said Kilby in his 2000 Nobel Prize lecture, “Resistors were provided by the bulk resistance in the silicon, and capacitors were formed at the p-n junctions”—in other words, where two types of impurities met. Armed with his sketches, Kilby went to his supervisor, a man by the name of Willis Adcock, and asked for time to pursue the theory. He got it.
First, Kilby created a prototype circuit made entirely of discrete pieces of silicon. While not housed on a chip, he’d at least proved that a complete circuit could be made from a single material.
Then came the biggest step. At the time, Texas Instruments built transistors from germanium wafers, and Kilby managed to snag a few of these before they’d been cut up. For such an important device, the first integrated circuit is as unpretentious as it gets: a thin ribbon of germanium, crudely glued to a glass slide, the circuit etched in by hand.
On September 12, 1958, Kilby called together the company’s executives. He hooked the crude piece of germanium up to an oscilloscope, passed a current through it, and a simple sine wave appeared on-screen.
It was a sine wave that changed the world forever. The integrated circuit, or microchip, is the foundation of modern electronics. It’s the reason you can carry an entire computer in your pocket, instead of owning one that takes up an entire office. It enables the Internet to exist. It’s the reason humans landed on the moon.
So that’s the legend. Is it the full story? Of course not. Legends never are.
History of the integrated circuit
Like many inventions, the integrated circuit was really a matter of time. Kilby drew upon the works of an Englishman, Geoff Dummer, when coming up with the idea of the integrated circuit. In the early 1950s, Dummer proposed electronics built from a single block of components, but he lacked the technique to make it into a reality.
Then there was Robert Noyce (Noyce and Kilby received the Draper Prize together in 1989). Noyce, often referred to as “the Mayor of Silicon Valley,” is credited as the co-inventor of the integrated circuit, and for good reason.
Noyce came up with the same idea completely independently, used silicon instead of germanium (silicon operates at higher temperatures), and had an altogether more refined design.
Oh, and he went on to co-found Intel in 1968 with colleague Gordon Moore. Intel, of course, created the first microprocessor, equally important to modern computing.
And you probably know Texas Instruments because—at one point—you took a math class and used one of the company’s calculators. Oddly enough, Kilby gets credit for that one as well.
He and two co-workers, Jerry Merryman and James Van Tassel, developed the electronic handheld calculator because Texas Instruments needed a way to sell the public on the consumer benefits of the integrated circuit.
A modest man
Kilby might’ve laughed if called a legend. By all accounts, he was a simple sort, comfortably middle-class and content with his accomplishments—the consummate engineer who believed solving a problem was its own reward.
While proud of the integrated circuit, he was always quick in speeches and articles to lavish praise on the innovators who came both before and after. “I am pleased to have had even a small part in helping turn the potential of human creativity into practical reality,” Kilby said in his Nobel lecture.
Yes, Kilby might’ve laughed if called a legend, but legends are what we remember.
So we turn it into legend: Dallas, 1958, an abandoned building and a man who helped to completely change the world while his peers were on vacation.
Hayden writes about games for PCWorld and doubles as the resident Zork enthusiast.