Birth of a Standard: The Intel 8086 Microprocessor
The release of Intel's 8086 microprocessor in 1978 was a watershed moment for personal computing. The DNA of that chip is likely at the heart of whatever machine you're using to read this article.
What's most surprising about the tremendous success of the 8086, though, is how little people expected of it when it was first conceived. The history of this revolutionary processor is a classic tale of how much a small team of bright engineers can accomplish when they're given the freedom to do their jobs in innovative ways.
When development of the 8086 began in May 1976, Intel executives never imagined its spectacular impact. They saw it as a minor stopgap project. They were pinning the company's hopes on a radically different and more sophisticated processor called the 8800 (later released as the iAPX 432). In an era when most chips still used 8-bit data paths, the 8800 would leapfrog all the way up to 32 bits. Its advanced multitasking capabilities and memory-management circuitry would be built right into the CPU, allowing operating systems to run with much less program code.
But the 8800 project was in trouble. It had encountered numerous delays as Intel engineers found that the complex design was difficult to implement with then-current chip technology. And in the meantime, rival Zilog's Z80 processor--an enhanced clone of Intel's earlier 8080--was winning over customers in the 8-bit market.
Enter the Architect
Intel execs maintained their faith in the 8800, but knew they needed to respond to Zilog's threat somehow. They turned to Stephen Morse, a 36-year-old electrical engineer who had impressed them with a critical examination of the 8800 processor's design flaws. The company's upper brass picked Morse as the sole designer for the 8086. "If [Intel] management had any inkling that this architecture would live on through many generations and into today's ... processors," recalls Morse, "they never would have trusted this task to a single person." (For more, see our in-depth interview with Morse.)
Picking Morse was surprising for another reason: He was a software engineer. Previously, CPU design at Intel had been the domain of hardware engineers alone. "For the first time, we were going to look at processor features from a software perspective," says Morse. "The question was not 'What features do we have space for?' but 'What features do we want in order to make the software more efficient?'" That software-centric approach proved revolutionary in the industry.
Although the 8086 was Morse's pet project, he didn't work alone. Joining him as the design progressed were other Intel engineers, who took his architecture and turned it into working silicon.
Beyond laying down some basic requirements--chiefly that the new chip be compatible at the source-code level with software written for Intel's existing 8080--management largely left Morse to his own devices.
Upon its release, Morse's creation hardly took the computing world by storm. The midrange personal-computer market was saturated at the time, and few manufacturers rushed to build machines around the new 16-bit chip.
In March 1979, Morse left Intel. Then a series of seemingly unremarkable events conspired to make the 8086 an industry standard.
A few weeks after Morse's departure, Intel released the 8088, which Morse calls "a castrated version of the 8086" because it kept the 8086's 16-bit internals but exposed only an 8-bit external data bus. Since many systems and support chips were still 8-bit, the 8088 sent out its 16-bit data in two 8-bit cycles, making it compatible with--and cheaper to design around--existing 8-bit hardware.
Two years later, IBM began work on the model 5150, the company's first PC to consist only of low-cost, off-the-shelf parts. It was a novel concept for IBM, which previously emphasized its proprietary technology to the exclusion of all others.
Obviously, an off-the-shelf system demanded an off-the-shelf microprocessor. But which to choose? IBM decided early on that its new machine required a 16-bit processor, and narrowed the choices down to three candidates: the Motorola 68000 (the powerful 16-bit processor at the heart of the first Macintosh), the Intel 8086, and its "castrated" cousin, the Intel 8088.
According to David J. Bradley, an original member of the IBM development team, the company eliminated the Motorola chip from consideration because the 68000, for all its power, lacked the mature family of support chips and development tools that Intel's processors already enjoyed.
IBM then had to choose between the 8086 and the 8088. Ultimately, the decision came down to the simple economics of reducing chip count. IBM selected the 8088, a decision that allowed the company to build cheaper machines because it could use fewer ROM modules and less RAM, Bradley says.
In a sense, though, it didn't matter which of the Intel chips IBM chose. Both implemented the same underlying 8086 instruction set designed by Stephen Morse.
From Chip to Standard
How did the 8086 come to power the vast majority of the world's PCs? Not overnight, certainly. At launch, the 5150 was just one machine in a crowded and fragmented market.
Gradually, however, the disparate parts of the PC universe fell into orbit around the 5150. One big reason for its success was the IBM name on the box. The brand had more cachet among business buyers than rival companies such as Radio Shack or Apple. The question of the day was, "Do you want to buy a computer from International Business Machines or from a company named after a fruit?" Bradley says.
And because IBM had used off-the-shelf components, other companies could produce clones--and clone they did.
With the IBM PC quickly becoming dominant, Intel capitalized on the trend by developing improved versions of the 8086 over the years, starting with the 80186 and 80286 and continuing with the 32-bit 80386--each one backward compatible with Morse's original instruction set.
Right Place, Right Time
According to Morse and Bradley, our current x86-dependence mostly came down to chance. "I was just lucky enough to have been at the right place at the right time," says Morse. "Any bright engineer could have designed the processor. It would probably have had a radically different instruction set, but all PCs today would be based on that architecture instead." In a similar vein, IBM veteran Bradley jokes, "If IBM had chosen the Motorola 68000 for the IBM PC (as some wanted), we would have had the WinOla duopoly rather than the Wintel duopoly."
The true power of x86 lies not in the particular operation codes that make our CPUs run, but in the momentum of common computer standards. The 8086 paved the way for rapid, exponential progress in computer speed, capacity, and price-performance--all driven by fierce competition among hundreds of companies vying to improve the same thing.
Morse's humble 8086 instruction set still lies at the heart of nearly every modern PC CPU, from the Opteron to the Athlon to the Core 2 Quad. For a practical demonstration of just how powerful the x86 standard is, consider this: Any assembly-language program written as far back as 1978 for the Intel 8086 microprocessor will run, unmodified, on Intel's latest Core 2 Extreme CPU--just 180,000 times faster.