The release of Intel’s 8086 microprocessor in 1978 was a watershed moment for personal computing. The DNA of that chip is likely at the center of whatever computer—Windows, Mac, or Linux—you’re using to read this, and it helped transform Intel from merely one of many chip companies into the world’s largest chipmaker.
What’s most surprising about the tremendous success of the 8086, though, is how little people expected of it when it was first conceived. The history of this revolutionary processor is a classic tale of how much a small team of bright engineers can accomplish when they’re given the freedom to do their jobs in innovative ways.
When development of the 8086 began in May 1976, Intel executives never imagined its spectacular impact. They saw it as a minor stopgap project. They were pinning the company’s hopes on a radically different and more sophisticated processor called the 8800 (later released as the iAPX 432). In an era when most chips still used 8-bit data paths, the 8800 would leapfrog all the way up to 32 bits. Its advanced multitasking capabilities and memory-management circuitry would be built right into the CPU, allowing operating systems to run with much less program code.
But the 8800 project was in trouble. It had encountered numerous delays as Intel engineers found that the complex design was difficult to implement with then-current chip technology. And Intel’s problems didn’t stop there—it was being outflanked by Zilog, a company started by former Intel engineers. Zilog had quickly captured the midrange microprocessor market with its Z80 CPU. Released in July 1976, it was an enhanced clone of Intel’s successful 8080—the processor that had effectively launched the personal-computer revolution. Intel had yet to come up with an answer to the Z80.
Editor's note: This article originally published on June 17, 2008. We updated the article with improved formatting and a new primary image on June 8, 2018.
Enter the Architect
Intel execs maintained their faith in the 8800, but knew they needed to respond to Zilog’s threat somehow. They turned to Stephen Morse, a 36-year-old electrical engineer who had impressed them with a critical examination of the 8800 processor’s design flaws. The company’s upper brass picked Morse as the sole designer for the 8086. “If [Intel] management had any inkling that this architecture would live on through many generations and into today’s ... processors,” recalls Morse, “they never would have trusted this task to a single person.” (For more, see our in-depth interview with Morse.)
Picking Morse was surprising for another reason: He was a software engineer. Previously, CPU design at Intel had been the domain of hardware engineers alone. “For the first time, we were going to look at processor features from a software perspective,” says Morse. “The question was not ‘What features do we have space for?’ but ‘What features do we want in order to make the software more efficient?’” That software-centric approach proved revolutionary in the industry.
Although the 8086 was Morse’s project to architect, he didn’t work alone. Joining Morse’s team were other Intel employees, including Bill Pohlman, Jim McKevitt, and Bruce Ravenel, all of whom were essential in bringing the 8086 to market in the summer of 1978.
Beyond laying down some basic requirements—that the 8086 be compatible (at the assembly-source level, rather than the binary level) with software written for the popular 8080 chip and that it be able to address 128KB of memory—Intel leadership stayed out of Morse’s way. “Because nobody expected the design to live long, no barriers were placed in my way, and I was free to do what I wanted,” he says.