Intel CEO's legacy: PC glory, smartphone grief

Today ends the last full week of Paul Otellini's reign as Intel's CEO.

When Otellini passes the torch to current COO Brian Krzanich at next Thursday's annual stockholders' meeting, he'll be stepping away from a stratospheric eight-year run that saw Intel truly cement its position as Chipzilla, the 800-pound beast ruling over the PC processor market. Need proof? Intel made more than $1 billion per week in 2012, a 50-plus-percent increase over Otellini's rookie stats.

“When Otellini walked into the job, Intel was in chaos,” says Patrick Moorhead, president and principal analyst at Moor Insights and Strategy, and a former vice president at AMD—a.k.a. Intel’s bitter rival—during most of Otellini’s tenure.

Now, Intel is in a clear leadership position, and the company's actions during Otellini's tenure have made an indelible mark on the entire PC industry. Just how did he do it? Simple: by focusing on Intel's core strengths and going from hit to hit.

Keeping the doctor away

Otellini's run started with a bang. His first major coup occurred a mere month after he ascended to the lead role in May 2005: In June, Apple announced plans to transition Macs to Intel processors exclusively. By January 2006, the migration had begun.

Mere months after announcing the plan to switch Macs to Intel, Paul Otellini made a surprise appearance during a Steve Jobs presentation to "report that Intel's ready."

Until then, the Woz's wonder computer ran IBM's PowerPC processors. The move meant a lot of things for Apple—shifting your entire platform to a whole new processor architecture is kind of a big deal—but for Intel, it meant that for the first time, all of the dominant PC ecosystems were on board the x86 bandwagon. Less than a year after the announcement, Apple released Boot Camp, giving Mac-heads the ability to run Windows XP natively. Such are the benefits of sweet processor synchronicity.

Ironically, part of the reason Steve Jobs and company decided to get cozy with Otellini was because Intel's roadmap promised better long-term power efficiency than IBM's. As rumors swirl of a possible Mac migration to ARM processors, that previous success looms as an ominous specter.

Ultrabooks

Speaking of Apple, Intel forcibly created a new breed of Windows-based MacBook Air alternatives with the genesis and careful nurturing of its Ultrabook brand.

Okay, okay. Maybe Ultrabooks haven't set the world ablaze as quickly as Intel originally forecast, but pricey ultrathins have been a rare growth segment in an otherwise utterly forlorn PC market. The slim-and-trim aesthetic has also bled into the mainstream, thanks in no small part to Intel's massive marketing blitz.


There's no doubt that the innovations pushed in Ultrabooks have already left a mark on the computing landscape.

“I think people will look back and say Ultrabooks are Otellini’s finest accomplishment,” says Moorhead. “But I have to give him credit: In literally less than four years, we’ve seen the average notebook thickness cut in half at the same price point, a big move to touch, and a sharp swing towards SSDs.”

Oh, and the MacBook Air that Ultrabooks originally aped?

“Quite frankly, if it wasn’t for Intel, Apple would never have had the MacBook Air,” Moorhead says. “Without Intel’s very ultra-low-voltage processors, Apple would never make a MacBook Air.”

The sound of technical superiority

Intel's power lies in its fabs, which boast the best processor technology in the world.

Tick-tock. Tick-tock. For some, that sound represents the inevitable march of time. For Intel, it signifies the relentless march of innovation.

Let me paint a picture: It's 2005. AMD is coming off a string of high-profile wins, from the 2003 unveiling of the first 64-bit x86 chip to the rapid-fire releases of the company's first dual-core server chip (an Opteron) and dual-core desktop chip (the Athlon 64 X2). AMD would hold the performance crown for several years thereafter. Those achievements garnered AMD a then-record 20 percent slice of the overall processor market in 2006, with the company nearing a 30 percent share of the desktop market in the latter half of the decade. Senior AMD suits started squawking that they wanted even more.

Otellini's now-legendary response: Tick-tock.


Introduced in 2007, at the height of AMD's resurgence, the tick-tock design principle became a guiding light for Intel's processor technology. In "tick" years, the transistor technology used in Intel's chips is shrunk down, and the manufacturing process is improved. For example, Intel's 2011 Sandy Bridge chips were built using the 32-nanometer (nm) manufacturing process, while the Ivy Bridge "tick" follow-up is built using 22nm.

"Tock" years, meanwhile, introduce a whole new processor microarchitecture, with generally bigger innovations than the tick years. The Haswell processors coming this June are a tock, and they're expected to bring huge gains in power efficiency and graphics capabilities when compared to Ivy Bridge.

Tick. Tock. Tick. Tock. Innovation, like clockwork.

“Tick-tock was a very simple, easy-to-remember statement that reflects execution,” says Moorhead. “Before that, Intel had a very poor reputation for execution, architecture, and even design. What tick-tock did was get them back on track for always hitting their dates, at very high levels of execution.”

Otellini's vision paid off. Intel now commands more than 83 percent of the PC processor market, while a somewhat floundering AMD recently turned to ARM-based server processors and a new custom chip unit in an attempt to shore up its position.

The careful tending of technical superiority

At the end of the Otellini era, Intel clearly holds the lead in processor manufacturing technology. In fact, it's the only chipmaker that has been able to keep pace with Gordon Moore's famous law.

Doing so hasn't been easy. The never-ending push of tick-tock has kept the Intel engineers' collective eyes on the prize, but the company has had to introduce revolutionary new technology to each tick cycle in order to shrink transistors from their 90nm size in 2005 to the 22nm technology used in Ivy Bridge.
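To put that shrink in perspective, here's an illustrative back-of-the-envelope sketch (not from the article): each full process generation ideally scales feature size by roughly 1/√2, doubling transistor density per Moore's Law. Running that math from Intel's 90nm node lands remarkably close to the nodes Intel actually shipped between 2005 and 2012.

```python
# Illustrative only: ideal Moore's Law scaling shrinks feature size
# by ~1/sqrt(2) per generation, doubling transistor density each step.
from math import sqrt

actual_nodes = [90, 65, 45, 32, 22]  # Intel's marketed nodes, 2005-2012 (nm)

# Project four ideal shrinks starting from the 90nm node.
projected = [90.0]
for _ in range(4):
    projected.append(projected[-1] / sqrt(2))

for ideal, actual in zip(projected, actual_nodes):
    print(f"ideal {ideal:5.1f} nm -> marketed {actual} nm")
```

The ideal progression (90, 63.6, 45.0, 31.8, 22.5 nm) tracks the marketed 90/65/45/32/22nm nodes closely, which is exactly what staying on Moore's Law cadence means in practice.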

Intel's technology revolutions, year by year.

The chart above shows some of Intel's major milestones. The switch to high-k metal gate transistors in 2007 was a huge move, as was the transition to immersion lithography in 2009. The introduction of "tri-gate" three-dimensional transistors in 2012's Ivy Bridge was a fundamental rethinking of the core structure of processors. Competing chipmakers aren't expected to mass-produce 3D chips until 2015 at the earliest.


Otellini is leaving Intel with clear current-day technical superiority, but he's also invested heavily in the future. In 2012, Intel gave ASML $3.3 billion to spur the development of larger 450mm silicon wafers and extreme ultraviolet lithography technology, which is expected to eventually replace the current immersion lithography technique when transistor sizes drop below 10nm.

But the biggest gift Otellini leaves for Intel's engineers is cold, hard cash: The company's R&D and acquisitions budget is a whopping $18.9 billion—with a "B"—in 2013.

Coooooooooorrrrrrrrres!

This Intel concept chip has a whopping 48 cores.

Multicore CPUs first reared their heads under Otellini's watch. While AMD's Athlon 64 X2 beat Intel's dual-core Pentiums to the punch, Intel countered by shipping the first quad-core consumer chip in 2006, barely a year after its first dual-core processors arrived.

Less than a decade later, every mainstream Intel and AMD processor rocks two or more cores. But beyond core counts, the most important score for Intel was its Core architecture, the devastating response to AMD’s 64-bit processors.

“I think it was the Core architecture that really [thrust Intel into a technological leadership role],” Moorhead says. “And Otellini did a lot to take the Core architecture and really drive it, as quickly as humanly possible, into every single thread of the business.”

To Itanium, and beyond!

That's not to say Otellini's track record is flawless.

He presided over the largest layoff in Intel history way back in 2006. He got beat to the 64-bit punch by AMD and was slow to respond to the quick-rising mobile threat—a mistake that has left Intel scrambling to establish a beachhead in the burgeoning tablet and smartphone markets.

Despite those stumbles—and the slow embrace of mobile is a major one—Otellini is leaving Intel in far better shape than it was when he claimed the CEO throne. It's cash-flush, technologically superior, and poised to continue its dominance for years to come.

Otellini's brilliance may be leaving Intel with a nicely stacked deck. All things considered, though, Brian Krzanich still has a big task ahead of him when he takes office on May 16: Intel's next-gen CEO must get inside next-gen devices.
