15 Turning Points in Tech History
Intel Dispels the Megahertz Myth
In the early days of PC chip manufacture, speed was the name of the game. All you had to do was crank up the clock cycles and watch performance-hungry customers come running. But as the new century dawned and clock speeds soared into the gigahertz range, old chip designs couldn't keep up. They ran too hot and consumed too much power. Enter the Intel Pentium M, a radical new chip pioneered by a team at Intel's Haifa, Israel, labs, led by Mooly Eden.
The Pentium M was intended for mobile PCs, but with its lower power consumption and more efficient instruction pipeline than contemporary CPUs, it soon became clear that Intel was onto a breakthrough, even for desktops.
Soon the die was cast. Today, Intel's leading Core series of chips, launched in 2006, is derived from the Pentium M, while the company's earlier architecture is due to be retired later this year. From now on, the chips that win the race will have to be not just faster, but smarter.
NetWare Falls to the Net
Before Novell NetWare debuted in 1985, transferring files in the typical office meant handing off a floppy disk. NetWare's affordable PC networking quickly took the computing world by storm; by the late 1980s, Novell claimed a 90 percent share of the market.
But Novell never foresaw NetWare's Achilles' heel. If Microsoft was famously slow to embrace the Internet, the same went double for Novell.
By most standards, Windows NT was clunky compared to NetWare, but it had one clear advantage: native support for TCP/IP, the core protocol of the Internet. NetWare servers relied on the older protocols IPX and SPX, which made it harder to integrate NetWare servers with FTP clients, browsers, and Internet e-mail. As demand for the Internet soared, businesses began replacing their NetWare servers with Windows, and NetWare networks were soon headed the way of the floppy disk.
IT Gets Saddled with Accountability
Running a large IT shop was never easy, but at least you didn't have the government breathing down your neck. That changed in 2002 with the passage of the Sarbanes-Oxley Act.
Enron, WorldCom, and other corporate accounting scandals exposed the need for greater accountability on the part of publicly traded companies. But accountability means auditing, and to audit you need records. Unfortunately, the burden of recordkeeping required by Sarbanes-Oxley fell to IT.
By 2005, IT managers found themselves spending an inordinate amount of time and money on SOX compliance. Vague rules and unproven technologies left many guessing. And if SOX wasn't bad enough, healthcare companies had the added burden of HIPAA to worry about.
Whether the next administration will see fit to revisit these regulations remains to be seen, but for IT there's no going back. Regulatory compliance has cemented its position as a key component of business operations, for better or for worse.
Apple Flips Its Chip Strategy
The Macintosh has always stood apart, even before Apple launched its "Think Different" ad campaign in 1997. In defiance of the x86 platform's dominance of the PC chip market, the earliest Macs used Motorola 68000-series CPUs. Later, when performance demanded an upgrade, Apple switched to the PowerPC, but the net effect was the same: Macs and PCs were as fundamentally different as, well, apples and oranges.
But Apple couldn't fight the tide forever. Performance bottlenecks and high power consumption dogged the PowerPC, and by 2005 its future as a general-purpose processor seemed doubtful. In June of that year, Apple announced that it would begin shipping Macs based on Intel processors, ending 20 years of Thinking Different about CPUs.
With that move, the PC processor market effectively became a monoculture. Virtually every mainstream computer you can buy today is based on Intel's architecture. Macs can even run Windows. But it's OK, Mac fans; if what's inside doesn't make you feel different, how it looks still can.
Outsourcing Goes Global
As the year 2000 approached, U.S. companies saw the Y2K problem as a threat to their software. What they didn't anticipate was the impact it would have on the American workforce.
Faced with a shortage of hands to address the Y2K crisis, IT departments looked abroad for answers. They found an untapped gold mine. The rise of the Internet, coupled with social and economic reforms, had fostered a veritable army of highly skilled workers in the developing world.
Indian companies, including Infosys and Wipro, were among the first to popularize offshore IT outsourcing, but companies in Russia, Eastern Europe, China, and elsewhere would soon follow. Meanwhile, the Immigration Act of 1990 had created the H-1B visa program, which made it easier for U.S. companies to import foreign workers.
Today the Y2K problem may be long behind us, but these trends show no sign of slowing. As more and more countries awaken to the Internet economy, IT workers must struggle to stay competitive in an increasingly flat world.