Humble beginnings
Technology marches relentlessly onward, discarding the old to make way for the new. Today’s heroes quickly become yesterday’s news. As pundits ponder the future of the PC—Do desktops matter anymore? Are tablets PCs or something else? What about hybrids like the Surface?—we figured it was time to hop off the hype cycle and forget the constant tick-tocking of Moore’s Law for a moment.
Let’s revel in the roots of the wonder machines that make our lives easier. From the first GPUs and CPUs to the forerunner of the Internet, these are the breakthroughs that birthed computers as we know them today. If nothing else, staring history in the face reminds us that all computing devices share the same background, regardless of what shape and form PCs take today.
Laying the processor path

Before the first mainstream PCs could evolve from room-size mainframes and punch-card machines, the proper components needed to be created. Let’s start with the beating heart of the PC: the processor. Intel laid claim to the “first commercially available microprocessor” crown with its 4004 chip, which it introduced in 1971.
Intel’s site does an admirable job of tooting the 4004’s horn, but here’s a particularly juicy stat: The first microprocessor had 2300 transistors, with circuit features 10,000 nanometers wide. Intel’s latest “Haswell” Core processors, meanwhile, pack more than 1.4 billion transistors, with features measuring just 22nm across.
Image: CPU-Zone
Super-sized storage

Hard drives, as we know them, have technically been around since 1956, when IBM shipped them as part of the RAMAC 305, a system as large as a pair of refrigerators. These puppies weren’t like today’s svelte 3.5-inch storage options, though: The RAMAC’s hard drive consisted of 50—count ’em, 50—24-inch platters. The RAMAC 305 offered 5MB of data space, at an approximate cost of $10,000 per megabyte. (And you thought SSDs were expensive!)
Those spinning platters eventually got smaller, thanks to the march of technology, but it wasn’t until the late 1980s that hard drives became a somewhat common feature in mainstream PCs. Even then, floppy drives remained ubiquitous for years thereafter.
Image: Wikimedia Commons
RAMming speed

Long before the RAMAC introduced storing data on spinning platters, engineers were working hard at perfecting random access memory, or RAM. The first RAM implementation came out of England’s University of Manchester, where inventors Freddie Williams and Tom Kilburn successfully demonstrated the “Williams tube” in 1948. The Williams tube stored bits as charged dots on a CRT screen, serving as a primitive form of RAM.
Unfortunately, the Williams tube was unreliable and prone to failure. Magnetic-core memory (pictured) debuted shortly thereafter and became the go-to standard for decades. Magnetic-core memory used a tic-tac-toe-like grid of wires threaded through tiny ferrite rings; each ring could be magnetized either clockwise or counterclockwise, creating basic “one” and “zero” states.
Image: Wikimedia Commons
Hunting and pecking

The inspiration for the keyboard lies back in the analog days of olde, with the humble typewriter. Users of early mainframes and computers relied on electrical teleprinters and keypunches to communicate with the machines, before those devices were eventually replaced by computer screens and keyboards.
Image: Women entering data on punched cards, via Wikimedia Commons
Pointing and clicking

The evolution of the mouse is a story unto itself, and one that we’ve told before. The bones of today’s precision input device of choice can be found in Douglas Engelbart’s “X-Y Position Indicator For A Display System” (shown here), which tracked motion with two perpendicular discs, two corresponding potentiometers that monitored their movement, a lonesome top-mounted button, and a wooden case to hold everything together—in 1963.
The first tracking device that actually looked like the mice we use today came out of Xerox’s acclaimed PARC research and development facility in 1972. Sporting three buttons and the same basic layout as today’s point-and-clickers, that mouse was designed for use with the Xerox Alto—one of the very first true personal computers.
The dawn of the PC era

By the early 1970s, all the pieces were in place, and a deluge of personal computers began to sweep in, each vying for the right to scream “FIRST!”
The Computer History Museum considers the $750 Kenbak-1 to be the holder of that particular title. Packing 8 bits of fury and a whopping 256 bytes of memory—or roughly 1/4096 of a megabyte—the Kenbak-1 relied on switches and blinking lights for input and output, but only around 40 were ever made. Meanwhile, the $1750 Micral N (pictured) was the earliest commercial, non-kit PC available (it came to market in 1973), but it never reached U.S. shores.
While the Kenbak-1 and Micral N were technically PCs, they bore little resemblance to the computers we use today.
The Xerox Alto

Of all the options available in the first wave of computers, none so clearly resembled the PCs of today as the Xerox Alto, which premiered in 1973. Beyond the aforementioned mouse, the Alto boasted software that included a graphical interface complete with menus, icons, and even basic cut-and-paste capabilities. Email and word processing programs, along with an early Paint-style bitmap editor, were eventually written for the machine.
Xerox’s seminal PC was never sold commercially, but thousands of units were used within Xerox and distributed to universities. Later stars such as the Apple II, Radio Shack TRS-80, and IBM PC followed the Alto’s basic formula almost to a T.
Image: Wikimedia Commons
Computing on the run

As soon as computers hit the scene, they started shrinking, but the earliest so-called portable PCs were so hefty—often weighing dozens of pounds—that you could do little more than lug them from desk to desk. It took until the early 1980s for the Epson HX-20 to introduce truly portable computing, cramming a pair of 614kHz CPUs and a calculator-size dot-matrix printer into its totally totable 3.5-pound frame. Now about that 120-by-32-pixel display…
Image: Wikimedia Commons
Powering up PC graphics

The evolution of computer graphics is almost as messy as the early years of PCs themselves, but many people consider IBM’s Monochrome Display Adapter (MDA), introduced in 1981, to be the first “graphics card.” Sure, the MDA was designed to produce only 80 columns by 25 lines of text characters and symbols, but it was nevertheless a PC component with the sole purpose of displaying video.
It (somewhat arguably) counts in our book—or at least our slideshow—and IBM’s MDA arrived more than a decade before ATI and 3dfx ignited the cutthroat graphics wars. The first video card to be regarded as an honest-to-goodness graphics processing unit, Nvidia’s GeForce 256, didn’t appear until 1999, but you’ll have to read our article on the 10 most important graphics cards in PC history for the nitty-gritty details.
Image: Wikimedia Commons
Opening Windows

Right around the same time that the IBM PC and discrete graphics cards were helping computer hardware hit its stride, lines of code pouring forth from the brain of Bill Gates coalesced into Windows 1.0, the first in a line of operating systems that would leave an indelible mark on the PC ecosystem for decades to come.
Microsoft unleashed its OS into the world in 1985, but the first iteration didn’t really catch on despite its user-friendly graphical interface. In fact, Windows didn’t become the juggernaut we know and love until Windows 3.0 landed in 1990, complete with Solitaire and full support for Intel’s legendary 386 processor. The Start menu (RIP) played coy for a while longer, first appearing in Windows 95.
The World Wide Web

Today, computing is spreading far beyond proper PCs, thanks in large part to the ever-expanding reach of the Internet and the proliferation of cheap, Net-ready mobile devices. Now it’s borderline impossible to separate computers from the Internet itself; shunning SkyDrive in Windows 8.1 takes a Herculean effort.
From the creation of ARPANET in 1969 to the rise of the World Wide Web (pictured) at CERN in 1991 and beyond, the tale of our connected world is large enough to consume an article all its own. The good news: We’ve already written that article. For more, check out PCWorld’s look at the most important milestones in the history of the Internet and Web browsers.