SLIDESHOW

The 10 most important graphics cards in PC history

Graphics cards couldn't always play Crysis. Here's how the humble cards of yesteryear transformed into the monsters of today.

A feast of eye candy

Those of us who lived through the time without meaning—otherwise known as the mid-1990s—understand the definition of "tease." Game publishers of the day were suddenly and mysteriously quite capable of pumping out visually decadent, ridiculously immersive "3D" graphics—yet few gamers could take advantage. You either dumped a bunch of dough on one of those newfangled "3D video cards," in which case you likely bought a product that didn't work as expected (can you say "proprietary"?), or you were stuck, salivating, with Paleozoic non-accelerated gear.

The mid-’90s were a precursor to a revolution in graphics and gaming alike, and at the center was the video card, in the midst of its coming-of-age party. Let's take a walk down memory lane.

Monochrome Display Adapter

1981, IBM

Not a name that exudes excitement, "Monochrome Display Adapter" is nonetheless the way the peeps rocked their Grand Theft Auto in the year MTV lifted off.

That's a bald-faced lie, of course. Still, many people consider IBM's MDA to be the first video card ever. Though it was incapable of generating real graphics (that's why we didn't call it a "graphics" card), it was nevertheless a PC component designed for the sole purpose of displaying video—in this case, 80 columns by 25 lines of text characters/symbols—and thus it fits our most basic definition of a video card. That it was arguably the first such device, and that it became a PC standard for many years to come, cements its position.

Image credit: Wikimedia Commons

iSBX 275 Video Graphics Controller Multimodule Board

1983, Intel

By the mid-1980s, the console-gaming world (Intellivision football, anyone?) was chock-full of sweet gaming graphics. The computer graphics movement was in full swing too—but on a far more primitive scale than Coleco, Mattel, and the like, and even then only in certain sectors, such as big business, government, the military, and other industries that could afford the tools of the trade. Intel celebrated its 15th birthday by jumping into the pool with its iSBX 275 Video Graphics Controller Multimodule Board, a mildly revolutionary device that was capable of displaying eight unique colors at 256 by 256 resolution.

Image credit: Intel Vintage

VGA Wonder

1988, ATI

The latter half of the 1980s saw a number of small evolutionary steps in the video card business, yet one name stood out above all others if just for its nonstop river of product. That name was ATI. Founded in Ontario, Canada, in 1985, ATI wasted little time flooding the digital world with seemingly everything it could. Its Wonder and Mach cards and their many variants paced the market through the turn of the decade, and made ATI a (nerd) household name. We chose the 16-bit, fully 2D VGA Wonder because it came with a mouse port (and, sometimes, with a mouse), but this entry is really a nod to a company that kept its name and the video card concept at the forefront throughout the formative days of the PC.

Image credit: VGA Legacy

Voodoo1

1996, 3dfx

Future industry juggernaut Nvidia had birthed its very first card only months prior when San Jose, California-based 3dfx stomped in from the coin-op arcade jungle and changed everything. The Voodoo1, as it is known, eschewed 2D graphics completely, forcing users to run it alongside a separate 2D card. But that didn't matter: The gaming world had been waiting for hard-core hardware that kept up with the "3D" graphics found in first-person shooter chart-toppers such as Doom, Quake, and Duke Nukem 3D, and the Voodoo claimed market share faster than a ripe banana attracts fruit flies. Four megabytes of RAM, a 50MHz core, a proprietary API (Glide) that pioneering "3D" game developers gravitated toward—what's not to like?

Image credit: VGA Legacy

Riva 128

1997, Nvidia

Nvidia's first, questionable entry—1995's NV1, a Sega-partnered card that ventured into uncharted territory by employing quadratic rather than triangular surfaces, an underwhelming audio component, and provisions for a Sega controller—endured the utter apathy of developers. The company's second offering, the Riva 128, ditched the quads, the audio, and the Sega ties, yet didn't fare much better initially. When Nvidia rolled out upgraded drivers, however, the card climbed the charts. Its 3D performance wasn't up to Voodoo standards—nothing was—but it was snappy enough for many people, particularly when you consider that it was a full-blown 2D/3D product. The Nvidia ship had changed course, and the seas ahead were favorable.

Image credit: Wikimedia Commons

Voodoo2

1998, 3dfx

By 1998, everyone with a chipset had made a run at the 3D-acceleration throne. One problem: It wasn't vacant. Once again 3dfx showed the way with its second, and last, truly great card. The Voodoo2 improved on the original with high-resolution (1024 by 768) support, delivered no fewer than three on-board chips, and was one of the first video cards ever to delve into the realm of multicard support: Its SLI (Scan-Line Interleave) mode let two cards work in parallel within a single computer. The Voodoo2 remained a strong enthusiast contender through the turn of the millennium, when poor design decisions and questionable corporate moves crippled its developer. By 2002, 3dfx was no more.

Image credit: Wikimedia Commons

GeForce 256 (DDR version)

1999, Nvidia

With the GeForce 256, the modern era of the video card began in earnest. It was the first 3D accelerator to fully support DirectX 7, at a time when the Microsoft API was just beginning to flex its muscle. It was the first video card to be called a GPU. (In fact, Nvidia coined the expression, defining it as "a single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second.") And it wasn't crazy-expensive, considering it took so much of the workload from the CPU. Its transform and lighting (T&L) engine was its biggest draw, but with its fast clock speed and 32MB of DDR memory, the GeForce 256 did a whole lotta things right.

Image credit: VGA Legacy

GeForce 8800 GTX

2006, Nvidia

If 1999's GeForce 256 proved that Nvidia was here to stay, 2006's DirectX 10-compatible GeForce 8800 GTX revealed the company's inner monster. Colossal in every way, the 8800 GTX sported 128 stream processors, 768MB of GDDR3 memory, a 575MHz core speed, and a texture-fill rate of 36.8 billion texels per second.

That the 8800 GTX sapped power as fast as the nearest hydroelectric dam could produce it somehow came as no surprise. That the card didn't meet a game it couldn't capably run for many years after was a far happier surprise.

Image credit: iXBT.com

Radeon HD 5970

2009, ATI

One of the last truly powerful cards to bear the ATI nameplate before new owner AMD dropped the "TI" in favor of "MD," the Radeon HD 5970 was so masterfully engineered, so infused with mega-wonderfulness that it remains an option even today, some four years later. Indeed, some reviewers of the time looked at the 5970, this elongated monument to 3D excess, as not only the fastest video card ever, but perhaps also a digital dagger through the heart of rival Nvidia. That notion proved to be a bit much, as our final entry will attest, but this dual-GPU brute (12 by 4 by 1.5 inches, 3.5 pounds) most assuredly fanned the flames of an already impassioned battle.

Image credit: iXBT.com

GeForce Titan

2013, Nvidia

Seven billion transistors. 6GB of RAM. Supercomputer-level architecture. Vapor-chamber cooling. A compact footprint and comparatively quiet operation. And, according to reviews, impressively frugal power consumption.

"With capability like this, you can play the most graphically intensive PC games with Nvidia 3D Surround monitors at maxed-out settings," claims Nvidia. Yep, there's little doubt the GeForce GTX Titan represents state-of-the-art GPU and video card technology. It is truly the king of raw, single-GPU speed as of this writing.* That it looks supercool only adds to the Titan's mystique.

So why doesn't every self-respecting gamer already own one? Three words: one thousand dollars.

* Geeky footnote: The pure-speed crown goes to the Asus ARES II, a $1500 behemoth that reviewers say is the fastest single card ever made, but it uses a pair of overclocked AMD Radeon HD 7970 GPUs.

Image credit: www.geforce.co.uk