"Revolutionary" is a term that the tech industry likes to throw around. A case in point is last week's announcement of Lytro, a type of camera that promises to eliminate the need to focus on a subject before snapping a picture.
We offer the fine folks at this startup our best wishes, but we can't help but be skeptical. After all, we've heard bold predictions of revolutionary impact before, and the results haven't always been pretty. For skepticism's sake, let's review 12 ballyhooed tech revolutions that didn't pan out.
AT&T Picturephone (1964)
The idea of video calling dates back at least as far as the 19th century, but the breakout product was supposed to be AT&T's Picturephone. Introduced at the New York World's Fair in 1964, its designers expected it to forever change the way we made phone calls by showing video of the other party on a dedicated monitor.
Of course, voice calling proved to be a more convenient and economical way to stay in touch, so the Picturephone never took hold. Even today, video calling tools like Skype haven't transformed communication as drastically as email, text messaging, and status updates have. Maybe we just don't want to look presentable all the time.
Bubble Memory (1970s)
Introduced in the 1970s, Bell Labs' Bubble Memory was supposed to supplant the hard disk. The technology, which recorded data in bubble-shaped magnetic regions on the surface of a chip, was resistant to the elements and contained no moving parts.
Unfortunately, as John Dvorak notes, Bubble Memory was slow, expensive, and complicated to control. Only the military found it useful. For everyone else, the hard disk continues to rule, and flash memory accomplishes much of what Bubble Memory promised.
Virtual Reality (1990s)
Children of the 1990s will remember virtual reality for its brief season of ubiquity in video game arcades around the world. For a steep fee, you could don a weighty piece of headgear and play simple shoot-'em-ups that tracked the motion of your head. For a time, virtual reality appeared to be the future of home entertainment, or perhaps the start of a societal transformation.
Immersive gaming and natural input eventually made their way into homes via Nintendo's Wii and Microsoft's Kinect for Xbox 360, but stepping into your own virtual universe continues to be an idea better left to Star Trek.
Internet Appliances (Circa 2000)
When Internet appliances began taking hold around the turn of the 21st century, the concept of a consumer device that offered simplified access to Internet services sounded promising. (Not to one of my PCWorld colleagues, however, who called them "boat anchors" when they were introduced and who correctly predicted their quick demise.)
Unfortunately for the appliances' makers--including 3Com with its well-designed Ergo Audrey shown here--the movement fell flat on its face. Users figured out that they could stick with their old-fashioned PCs for surfing the Internet and save the $500 to $1000 that an Internet appliance would cost.
Audrey died on June 1, 2001, in the wake of the dot-com crash, as did most of its competitors. The winner in this category so far? Probably Apple's iPad, introduced nine years later.
Segway (2001)
If inventor Dean Kamen had simply promised that the Segway would revolutionize the mall cop industry, no one could have faulted him. But instead, Kamen hyped the two-wheeled, mass-sensing electric vehicle as a city-changer. The Segway, he told Time magazine in 2001, "will be to the car what the car was to the horse and buggy."
Turns out, people didn't want to spend thousands of dollars to fill an ill-defined gap between walking, biking, and driving--especially when concerns about the vehicle's safety generated bad publicity, culminating in 2010 with the death of Segway's new owner, who drove one off a cliff. Still, for an hourly rate, you can probably rent a Segway at your nearest amusement park and experience the failed revolution firsthand.
Foveon X3 Camera Technology (2002)
Foveon's X3 sensor was supposed to spark a revolution in digital photography (sound familiar?) by measuring color in the same way that the human eye does. Whereas existing cameras measured one color at every sensor point and used calculations to figure out the other two colors, Foveon would measure all three primary colors.
In the end, the difference in image quality wasn't significant enough to prompt widespread adoption of the technology in consumer-level cameras, and now Foveon's technology is a mere niche presence, mainly in cameras made by Sigma Corporation.
Municipal Wi-Fi (2004)
Around 2004, the idea of making public Wi-Fi available citywide was all the rage in the United States. Municipalities around the nation started planning their own community-wide hotspots, many of them partnering with private companies such as EarthLink and AT&T.
But within a few years, things went sour. Wi-Fi was expensive to implement, and the technology turned out to be not very good at covering large areas. By 2007, many cities had pulled back or scrapped their muni Wi-Fi plans.
RFID Everywhere (2004/2005)
About seven years ago, an article in Wired magazine described the grocery store of the future, where every item would be tracked through tiny chips powered by radio frequency energy. The technology would enable retailers to recommend sales based on your proximity to certain items and allow you to check out instantly. Stores would be able to adjust prices on the fly in response to supply and demand.
A flap over RFID arose in 2005 when the U.S. government said that it would require RFID chips in U.S. passports the next year. That announcement triggered a run on the older passports by paranoid tech devotees who hoped to avoid RFID tagging technology.
The movement foundered on consumer concerns, on the price of RFID tags, and on the high cost of installing an RFID infrastructure. Today, near-field communication, a close cousin, is only a blip on the radar of consumer technology. One of NFC's most ambitious goals is simply to allow customers to pay by smartphone instead of by credit card. Grapefruits, to my knowledge, remain technology-free, as do the rest of your groceries.
Netbooks (2007)
Only a few years ago, netbooks were primed to become the new face of computing--low-cost, ultraportable devices that left most of the heavy lifting to the cloud. The craze apparently was kicked off by the OLPC (One Laptop Per Child) movement, which solidified in 2006 and was dedicated to providing small, very low-cost laptops to children in developing countries.
But Apple's iPad stole netbooks' revolutionary thunder starting in the spring of 2010, and demand for mininotebooks is in decline as tablets take over. To their credit, netbooks removed some significant cost barriers to computing, but no one argues today that they're the future of computing.
HD Radio (2006)
It's not clear how many people actually thought that HD Radio would spark a music revolution, but the HD Radio Alliance had no qualms about making that assertion in 2006.
Unfortunately, HD Radio tuners never got the support they needed from automakers, and with people tuning in to their iPods and smartphones instead of traditional radio, the need for an even larger spectrum of stations is hard to justify.
Ultra-Mobile PCs (2006)
In 2006, Microsoft's Origami architecture was supposed to usher in a new class of computers called Ultra-Mobile Personal Computers, or UMPCs. These small, touchscreen devices would provide instant access to information, using a tablet version of Windows XP. (Shown here: UMPCs from Samsung, left, and Asus.)
What Microsoft, Intel, and various hardware makers failed to realize--aside from the fact that no one wanted to pay upward of $1000 for one of these things--was that touchscreen smartphones would soon dominate this very niche. Now, the UMPC is regarded as a big-time tech flop.
Google Wave (2009)
The search giant promoted Google Wave, its real-time messaging platform, as a blend of Twitter, email, and instant messaging. However, the ill-fated collaboration tool never really caught on.
Wave was difficult to master, it bored casual users, developers were hesitant to jump aboard its bandwagon, and most of us found that Facebook did the trick in a much easier way. Google announced in August 2010 that it was pulling the plug on the service.