A respected member of the digiterati waved a yellow flag this week over the grave of the personal computer, and with good reason.
Jonathan Zittrain, co-director of Harvard’s Berkman Center for Internet and Society and a professor of law and computer science at the university, warned in an essay in MIT’s Technology Review that the passing of the PC poses a threat to innovation and freedom in our lives.

He argued that the dwindling importance of the PC is creating a shift in the balance of power in the digital world. “[W]e’re seeing an unprecedented shift of power from end users and software developers on the one hand, to operating system vendors on the other,” he wrote. “This is a little for the better, and much for the worse.”
Some of us, myself included, might find that power shift alluring. What’s wrong with “it just works” rather than “it worked this time”? Zittrain’s answer to that question should alarm anyone seeking comfort in the Big Brother emerging from the PC’s funeral pyre.
Those familiar with Zittrain’s work will recognize his prime culprit in the demise of the PC paradigm: the iPhone. In his book, “The Future of the Internet and How To Stop It,” he blames Apple’s smartphone for contributing to the death of the Web and explains the siren song of Apple’s approach to computing.
“We have grown weary not with the unexpected cool stuff that the generative PC had produced, but instead with the unexpected very uncool stuff that came along with it,” he wrote in the book. “Viruses, spam, identity theft, crashes: all of these were the consequences of a certain freedom built into the generative PC. As these problems grow worse, for many the promise of security is enough reason to give up that freedom.”
While the iPhone’s walled-garden model may have addressed some of the annoying aspects of the PC environment, it created more serious problems. Because those problems don’t affect people as directly as a computer crash or an inbox full of spam, however, they are largely ignored by a population constantly being wooed by the next cool thing.
For example, since software must be approved by Apple before it can be sold in the App Store, the company has made itself an arbiter of what content is good and bad. That model might be tolerable if it were limited to a single platform or company, but it’s starting to be copied by others, notably Microsoft.
“[T]he fact that apps must routinely face approval masks how extraordinary the situation is: tech companies are in the business of approving, one by one, the text, images, and sounds that we are permitted to find and experience on our most common portals to the networked world,” Zittrain wrote.

“Why,” he continued, “would we possibly want this to be how the world of ideas works, and why would we think that merely having competing tech companies — each of which is empowered to censor — solves the problem?”
Worse yet, Zittrain asserts, the fewer the avenues by which content can reach people, the easier it is to control the flow of information. What government wouldn’t prefer applying pressure to a few information chokepoints rather than chasing the myriad sources by which content is distributed today? “Suddenly,” Zittrain wrote, “objectionable content can be made to disappear by pressuring a technology company in the middle.”
“If we allow ourselves to be lulled into satisfaction with walled gardens, we’ll miss out on innovations to which the gardeners object, and we’ll set ourselves up for censorship of code and content that was previously impossible,” he added. “We need some angry nerds.”
An irony resonates in Zittrain’s arguments: the company that made an indelible mark on the public consciousness with a commercial about smashing Big Brother could now be fertilizing a garden for his growth.