The Internet freaked out this week after Google announced the closure of its cloud-based RSS reader, Google Reader.
RSS fans begged Google to change its mind, signed a petition and scrambled to come up with alternatives. Although Google Reader is just one reader among many (probably the best one), many expressed fear that the RSS format itself is threatened by Google's action.
Despite the thousands of articles and blog posts lamenting the loss, few wondered why so many people think RSS is worth saving.
The conventional wisdom, which is conventional but not wisdom, says that RSS is obsolete because now we have Twitter and other social things. TechCrunch even said "In essence, Twitter is a big RSS reader."
In fact, Twitter is not a big RSS reader. RSS is something you control, and Twitter is something other people control. (Even if you dedicate a Twitter account exclusively to the same sources of content you had in Google Reader, the viewing options, functionality and everything about Twitter is controlled by Twitter.) That both give you streams of content is a superficial similarity. Fundamentally, they are opposites.
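The control argument is concrete: an RSS feed is just an XML document that any client the user chooses can fetch and parse. Here is a minimal sketch in Python using only the standard library -- the feed is a hypothetical inline example rather than a live URL, but a real reader does essentially this over HTTP.

```python
import xml.etree.ElementTree as ET

# A tiny hypothetical RSS 2.0 feed. A real reader would fetch this
# XML over HTTP from a URL the user chose -- that choice is the control.
FEED = """<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def parse_feed(xml_text):
    """Return (channel_title, [(item_title, link), ...])."""
    root = ET.fromstring(xml_text)
    channel = root.find("channel")
    items = [(i.findtext("title"), i.findtext("link"))
             for i in channel.findall("item")]
    return channel.findtext("title"), items

title, items = parse_feed(FEED)
```

Every decision -- which feeds, which client, how items are displayed -- stays with the user, which is exactly what a Twitter timeline does not offer.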
What Google Reader and RSS fans fear is not the loss of a good service and a great format. They fear the loss of control. They fear a future in which decisions about what they see, watch, read and listen to are determined by secret algorithms and the whims of the social media masses.
It's not an unreasonable fear: The taking away of control from the user is the way the whole industry is going.
Samsung's fuzzy user interfaces
Computer user interfaces have evolved to become more user-friendly and more sophisticated. But while they have become easier to use, they've become harder to control.
The command line gave users perfect control. You typed in a command. It was either right or wrong. If the command was right, you could predict exactly what would happen.
The graphical user interface was a little fuzzier. For example, you select a group of icons by drawing a box with your mouse. A small slip of the fingers and you might miss a couple of icons or select the wrong ones.
Multitouch user interfaces are less exact still. It's easy to tap the wrong thing, or use the wrong gesture. Pinching and zooming doesn't give you an exact size; it just gives you generally smaller or generally bigger. The flicking gesture is even less exact. Flick the screen and down it goes; where it stops, nobody knows.
The shiny new Samsung Galaxy S4 features eye-tracking. If you tilt the phone while looking at a web page, it will scroll automatically. If you're watching a video and look away, the video pauses.
Another major addition to the repertoire of touch is the lack of touch. A new hover feature enables you to see what's inside folders, emails and photo galleries by holding your finger just above the screen.
These are both very cool new technologies, and they demo well. What they also have in common is that the additional power and convenience come at the expense of user control. There will be times when users don't want the page to scroll when they tilt, don't want the video to stop when they look away, and don't mean to hover -- and the phone will do what they don't want it to do.
Kinect and Leap
Another user interface advancement makes a comparable exchange between ease and power on the one hand and control on the other. In-air gestures -- those used to control Kinect for Xbox 360, and several others I told you about in January, including the sophisticated technology developed by Leap Motion -- are cool, but they bring new levels of imprecision to computer user interfaces.
Google Personalized Search
Search used to be a more exact science. You typed in Boolean search operators, and the search engine spit back whatever its index contained, ranked by that engine's fixed metrics.
In recent years, search has gotten a lot smarter, and by smarter, I mean less user controllable.
For example, when I searched for my name, which is spelled Elgan, I used to get every result containing that exact combination of five letters. Now, search engines go ahead and toss in results for the more common Elgin spelling, assuming that I have made a mistake.
That's one example of a generalized intelligence that has been baked right into search engines. But in addition to intelligence, we also have personalization. The best example is Google Personalized Search.
When you search Google nowadays, Google takes into consideration past searches, your location, your ZIP code, and possibly your activity on YouTube, Google+, Gmail and other Google sites as well as dozens of other "signals."
In other words, your results are unique to you, and if someone else conducts the exact same search they will get a different result.
Users are no longer controlling search results. Google's algorithms are now in control.
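The shift is easy to see in a toy model: the same query returns differently ordered results depending on what the algorithm knows about the searcher. The signal names and weights below are invented for illustration -- Google's real system uses dozens of undisclosed signals -- but the mechanism is the same.

```python
# Toy illustration of personalized ranking: one query, two users,
# two different orderings. All signals and weights here are invented;
# this is not Google's actual algorithm.
RESULTS = {
    "jaguar car review":    {"topic": "cars"},
    "jaguar habitat facts": {"topic": "animals"},
}

def rank(results, user_interests):
    """Order results by how well each matches the user's inferred interests."""
    def score(doc):
        return user_interests.get(results[doc]["topic"], 0.0)
    return sorted(results, key=score, reverse=True)

car_fan   = {"cars": 0.9, "animals": 0.1}  # inferred from past activity
zoologist = {"cars": 0.1, "animals": 0.9}
```

Here `rank(RESULTS, car_fan)` puts the car review first, while `rank(RESULTS, zoologist)` puts the habitat page first. Neither user asked for that ordering; the profile decided it.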
Social networks are increasingly designed to protect users from their own decisions.
We decide to follow too many people, and as a result our streams would overwhelm us with status updates and content.
So Facebook and other social networks protect us by using algorithms to decide what we see and what we don't see.
For example, if you're a fan of Dr Pepper and "Like" their Facebook page because you want to get that company's status updates, Facebook's EdgeRank algorithm will protect you from the majority of those updates and will not deliver them to your News Feed.
With some knowledge and effort, you can exert some control over what you get, but you cannot control who gets what you post.
As users of social networks, we don't control what's in our incoming or outgoing streams. That's controlled by secret algorithms and unpublished criteria.
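Facebook publicly described EdgeRank as scoring each story by roughly affinity times edge weight times a time decay, then showing only what scores high enough. The sketch below uses that published shape, but the numbers, threshold, and decay are invented for illustration -- the real criteria are unpublished.

```python
# Toy EdgeRank-style filter. The affinity * weight * time-decay shape
# matches Facebook's public description of EdgeRank; every number and
# the threshold below are invented for illustration.
def edgerank(affinity, edge_weight, hours_old, decay=0.9):
    """Score one story: higher affinity and weight, fresher is better."""
    return affinity * edge_weight * (decay ** hours_old)

def filter_feed(stories, threshold=0.5):
    """Keep only the stories the algorithm decides you should see."""
    return [s["id"] for s in stories
            if edgerank(s["affinity"], s["weight"], s["hours_old"]) >= threshold]

stories = [
    {"id": "close friend's photo",     "affinity": 0.9, "weight": 1.0, "hours_old": 1},
    {"id": "Dr Pepper status update",  "affinity": 0.2, "weight": 0.6, "hours_old": 2},
]
```

With these invented numbers, the friend's photo survives and the Dr Pepper update is silently dropped -- even though the user explicitly asked to receive it by clicking "Like."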
Another helpful advancement can be found in the world of Autocomplete and its notorious cousin, Autocorrect.
In both cases, sophisticated software takes a guess at what you intend to type. The results can be so comically disastrous that blogs showing Autocomplete and Autocorrect gone wrong make for some of the funniest reading on the Internet.
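At its simplest, autocomplete is prefix matching against a frequency-ranked vocabulary: the software's statistical guess stands in for the user's choice. A minimal sketch, with an invented vocabulary and counts (real systems also use context, personalization, and language models):

```python
# Minimal autocomplete sketch: rank candidate completions of a prefix
# by raw frequency. Vocabulary and counts are invented for illustration.
VOCAB = {"the": 100, "they": 40, "theory": 15, "thesis": 5, "cat": 60}

def autocomplete(prefix, vocab=VOCAB, n=3):
    """Return up to n completions of prefix, most frequent first."""
    matches = [w for w in vocab if w.startswith(prefix)]
    return sorted(matches, key=lambda w: vocab[w], reverse=True)[:n]
```

Typing "the" yields the most common completions first -- which is exactly the problem when the word you actually meant is the rare one, and the software "corrects" you anyway.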
Even general computing -- the creation and management of your own data files -- has become less user controlled as the result of innovation.
The two concepts that dominate computing these days are cloud computing and the post-PC computing model.
With cloud computing, the whole point is to remove user control over, and knowledge of, exactly where your applications and files reside. The word "cloud" is a reference to the network diagram symbol for a complex system whose details are unknown and whose details are unnecessary to know.
You just upload your financial and personal data into "the cloud" and somebody else takes care of the details.
The post-PC paradigm, as exemplified by the Apple iPad, removes your control over the location and management of your own files by removing the ability to have knowledge of or access to those files.
I'm not saying the trend away from user control is all bad.
All this innovation is generally good. Ease-of-use is good. The reduction of complexity is good.
But taken together, all of this removal of user control takes its toll and could be creating problems that will be hard to solve.
I also suspect that the general trend away from user control is a trend that benefits companies more than users.
Take the example of Google Reader. You can complain to Google all you want about the loss of a user-controlled content stream and its replacement with algorithm-controlled streams. But ultimately Google is in the algorithm business. Its innovations, intellectual property, and trade secrets are Google's secret sauce -- the stuff that makes Google better than its competitors.
In other words, user control is nice, but there's no money in it.
If we as users want to maintain control, we're going to have to fight for it. And that fight starts with a full understanding of this larger trend to take control away.
Mike Elgan writes about technology and tech culture. Contact and learn more about Mike at http://Google.me/+MikeElgan. You can also see more articles by Mike Elgan on Computerworld.com.
This story, "Mobile computing and social media innovation can mean less user control" was originally published by Computerworld.