You've been uploading pictures, sharing stories, and entering personal data into your favorite social network for years. Now the network says that all of your data is public and that it's going to share the information with an advertiser.
A hacker breaks into your bank's servers, and someone in Russia uses your username and password to drain your account.
A nondescript Web ad drops a cookie in your browser that tracks your browsing habits for months. That data gets sold to marketers who use it to hound you with ads for hair-replacement products.
The cloud storage company you've been using to store business files goes out of business, then sells all your data to a hedge fund.
Unless you're really unlucky, you haven't had to endure all of these things. But in today's cloud-based world, they're all possible. Many of us store large amounts of personal information--everything from financial figures and healthcare data to documents, entertainment media, and social networking content--at various companies' sites around the Web.
It's very convenient for us, and the hosting companies benefit in one way or another from storing and using the data. But we have few real guarantees that our data won't be misused, lost, or stolen (in fact, it's happening more and more). That's not right. We digital consumers need a data security bill of rights that has the force of law behind it to ensure that our data will be protected and to guarantee that, if it's not, we'll be compensated for the losses we suffer. Here is what that bill of rights might look like.
The data I store in the cloud is my valuable property.
The key question is this: Does my personal data constitute property that has value? The answer is yes. In the Internet economy, information is the chief currency. Many of today's biggest and most profitable Internet companies have found a way to turn user information into real money. The usual model is to collect and aggregate as much user data as possible, anonymize it, and then sell it to third-party Web marketing or advertising firms.
Because my information has genuine value, I should have a property owner's rights when I lend that data to some Internet company. If it gets lost or misused, I should be compensated.
In the era of the "cloud," we lend our data--everything from business documents to music to email--to companies that host it on their servers, and they reap some financial reward from doing so, even if they don't charge us for storage. Cloud storage companies are like banks, but for data instead of dollars. We store our money in banks because it's a convenience for us; the banks then use that money to make money for themselves. Even if we don't pay the bank a fee for storing our money, we obviously have a right to insist that the bank safeguard our assets and that it compensate us for any of our assets that it loses through bad investments or theft. Likewise, if my personal files are lost or stolen, the cloud company has caused me to lose something of value, and I am entitled to compensation for my loss.
The danger isn't limited to the possibility that my data may be destroyed. My personal reputation could be seriously hurt if some bad actor either published that information or blackmailed me by threatening to publish it. If the resulting harm fits the description of defamation under the law, the cloud company should compensate me for damage to my reputation.
Cloud companies like Google and Amazon encourage us to store our business information on their servers, which opens another set of risks if the information is lost. Trade secrets could end up in the hands of my competitors, or they could be published for everyone to see, which could be disastrous to my business or to my business reputation.
Companies like MegaUpload represent yet another danger. If a company breaks the law and authorities seize the site, I could lose my files, even if all of them are perfectly legal. If my files are destroyed or otherwise not returned to me, I should be entitled to compensation (if there's a company left standing to compensate me).
The law enforcement agency that seized my data should regard it as valuable property, too. As with any other third-party property innocently implicated in a crime, law enforcement officials should hold it only as long as it's needed for evidence; then they should return it to me.
My data shouldn't end up in a fire sale.
When an Internet company goes out of business, my permission for it to host my data ceases immediately. When the assets are dissolved, I deserve an assurance that my personal data will be deleted, or that my documents, files, and media will be returned to me immediately, with no copies left behind. It is wrong to treat my data as a transferable asset that the company's creditors can acquire or sell off to the highest bidder.
I should have the right to know.
I should have the right to quit.
If an Internet site, whether it be a social network or a bank, keeps personally identifiable information about me, it should be required to destroy that information immediately when I decide to cancel my membership and exit the network. The site should guarantee that it will no longer share my data with partner sites or businesses, that my data will no longer appear on the site, and that others will no longer be able to locate my data through preexisting links.
If I store digital files such as music, video, applications, or documents on the servers of an Internet company, I should have the right to require that all of those files be returned to me immediately, and that all copies be destroyed.
I have a right to expect reasonable protection of my data.
Any Internet company that stores my data for any business purpose must have in place a reasonable level of security to protect my valuable property. The company should state on its website that it complies with the data security laws that most states (45 at last count) have passed, and that it adheres to the Federal Trade Commission's data security guidelines. Companies should constantly upgrade their security to deal with new hacking techniques.
I must approve retroactive changes before they take effect.
Just as my 401(k) manager has certain responsibilities because it seeks to use my money to make more money, so should Internet companies that seek to make money using my personal data.
When my 401(k) plan decides to invest in a new fund, I receive written notification before it transfers my money anywhere. The plan administrators also give me a set of options to use in case I don't like the growth potential of the default new fund. Any Internet company that intends to use my data in any way that differs from the way it described to me when I agreed to participate in the first place should be required by law to inform me in advance of the change and give me an opportunity to quit.
I should be informed when a service I use plans to share my data with "Big Data" aggregators.
I may be comfortable with Facebook having data about who my friends are and whom I've poked. But if you combine that with data about everything I've searched for on Bing and everything I've purchased on Amazon, you get a scarily detailed view of who I am and what makes me tick.
So before Facebook agrees to share my data with Bing and Amazon, it should let me know about the transaction and allow me to opt out.
I have the right not to be tracked on the Web.
I have the right to stop random telemarketers from calling me by putting my number on the FTC's Do Not Call Registry. I should have the same right to prevent companies from keeping tabs on my every move on the Web, by joining a Do Not Track list. Web advertising companies have volunteered to regulate themselves in this area, but many are not honoring individuals' requests to opt out.
A whole industry of marketing and advertising companies has grown up around the practice of dropping cookies into Web browsers to track where people go on the Web and what choices they make. Most of the time I don't know that it's happening to me. And even when I do know that it's happening, I may not know exactly how to stop it. We must fix that situation.
One important limitation to this rule: An Internet company should have the right to track my movements and choices on its own site so that it can tailor content to my interests and needs. Only when a company tracks my movements across many sites and over an extended period of time should it be required to obtain my express consent.
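The mechanics behind that cross-site tracking are simple enough to sketch. The toy Python below (all names hypothetical, not any real ad network's code) simulates the core trick: an ad network embedded on many unrelated sites sets one persistent cookie ID per browser, then quietly builds a browsing profile every time that ID reappears.

```python
# Toy simulation of third-party cookie tracking (illustrative only).
# A network whose ads are embedded on many sites assigns each browser
# one cookie ID, then logs every embedding site where that ID shows up.

class AdNetwork:
    def __init__(self):
        self.next_id = 0
        self.profiles = {}  # cookie_id -> list of sites visited

    def serve_ad(self, browser_cookies, site):
        # First visit to any site carrying this network's ads:
        # "drop" a cookie with a fresh tracking ID.
        if "track_id" not in browser_cookies:
            browser_cookies["track_id"] = self.next_id
            self.next_id += 1
        # Every visit thereafter is logged against that persistent ID.
        self.profiles.setdefault(browser_cookies["track_id"], []).append(site)

network = AdNetwork()
my_cookies = {}  # one browser's cookie jar, sent along to every site

for site in ["news.example", "shop.example", "health.example"]:
    network.serve_ad(my_cookies, site)

# The network now holds a cross-site profile under a single ID.
print(network.profiles[my_cookies["track_id"]])
# → ['news.example', 'shop.example', 'health.example']
```

The point of the sketch is that no single site sees the whole picture; only the third party whose code runs on all of them does, which is why per-site tracking and cross-site tracking deserve different rules.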
If the government wants to monitor my digital transmissions, it should get a search warrant first.
In general, to gain access to my Internet use data from my ISP, or my GPS location information from my wireless carrier, law enforcement should have to obtain a search warrant from a judge after showing probable cause that the data searched will yield material evidence of a crime.
Under current law, however, law enforcement needs to obtain only a special "D order"--named after subsection (d) of Section 2703 of the Stored Communications Act--from a court to force my ISP to hand over a list of the email addresses and IP addresses that I communicate with, and the Web pages that I visit. My ISP should surrender this information only if presented with a full-fledged search warrant.
Law enforcement can also use the D order to acquire information about my location over time from my wireless company, which keeps records of the cell towers that my phone connects with while I'm within its network range (which is almost all of the time). This information can help establish my approximate location on the map every time my phone pings the network. Looking at the data over time, investigators can build a detailed map of my comings and goings. The wireless carrier should give this information up only if law enforcement comes to its door with a valid search warrant.
As we know from the erosion of privacy protections after September 11, 2001, the rules we make about data access cut two ways. On the one hand, they can give us more-effective tools to monitor and convict bad actors, but on the other they can reduce the privacy rights and expectations of the vast majority of people who are not and never will be guilty of any criminal activity. The United States has a long history of strictly limiting the scope of such police powers, and it should continue doing so in the digital age.
What Needs to Be Done?
Large and supposedly secure companies around the world now routinely report instances of data loss or theft. Digital consumers say identity theft is their biggest fear. The Privacy Rights Clearinghouse says that at least 500 million records have been breached since 2005, with more than 22.4 million sensitive records lost or exposed in 2011.
Yet so far, Congress has resisted passing legislation that would impose a set of rules on any company or other organization that seeks to store our sensitive information on its servers, and establish a legal framework for dealing with those that fail to obey the rules.
Without this protection for consumers, the responsibilities of Internet companies remain ambiguous, and individuals who have been harmed by data loss are on shaky legal ground in seeking compensation for the losses they suffer.
The primary currency in the new economy is user data. Though the immense value of this data is obvious to the companies that trade in it, it has been less obvious to the regular Netizens who actively or passively turn it over. Consumers must come to grips with the reality of the situation, and must demand that their data be used fairly and be protected from harm.
It's time to set up a legal framework at the federal level in which user data is treated as valuable property, and Internet companies that lose or misuse it are held liable for the resulting harm. But this will happen only when consumers demand it from lawmakers.