Researcher: Social Networks Shouldn't Reuse Private Info
While social networking services may legally own the user-generated data created on their sites, they still should not reuse that material outside the context in which it was created, contended a Microsoft researcher who studies social networks.
Willfully failing to respect the context of how that data was created may only lead to increased regulatory oversight in the future, warned Danah Boyd, in a series of talks given at the WWW 2010 conference, being held this week in Raleigh, North Carolina, as well as in a follow-up interview with IDG News Service.
"When the law comes down, it is usually not pretty," she said.
Boyd's idea of context may seem abstract, but it is vital for understanding how the social rules of privacy should be applied to the online world. In the interview with IDG, she explained the concept further.
People use public social network services to share personal information about their lives with their friends. Yet we have not set the boundaries for the appropriate reuse of that data online, she said. Nor do social networking sites, which Boyd calls data aggregators, fully understand those boundaries either.
In real life, person-to-person interactions are sharply defined, and we know when these boundaries have been crossed. Someone conversing with his or her partner about some sensitive matter would feel slighted if the partner were to go to work and repeat the conversation to coworkers.
Online, matters are fuzzier. "It's a tricky thing because marketers want to look in on us," she said.
For data aggregators, user data is valuable as a marketing tool. Facebook, for instance, will reuse personal information in conjunction with ads. "If you connect with your favorite band's page, we may display your name and profile photo next to an advertisement for that page that is displayed to your friends," Facebook's privacy page states.
Such marketing efforts can unwittingly cross social boundaries, leaving social networking users feeling slighted.
"You're out joking around with friends and all of a sudden you're being used to advertise something that had nothing to do with what you were joking about with your friends," Boyd said. People don't hold conversations on Facebook for marketing purposes, she said, so it would be incorrect for marketing efforts to capitalize on these conversations.
"We're going to have to ask what is acceptable and what isn't, and we haven't really worked out those issues yet," she said.
The idea of applying context to the discussions of online privacy may be a relatively new one, though Boyd said she is not the first to think about it. She pointed to the work of New York University's Helen Nissenbaum, who authored the 2009 book Privacy in Context that tackles these issues.
Nonetheless, data aggregators should keep in mind some basic considerations about context when reusing data: "Why are people producing this data? Who do they want to actually consume it? What does it mean when it goes beyond that?" were a few of the questions she suggested that social networking sites should ask before reusing data.
"What is the social contract around data aggregation? We need to work that out," Boyd said.
A University of California, Berkeley-trained Ph.D., Boyd has worked with a number of IT and Internet companies, such as Google and Yahoo. In 2007, she created a firestorm of controversy when she noted the class divisions seemingly evident between Facebook and MySpace.
At Microsoft, Boyd works as a researcher, but she insists the company has no input into the conclusions she reaches in her work. Instead, the company uses her as a consultant for its own social media-related projects.
"I look a lot like a professor, but instead of teaching hungover 18 year-olds, I'm teaching people who are actually implementing these systems," Boyd said.
Why do social networking sites keep running afoul of privacy norms? In some cases, such as that of information-focused Google, the cause is simple naivety, Boyd said. The company tries new features internally and assumes that if a few employees like them, they must be good. But when such features are presented to a more diverse audience, their limitations become immediately obvious.
In her talk, Boyd suggested that social networking sites could save themselves potential embarrassment by vetting potential new features and changes through privacy rights watchdogs like the Electronic Frontier Foundation and the Electronic Privacy Information Center.
Other companies, like Facebook, are more calculated about crossing these contextual boundaries, and they risk attracting regulatory scrutiny for doing so, she said.
"Facebook has always been in the process of brokering public information. Its economic interest is to encourage users to be as public as possible," Boyd said. "The question is what are they willing to trade in terms of people, and profit."