The collection and analysis of big data holds great promise, but it may also lead some companies to build consumer profiles that result in discrimination, the chairwoman of the U.S. Federal Trade Commission said Monday.
The FTC is “committed to rigorous enforcement” of current law related to data privacy and discriminatory practices, but companies, U.S. policymakers and other groups need to have a deeper discussion about fair big data practices, FTC Chairwoman Edith Ramirez said during an agency workshop on big data and discrimination.
Monday’s workshop could “help foster a discussion about industry’s ethical obligations as stewards of information detailing nearly every facet of consumers’ lives,” she said. Ramirez said she hoped the workshop would also identify gaps in existing U.S. law related to the collection and use of big data.
Big data “has the capacity to save lives, improve education, enhance government services, increase marketplace efficiency and boost economic productivity,” Ramirez added. “But the same analytic power that makes it easier to predict the outbreak of a virus, identify who is likely to suffer a heart attack, or improve the delivery of social services, also has the capacity to reinforce disadvantages faced by low-income and underserved communities.”
FTC Commissioner Julie Brill, like Ramirez a Democrat, repeated calls for Congress to pass new transparency rules for data brokers. She also called on data brokers to police their own industry by better understanding how the companies that buy their data sets use the information and by prohibiting discriminatory uses of the data.
The FTC has found that, in some cases, companies are targeting ads based on racial or other assumptions, said Latanya Sweeney, the agency’s chief technologist. On a website for members of Omega Psi Phi, an African-American fraternity, the agency found ads for criminal defense lawyers and for services letting users check their own criminal backgrounds, she said. The site also carried a large number of ads for poorly rated credit cards, she said.
Panelists at the workshop disagreed on how Congress or the FTC should deal with big data and potential problems.
In some cases, companies collecting data should give consumers clear notice and the option to opt out, but in other cases, a company such as an electric utility that passively collects information about electricity use may want broad customer participation in order to understand how best to deliver service, said Nicol Turner-Lee, chief research and policy officer at the Minority Media and Telecommunications Council, an advocacy group.
“There’s going to be some data that we need, that has socially beneficial purposes, that we would like most people to participate,” she said. “When we’re coming up with a framework, does it balance use vs. harm?”
Policymakers need to look “situationally” at the nature of the data when thinking about rules for big data, added Stuart Pratt, president and CEO of the Consumer Data Industry Association, a trade group representing companies that collect consumer data. “Law often is too monolithic and too rigid.”
But the U.S. shouldn’t abandon long-held ideas that individuals should have control over their personal data, said Pamela Dixon, executive director of the World Privacy Forum.
“I don’t think the [longtime] structures need to be reinvented or shoved aside because data sets are larger,” she said. “It’s important to keep the regulations that we have … to ensure that fair information practices are still applicable and relevant.”
Dixon and Danah Boyd, a principal researcher at Microsoft Research, both noted that the big data industry is in many ways in its infancy. Many companies using big data are still figuring out how to use the information responsibly, Boyd said.
Researchers at Microsoft have found they can reliably predict that search engine users will soon be admitted to the hospital based on the searches they make, Boyd said. But that doesn’t mean Microsoft’s Bing search engine should send a warning notice to a user saying he should see a doctor, she said.
“That’s creepy,” she said.