Facebook users may inadvertently reveal their sexual preference to advertisers in an apparent wrinkle in the social-networking site’s advertising system, researchers have found.
The researchers set up six Facebook accounts, analyzing the type of advertisements served to them and the way those advertisements differed based on each profile’s declared sexual preference.
Two of the profiles purported to be males interested in females, and two females interested in males. Another profile was for a male interested in other males, and the last a female interested in other females. All six profiles claimed to be 25-year-olds living in Washington, D.C.
Unsurprisingly, the researchers found that ads that explicitly mentioned sexual preference, such as ads for gay bars, were served to the gay profiles. But they found that many ads that did not explicitly refer to sexual preference were shown exclusively to the gay profiles.
One example was an advertisement for a nursing program at a medical college in Florida, which was only shown to gay men.
The researchers said that people seeing the ad would not know it had been aimed at them solely on the basis of their sexuality. Nor would they realize that clicking on the ad would reveal their sexual preference, by implication, to the advertiser, in addition to other information they might expect to be sent, such as their IP (Internet Protocol) address.
“The danger with such ads, unlike the gay bar ad where the target demographic is blatantly obvious, is that the user reading the ad text would have no idea that by clicking it he would reveal to the advertiser both his sexual preference and a unique identifier (cookie, IP address, or e-mail address if he signs up on the advertiser’s site),” the researchers wrote in a paper. “Furthermore, such deceptive ads are not uncommon; indeed exactly half of the 66 ads shown exclusively to gay men (more than 50 times) during our experiment did not mention ‘gay’ anywhere in the ad text.”
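The inference the researchers describe can be sketched as follows. This is a hypothetical illustration, not code from the paper: the campaign name, identifiers, and function are invented for clarity. The point is that the advertiser already knows the targeting criteria of its own campaign, so joining them with a click’s unique identifiers is trivial.

```python
# Hypothetical sketch of the inference described in the paper: an advertiser
# running a campaign targeted at a sensitive attribute can tag every click
# with that attribute, even though the ad text never mentions it.

# Campaigns the advertiser has configured; the targeting criteria are known
# to the advertiser, just not to the user who clicks.
campaigns = {
    "nursing-program-42": {"targeting": {"gender": "male", "interested_in": "male"}},
}

def record_click(campaign_id, cookie_id, ip_address):
    """Join a click's unique identifiers with the campaign's targeting."""
    targeting = campaigns[campaign_id]["targeting"]
    return {
        "cookie": cookie_id,               # persistent unique identifier
        "ip": ip_address,                  # sent with every HTTP request
        "inferred_attributes": targeting,  # leaked by the act of clicking
    }

profile = record_click("nursing-program-42", "cookie-abc123", "203.0.113.7")
print(profile["inferred_attributes"])
# → {'gender': 'male', 'interested_in': 'male'}
```

No personally identifiable information needs to flow from Facebook to the advertiser for this to work; the click itself, combined with the advertiser’s own targeting settings, is enough.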
The scenario would appear to violate Facebook’s advertising policy, which says “Any targeting of adverts based on a user attribute such as age, gender, location or interest, must be directly relevant to the offer and cannot be done by a method inconsistent with privacy and data policies.”
A Facebook spokeswoman downplayed the study, saying that the site does not pass any personally identifiable information back to an advertiser.
Christopher Soghoian, a doctoral candidate at the School of Informatics and Computing at Indiana University, wrote on his blog that Facebook could deal with the issue in a couple of ways.
The site could simply stop allowing advertisers to target ads based on sensitive information, such as sexual preference or political affiliations, or it could inform users that an ad was targeted based on a specific attribute of their profile, Soghoian wrote.
“Users should also be told, after clicking on the ad, but before being directed to the site, that the advertiser may be able to learn this sensitive information about them, simply by visiting the site,” Soghoian wrote. “I suspect that neither option is going to be something that Facebook is going to want to embrace.”
The research paper, “Challenges in Measuring Online Advertising Systems,” was written by Saikat Guha of Microsoft Research India, and Bin Cheng and Paul Francis, both of the Max Planck Institute for Software Systems in Germany.