The British government wants Google, Yahoo, and Microsoft to block Internet searches that are likely to lead to child abuse images. Search providers have until October to commit to blocking results for a blacklist of abusive keywords, or the government will consider legislation to force them to do so, U.K. Prime Minister David Cameron said Monday in a speech.
Some people are putting “simply appalling” terms into search engines in order to find illegal images of child abuse and they’re getting results, Cameron said Sunday in an interview broadcast by the BBC.
The U.K. government needs to have a “very, very strong conversation” with Internet search providers and tell them that they shouldn’t provide results for terms that are “depraved and disgusting,” he said.
The interview was in advance of a speech the Prime Minister gave Monday that focused on removing child abuse content from the Internet and preventing children from accessing pornography.
“Put simply, there needs to be a list of terms—a black list—which offer up no direct search returns,” Cameron said, according to a draft version of the speech text published on his office’s website. “I have a very clear message for Google, Bing, Yahoo and the rest: You have a duty to act on this—and it is a moral duty.”
What might this new Internet look like?
Cameron said he will not accept arguments that banning searches which clearly reflect the “sick and malevolent intent of the searcher” would violate freedom of speech.
Cameron said that more general search queries like “child sex” should return options like: “Do you mean child sex education?” or “Do you mean child gender?” In addition, “abhorrent” combinations of search keywords that are clearly related to child abuse should return no result at all, Cameron said.
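What Cameron describes amounts to two tiers of query handling: ambiguous queries get redirecting suggestions, while blacklisted keyword combinations return nothing. The sketch below is a minimal illustration of that idea only; the term lists, suggestion map, and function names are hypothetical and do not reflect any search engine’s actual implementation.

```python
# Illustrative sketch of two-tier query filtering: blacklisted keyword
# combinations return no results, ambiguous queries return clarifying
# suggestions. All data here is placeholder content.

# Hypothetical blacklist of banned keyword combinations (kept abstract).
BANNED_COMBINATIONS = {
    frozenset({"example", "banned", "terms"}),
}

# Hypothetical map from ambiguous queries to clarifying suggestions.
SUGGESTIONS = {
    "child sex": [
        "Do you mean child sex education?",
        "Do you mean child gender?",
    ],
}


def handle_query(query: str):
    """Return (results_allowed, suggestions) for a raw query string."""
    words = frozenset(query.lower().split())

    # Blacklisted combinations return no results at all.
    for banned in BANNED_COMBINATIONS:
        if banned <= words:
            return False, []

    # Ambiguous queries return clarifying options instead of direct results.
    key = query.lower().strip()
    if key in SUGGESTIONS:
        return True, SUGGESTIONS[key]

    return True, []
```

In such a scheme the blacklist itself would be maintained by a third party rather than the search providers, which is the role Cameron assigns to CEOP below.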
Internet search providers have until October to answer whether they are willing to commit to censoring results for search terms on a blacklist compiled by the Child Exploitation and Online Protection Centre (CEOP), a law enforcement organization affiliated with the U.K.’s Serious Organised Crime Agency (SOCA) that is dedicated to fighting the sexual abuse of children.
If the answer given by the companies in October is not satisfactory, or their progress is slow or non-existent, the government might take a regulatory approach.
“I can tell you we are already looking at the legislative options we have to force action,” Cameron said.
The Prime Minister expects companies to set their “greatest brains” to work on any implementation issues and technical obstacles.
“You hold hackathons for people to solve impossible Internet conundrums,” Cameron said. “Well—hold a hackathon for child safety.”
“We have a zero tolerance attitude to child sexual abuse imagery,” a Google representative said Monday via email. “We use our own systems and work with child safety experts to find it, remove and report it. We recently donated 5 million U.S. dollars to groups working to combat this problem and are committed to continuing the dialogue with the Government on these issues.”
“We will support the work of third parties in running education and deterrence campaigns on our platforms and are already actively engaged in discussions with CEOP and others about their proposals,” a Yahoo representative said via email.
“Microsoft will support education and deterrence campaigns to target those seeking to access indecent and illegal images of children,” a Microsoft representative said in an emailed statement. “Our PhotoDNA technology is already widely used across the industry to help prevent the proliferation of known illegal images and we remain completely committed to helping tackle the scourge of online child abuse content.”
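PhotoDNA itself is proprietary and its interface is not shown here; in broad terms, though, this class of system works by matching content against a database of signatures of known illegal images supplied by hotlines and law enforcement. The sketch below illustrates only that general idea, using an ordinary cryptographic hash as a stand-in for PhotoDNA’s robust perceptual hash, and with a placeholder hash list.

```python
import hashlib

# Hypothetical set of signatures of known illegal images, as would be
# supplied by a hotline or law enforcement body. PhotoDNA uses a robust
# perceptual hash that tolerates resizing and re-encoding; SHA-256 here
# is only a stand-in for illustration.
KNOWN_IMAGE_HASHES = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}


def is_known_image(image_bytes: bytes) -> bool:
    """Return True if the image matches a signature on the known-image list."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES


def allow_upload(image_bytes: bytes) -> bool:
    """Block (return False) uploads that match the known-image list."""
    return not is_known_image(image_bytes)
```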
A question of a free Internet
The Open Rights Group (ORG), a U.K.-based digital rights watchdog organization, doesn’t believe that banning certain search terms would be an effective approach.
“Most child abuse images are circulated in private networks, or are sold by criminal gangs,” said ORG Executive Director Jim Killock in a blog post Sunday. “Banning search terms seems unlikely to combat the serious activity, which is independent of search engines.”
Furthermore, even if some identifiable search terms did bring up illegal images, banning those terms would likely just prompt people looking for such content to invent new terms to find it, Killock said.
“Cameron invites a game of cat and mouse which is likely to have very limited impact,” he said. “The terms used may hide themselves into search terms that cannot be banned because they are innocuous.”
Such a ban might also affect the ability of organizations like the Internet Watch Foundation (IWF) to identify child abuse images using search engines and report them to the proper authorities, Killock said.
Cameron said in his speech that search engine providers already block abusive images reported by the IWF, but mentioned that the organization is too small to identify all of them because it relies almost entirely on members of the public reporting abusive content they’ve seen online.
The IWF did not immediately respond to a request for comment on whether such a ban would impact its ability to identify child abuse images.
“Cameron’s announcement is symptomatic of the way the Internet is viewed and treated by policy makers,” Killock said. “The technical challenges and consequences of policies are viewed as less important than the moral purpose justifying calls for action.”