Web giants such as Google and Facebook should pre-screen user content before it goes online, since offensive material threatens the Internet’s overall health, according to a U.K. government report released Thursday.
The recommendation comes as social-networking, video-sharing and other Web sites battle problems with cyberbullying, violence and offensive material.
The report, from the House of Commons’ Culture, Media and Sport Committee, advocates assigning a government minister to oversee Internet safety as well as other issues such as P-to-P (peer-to-peer) file sharing and targeted advertising systems.
Google depends on users to report offensive videos on YouTube and removes flagged videos that break its rules within an hour, the company told the committee.
But the committee rejected Google’s claim that pre-screening content is impossible given the volume of uploads, with 10 hours of video posted to YouTube.com every minute.
“We found the arguments put forward by Google/YouTube against their staff undertaking any kind of proactive screening to be unconvincing,” the report said. “Major providers such as MySpace have not been deterred from reviewing material posted on their sites.”
MySpace reported that it reviewed “each image and video that is uploaded to the MySpace server.” Committee members traveled to MySpace’s U.S. offices and saw “several hundred” people reviewing material. MySpace takes offensive material offline within two hours but is trying to cut that time to an hour.
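To make the distinction the committee is drawing concrete, here is a minimal sketch of the two moderation models described above. All class and field names are invented for illustration; this is a sketch of the workflows, not of Google’s or MySpace’s actual systems.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from queue import Queue
from typing import Optional

# Hypothetical sketch only: names are invented, and neither class
# describes Google's or MySpace's real infrastructure.

TAKEDOWN_SLA = timedelta(hours=1)  # the turnaround Google cited for flagged videos

@dataclass
class Upload:
    video_id: str
    live: bool = False
    flagged_at: Optional[datetime] = None

class ReactiveModeration:
    """YouTube-style: content goes live at once; review happens after a flag."""

    def __init__(self) -> None:
        self.review_queue: "Queue[Upload]" = Queue()

    def publish(self, upload: Upload) -> None:
        upload.live = True  # no human looks at it before it goes online

    def flag(self, upload: Upload) -> None:
        upload.flagged_at = datetime.utcnow()
        self.review_queue.put(upload)  # reviewers must act within TAKEDOWN_SLA

class ProactiveModeration:
    """MySpace-style: every upload is held offline until a reviewer clears it."""

    def __init__(self) -> None:
        self.pending: "Queue[Upload]" = Queue()

    def publish(self, upload: Upload) -> None:
        self.pending.put(upload)  # not live yet; a reviewer must approve it

    def approve(self, upload: Upload) -> None:
        upload.live = True  # goes online only after review
```

The trade-off the two models encode is latency versus exposure: pre-screening delays publication for every upload, while reactive review leaves offensive material online until a reviewer acts on a flag.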
The European Union’s E-commerce Directive does not require service providers to pre-screen content, nor does it hold them responsible for illegal content posted on their networks. Google is using that regulation as a prime defense in Italy.
Italian prosecutors are considering filing criminal charges against four company executives for allowing a video of a disabled child being bullied to be posted on Google’s Video service. The video, posted in September 2006, was online for only a few hours but chalked up 12,000 views.
Overall, the committee recommended that the technology industry agree on minimum standards for take-down times for offensive material.
“We find it shocking that a take-down time of 24 hours for removal of child abuse content should be an industry standard,” the report said.
Additionally, sites should make their terms and conditions or acceptable-use policies more prominent for users, it said.
Google’s top lawyer, Kent Walker, told the committee the company would consider using data on users’ histories to minimize the amount of bad material posted. If implemented, that could raise privacy concerns among users.
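Walker gave no design details, so the following is only one speculative reading of “data on users’ histories”: a gate that holds uploads from new or previously sanctioned accounts for review instead of publishing them immediately. All thresholds and field names below are invented.

```python
# Hypothetical sketch of history-based gating, one possible reading of
# Walker's remark; the thresholds and fields are invented for illustration.

VIOLATION_THRESHOLD = 2  # accounts at or past this get held for human review
MIN_ACCOUNT_AGE_DAYS = 7  # very new accounts are treated as higher risk

def should_hold_for_review(user_history: dict) -> bool:
    """Return True if an upload should be queued for review before going live."""
    strikes = user_history.get("confirmed_violations", 0)
    account_age_days = user_history.get("account_age_days", 0)
    return strikes >= VIOLATION_THRESHOLD or account_age_days < MIN_ACCOUNT_AGE_DAYS

# A repeat offender is held; an established account with one strike is not.
assert should_hold_for_review({"confirmed_violations": 2, "account_age_days": 400})
assert not should_hold_for_review({"confirmed_violations": 1, "account_age_days": 30})
```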
The report also said companies should provide a one-click mechanism for reporting suspect content to law enforcement, a feature that is not widely used today. That suggestion also raises the question of how law enforcement would handle a flood of reports.
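As a rough illustration of what such a mechanism might transmit, the sketch below packages a report for an agency intake queue; the function and payload fields are hypothetical, since the report does not specify a design.

```python
# Hypothetical sketch of a "one-click" report payload; the function name,
# fields, and forwarding target are all invented for illustration.

import json
from datetime import datetime, timezone

def build_law_enforcement_report(content_url: str, reporter_id: str) -> str:
    """Package the minimum context an agency would need to triage a report."""
    report = {
        "content_url": content_url,
        "reporter_id": reporter_id,  # or an anonymous token
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "category": "suspected_illegal_content",
    }
    return json.dumps(report)

# A single click would submit this payload to the relevant agency's intake
# queue; the open question the article raises is whether that queue could
# keep up with the resulting volume.
```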
The report stops short of recommending legislation to control content, suggesting that companies should instead agree on industry-wide policies.
But if the government were to impose regulation and security mandates, it could stifle innovation around user-generated content, which many companies hope will drive new business, said Martin Warner, an IT commentator who studies the issue.
“Everyone knows that the whole future of the Web is what we do with content,” Warner said.