Weeding out fake reviews online takes care, incentive

Taiwan's Fair Trade Commission this week fined Korean conglomerate Samsung $340,000 for "astroturfing."

Specifically, the Taiwanese FTC said Samsung paid two "marketing firms" more than $100,000 to hire people to "highlight the shortcomings of competing products," engage in the "disinfection of negative news about Samsung products," positively review Samsung products and, in a bizarre turn of phrase, do "palindromic Samsung product marketing," whatever that means.

Wait, what's 'astroturfing'?

Samsung was fined for paying a "large number of hired writers and designated employees" to post comments in online forums praising Samsung and criticizing competitors.

Astroturf is a brand of fake grass; "astroturfing" is a reference to a fake "grass-roots" movement.

The practice of astroturfing has a long and sordid history. The term was coined by U.S. Sen. Lloyd Bentsen of Texas in 1985, referring to a letter-writing campaign orchestrated by the insurance industry. But the tactic itself is far older: astroturfing has been a tool of political dirty tricks since the Roman Empire.


The rise of the Internet, online message boards and social media—and with it, the rising influence of "the crowd"—has brought the practice to business, including the mobile computing industry.

Sites like Fiverr host astroturfing transactions openly. A Fiverr user named "Jay from India," for example, offers to promote your iOS, Android or BlackBerry app on 25 online forums for $10.

Astroturfing scale ranges from the local business where the owner asks family and friends to write positive online reviews to the biggest sustained astroturfing campaign in history: China's 50-cent army.

The Chinese government reportedly pays as many as 300,000 people to post pro-government comments on forums, message boards and social media sites within China and around the world. The effort has reportedly been going on for years as part of a sustained policy.

(This disinformation effort bolsters the government's massive system for monitoring social media worldwide, which reportedly employs 2 million people.)

While China allegedly relies upon sheer manpower to overwhelm global public opinion about the Chinese government, other organizations use automation.

A class of software called "persona management software" magnifies the effectiveness of each paid fake opinion writer by auto-generating a credible but phony online persona (also called a "sockpuppet"), including a fake name, email address, web site, social media profiles and other data. The software creates fake online activity to give the non-existent users a "history" or online "footprint."
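As a rough illustration of what such tooling automates, here is a minimal sketch in Python. Everything in it—the names, the placeholder email domains, the field layout—is invented for the example; real persona management products are far more elaborate, seeding social profiles and long posting histories.

```python
import random

# Illustrative sketch only: all names, domains and fields below are invented
# to show the general shape of "persona management" tooling, not any real product.
FIRST_NAMES = ["Alex", "Sam", "Jordan", "Taylor", "Morgan"]
LAST_NAMES = ["Lee", "Chen", "Smith", "Garcia", "Patel"]
FREE_MAIL = ["example-mail.com", "example-webmail.net"]  # placeholder domains

def make_persona(rng: random.Random) -> dict:
    """Auto-generate one fake-but-plausible online identity (a 'sockpuppet')."""
    first = rng.choice(FIRST_NAMES)
    last = rng.choice(LAST_NAMES)
    handle = f"{first.lower()}{last.lower()}{rng.randint(10, 99)}"
    return {
        "name": f"{first} {last}",
        "email": f"{handle}@{rng.choice(FREE_MAIL)}",
        "handle": handle,
        # A fabricated posting "history" gives the account an online footprint.
        "history": [f"post-{rng.randint(1000, 9999)}" for _ in range(rng.randint(3, 8))],
    }

if __name__ == "__main__":
    rng = random.Random(42)
    for persona in (make_persona(rng) for _ in range(3)):
        print(persona["handle"], persona["email"], len(persona["history"]))
```

The point of the sketch is the economics: once identity creation is scripted, one paid writer can operate dozens of apparently independent voices.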

"Persona management software" specific to social networks is called a "social bot."

Some industries rely almost entirely upon web-based reviews, and so astroturfing is rife. Hotels, restaurants, and books are heavily reliant on customer-generated reviews to attract new business.

In New York alone, the state attorney general recently shut down 19 companies engaged in astroturfing on sites like Yelp, Google Local and CitySearch.

Commenters were recruited on sites like Craigslist, Freelancer and oDesk and paid between $1 and $10 per "review."

A new book by NPR's David Folkenflik, called Murdoch's World, claims that Fox News engaged in "institutionalized astroturfing of the Internet" by having employees post pro-Fox comments online over untraceable wireless connections. One even used a dial-up AOL account. Another used 100 fake personas, according to the book.

What's wrong with astroturfing?

A Nielsen study from last year determined that 70 percent of people surveyed trust online user reviews and that 83 percent are influenced in their buying decisions by these reviews.

A Gartner report from about a year ago predicts that by 2014, between 10 percent and 15 percent of all social media "reviews" will be fake astroturfed opinions paid for by various companies.

This is problematic because buyers can be misled, and in unpredictable ways.

When I posted a short item on Google+ about the Samsung fine this week, an alarmingly large percentage of commenters expressed a belief that astroturfing is a common problem in the industry and that all major companies do it.

This belief is a problem for two reasons. First, just as astroturfing itself leads consumers astray by making them believe fake opinions, the belief that astroturfing is common leads consumers astray by making them doubt real opinions.

Second, a widespread belief that "everybody astroturfs" is itself an incentive for companies to engage in astroturfing. Why not benefit from the practice if consumers already believe you do it?

What to do about astroturfing

Some day soon, there may be a widely deployed software solution to the problem of paid astroturfing of online comments.

Cornell University researchers have created an algorithm that can detect astroturfed comments. They claim it can identify fake opinion posts about 90 percent of the time. It would be helpful for a company like Google to deploy something like this to get a "second opinion" about whether comments posted online are real or fake.
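The Cornell work used machine learning over text features. As a much cruder illustration of the underlying idea, the toy Python heuristic below scores a review by the density of shallow cues—superlatives and self-reference—that deceptive reviews have been reported to overuse. The word lists and scoring are invented for this sketch and are nowhere near a real classifier:

```python
import re

# Toy heuristic only -- NOT the Cornell classifier. It simply counts
# shallow cues (hype words and self-reference) as a fraction of all words.
SUPERLATIVES = {"best", "amazing", "perfect", "incredible", "worst", "terrible"}
SELF_WORDS = {"i", "me", "my", "we", "our"}

def suspicion_score(review: str) -> float:
    """Return a 0..1 score: the fraction of words that are hype or self-reference."""
    words = re.findall(r"[a-z']+", review.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in SUPERLATIVES or w in SELF_WORDS)
    return hits / len(words)

if __name__ == "__main__":
    gushing = "I had the best stay of my life, amazing staff, perfect room!"
    measured = "Clean room, average breakfast, checkout took ten minutes."
    print(round(suspicion_score(gushing), 2), round(suspicion_score(measured), 2))
```

A production system would learn its features from labeled data rather than hand-pick them, but the contrast between the two sample reviews shows why even simple text statistics carry signal.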

In the meantime, we're each on our own. Helpfully, Time magazine recently published a handy guide for how to spot fake online comments and reviews.

The bottom line is that Samsung is not the only company engaged in astroturfing—not by a long shot. It's a widespread practice, and one that's difficult to detect.

Yes, we should all consult user reviews, but approach them with healthy skepticism. More importantly, we should heavily favor reviews by professional reviewers in reputable publications before buying products.

Such journalist reviewers are not only skillful and experienced at writing product reviews, they're actually paid to be objective.
