Red Points’ research has demonstrated that product reviews are an essential tool for consumers evaluating the trustworthiness of online listings. Ecommerce sellers know this too, however, and they know how to manipulate online opinion.
Product reviews and comments on online listings are the most important indicator of trustworthiness for consumers. This has been shown repeatedly in market research conducted by Red Points. The graph below is an excerpt from market research on consumer attitudes towards counterfeit watches, in which the importance of reviews is made very clear.
The second most important indicator is the product’s star rating, another factor that ecommerce sellers can easily manipulate. These trends are consistent across consumers in a number of different product sectors, including cosmetics, outdoor apparel and more. All research is available for free download on our resources page. But are these reviews really trustworthy?
Fake product reviews
Sellers on ecommerce platforms understand how important product reviews and star ratings are to generating sales online; after all, it’s common knowledge that the best form of marketing is word of mouth. Consumers trust the recommendations of other consumers far more than anything a brand says about its own product. For this reason, sellers find it critically important to know how to get product reviews on Amazon, and a number of methods designed to manipulate these indicators with fake product reviews have emerged.
Ecommerce platforms’ review policies
Before discussing review manipulation tactics, it would be wise to see what the ecommerce platforms have to say on the matter.
Amazon’s review policy (image below) is clearly outlined and can be found under the site’s community guidelines. It rules out a broad range of actions that would unfairly influence a product’s ratings. AliExpress’s community posting rules likewise explicitly prohibit selling, buying, bartering for or giving away feedback. eBay’s community rules simply forbid “Artificially increasing or decreasing any community content, including ratings and reviews”, a vaguer message than Amazon’s.
Paying for ecommerce reviews
In recent years, there has been a pervasive problem of online merchants paying to have their products reviewed favourably. The response from ecommerce platforms was swift and forceful. By late 2015, Amazon had launched lawsuits against more than a thousand individuals for providing fake reviews to sellers on the platform, taking a clear stance against the practice.
However, companies that pay for product reviews still exist. Paid reviewers, and the companies willing to employ them, have simply wised up since Amazon’s lawsuits began. Paid-for reviews haven’t disappeared; they’ve just become more covert.
Review sites and Facebook groups
Since 2015, Amazon Reviews groups on websites like Facebook have taken the place of the paid reviewer. While Amazon and other ecommerce platforms have been cracking down on fake reviews on their own sites, counterfeiters have responded by building their reputations on social media, which is far less regulated, before driving customers back to ecommerce platforms for the sale.
Instead of offering money to a selection of professionals for reviews, companies and ecommerce sellers now offer free samples of their products in exchange for reviews.
Here’s how these groups work: a seller makes a post in a group like this (image below), offering a fixed number of items available for review. Interested reviewers privately contact the seller and reach an agreement. The reviewer makes a legitimate purchase of the product and, when it arrives, writes a review, sometimes with images and video. The seller may request that certain keywords or hashtags be included in the review to improve visibility. At this point, the seller refunds the purchase, and the reviewer keeps the free sample.
Of course, there are always exceptions. The seller in the image above can be seen offering a 107% refund of their product’s price for reviews. Postings like this are not just offering free products as quick workarounds of Amazon’s rules, but clear financial rewards. The implications for the scale of the fake-review industry are huge: by offering direct payments for reviews instead of free products, the entire operation becomes scalable and even profitable for reviewers, who can produce a high number of reviews in a short period of time.
Crowdturfing

Russian troll farms are coordinated teams of internet users who spam political messages across online platforms, working to manually spread political “dezinformatsiya”. Many companies have adopted similar tactics for the benefit of their business, or even to the detriment of their competition. The term “fake news” has been bandied about too casually in recent years, but it remains a real issue in online spaces.
Enter crowdturfing: a portmanteau of crowdsourcing, the act of recruiting a large number of people to each do a small amount of work on a big task, and astroturfing, the manufacturing of artificial grassroots support. Combined, the two ideas describe a service, available to almost anyone, that repeatedly spams comments on social media and message boards like Reddit. The messages can garner support for, or stir up disapproval against, anything. The strategy has been used by climate change deniers and tobacco lobbyists in attempts to shift public perception.
When Hillary Clinton released her memoir of the 2016 US elections, the Amazon listing was flooded with fake reviews. Within a day of publication, the book had received 1,500 wildly divisive reviews, almost all either 1- or 5-star ratings, many with seemingly coordinated written commentary. Whether these reviews were organised by an outside actor or were simply the result of polarised political opinion, it’s safe to say that unbiased commentary on the book was hard to find in its reviews section.
Is AI used to generate fake reviews online?
With the blistering speed at which technology, especially artificial intelligence (AI), is improving, one might assume that AI is already involved in fake reviews. However, while computers can already write text that convinces people it is not computer-generated, researchers at the University of Chicago have been unable to find evidence that AI is being used to generate reviews online. The deep-learning technology required to produce human-mimicking content demands a huge amount of processing power from neural networks, and using such a service is far more complicated and costly than simply hiring a large team of low-paid workers to manually type commentary.
This is no cause to relax, however: the price of technology is always dropping. Technology like this is on the cusp of being widely applicable, and as time goes on it will become a more realistic and affordable tool for businesses, organisations and eventually even individuals. Without regulation and oversight of how AI is used, the implications could prove extremely detrimental in the future.