Amazon to crack down on fake reviewers using AI technology
Amazon has said it will use artificial intelligence (AI) to crack down on fake reviews ahead of Prime Day 2023. The technology will be used to identify comments that aren’t genuine on the shopping website.
The shopping site has been grappling with fake review “brokers”, who pay users money or offer free items in exchange for writing reviews of products. Amazon has now invested in machine learning models that analyse data to help detect fraudulent behaviour.
However, UK consumer group Which? has said the action is still “nowhere near enough”, just months after the consumer group uncovered Facebook fake review factories. Fake review brokers currently use social media to buy, sell and host fake reviews, a practice that could be challenged under the UK Government’s new Digital Markets, Competition and Consumers Bill.
Fake reviews are dangerous because they can sway customers into purchasing decisions based on what they believe to be genuine feedback from other shoppers. They can be used to boost a seller’s ratings or to undermine a rival business.
Fake reviews aren’t always easy to spot, but generic wording or an unusually high percentage of five-star reviews can be key identifiers. Last year, Amazon reported more than 23,000 social media groups, with over 46 million members and followers, that had been facilitating fake reviews.
Rocio Concha, Which? Director of Policy and Advocacy, said: “Which? has uncovered that Facebook fake review factories trading reviews for Amazon and other sites can still be found easily. Recent government research found that up to one in seven reviews in popular product categories on e-commerce platforms are fake – and our own research has shown how fake reviews make consumers more than twice as likely to choose poor-quality products – so fake and misleading reviews are still a huge problem.
“A better collaborative effort by industry and the government to share data to tackle the problem would be a step in the right direction. However, as Amazon suggests, there needs to be more clarity on enforcement. The government’s Digital Markets, Competition and Consumers Bill will be an important first step to enabling the regulator to clamp down on fake reviews, but the Bill must go further by explicitly making the buying, selling and hosting of fake reviews subject to criminal enforcement.”
Amazon has been using AI to crack down on fake reviews for several years, but has now said it will invest more in “sophisticated tools” to protect customers and sellers on the platform. The company said its fraud-detecting AI looks at a range of factors to calculate whether a review is fake.
The software will look into factors such as an author’s relationships with other online accounts, sign-in activity, review history and any other unusual behaviour. Dharmesh Mehta, the head of Amazon’s customer trust team, told the BBC: "We use machine learning to look for suspicious accounts, to track the relationships between a purchasing account that’s leaving a review and someone selling that product.
"Through a combination of both important vetting and really advanced machine learning and artificial intelligence - that’s looking at different signals or behaviours - we can stop those fake reviews before a customer ever encounters it."