98% of shoppers read reviews, but AI erodes trust

In a marketplace where 98% of consumers read reviews before buying, nearly 14% of 73 million analyzed reviews were likely fake, with 2.3 million confidently identified as partly or entirely AI-generated.

Nina Kapoor

April 18, 2026 · 7 min read

[Image: A shopper in a digital marketplace surrounded by glitching, fake AI-generated reviews, symbolizing the erosion of consumer trust.]

In a marketplace where 98% of consumers read reviews before buying, nearly 14% of 73 million analyzed reviews were likely fake, with 2.3 million confidently identified as partly or entirely AI-generated. This volume of fabricated content compromises the primary tool consumers use for informed decision-making: at such scale, even diligent review-reading becomes unreliable, leading to widespread misinformed purchases.

Consumers overwhelmingly rely on online reviews to make purchasing decisions, but a rapidly growing share of these reviews is fabricated or AI-generated, making authenticity increasingly difficult to ascertain. This tension creates an environment where genuine feedback struggles to gain traction amid sophisticated artificial content, directly undermining consumer trust in online reviews and influencer marketing in 2026.

The digital marketplace is becoming a minefield of manipulated information, likely forcing consumers to become more discerning and pushing brands towards more transparent, direct engagement strategies to differentiate themselves. This shift represents an authenticity arms race, where legitimate businesses must invest heavily in proving their veracity rather than solely focusing on product quality or service excellence.

The Unshakeable Reliance on Reviews

  • 98% — of consumers read reviews before making a purchase, according to Forbes and Dixa, underscoring the foundational role reviews play in pre-purchase research.
  • 97% — of participants indicated customer reviews factor into their buying decisions, according to Dixa, showing that reviews are not just read but actively shape purchasing choices.
  • 79% — of customers put as much weight on customer reviews as on personal recommendations, according to Dixa, highlighting that collective online feedback carries the same trust as direct peer advice.

These figures demonstrate consumers' profound reliance on online reviews, making their authenticity non-negotiable for informed purchasing decisions. The widespread habit of consulting reviews before purchase also creates a critical vulnerability: if the reviews themselves are compromised, the entire decision-making process is flawed, and consumer diligence becomes ineffective.

The Rising Tide of Fake and AI-Generated Feedback

  • Reviews analyzed: 73 million (AP News)
  • Likely fake reviews: nearly 14% of the 73 million (AP News)
  • Confidently AI-generated reviews: 2.3 million, partly or entirely (AP News)
  • Growth trend of AI reviews: multiplied since mid-2023 (AP News)
  • Regulatory action: FTC sued the company behind the AI tool Rytr (AP News)

Data compiled from AP News, reflecting analysis by The Transparency Company and regulatory actions.

The Transparency Company's analysis of 73 million reviews revealed that nearly 14% were likely fake, with 2.3 million confidently identified as partly or entirely AI-generated, according to AP News. This proliferation is not static; The Transparency Company began detecting large numbers of AI-generated reviews in mid-2023, and they have multiplied significantly since then. The Federal Trade Commission's lawsuit against the company behind the AI writing tool Rytr, accusing it of offering a service that could pollute the marketplace with fraudulent reviews, further underscores the gravity of the situation, as reported by AP News.
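A quick back-of-envelope calculation puts these figures in perspective. The numbers below use the "nearly 14%" rate and the 2.3 million AI-generated count reported by AP News; the exact rate is an approximation, so the results are rough estimates rather than published totals.

```python
# Rough scale estimates from the AP News figures cited above.
reviews_analyzed = 73_000_000
likely_fake_rate = 0.14        # "nearly 14%" (approximation)
ai_confident = 2_300_000       # confidently AI-generated, partly or entirely

likely_fake = reviews_analyzed * likely_fake_rate   # ≈ 10.2 million likely fake
ai_share = ai_confident / reviews_analyzed          # ≈ 3.2% of all reviews analyzed

print(f"Likely fake reviews: ~{likely_fake / 1e6:.1f} million")
print(f"Confidently AI-generated share: ~{ai_share * 100:.1f}%")
```

In other words, roughly ten million of the analyzed reviews were likely fake, and about one in thirty was confidently flagged as AI-generated.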

The rapid escalation and sophisticated nature of AI-generated reviews, coupled with regulatory action, confirm a significant and growing threat to the integrity of online feedback. The focus has shifted from isolated incidents of fraud to a systemic, industrial-scale problem in which tools designed for content creation are weaponized to erode consumer trust on a massive scale.

The Incentives Behind Manipulation

The primary driver behind the proliferation of fake reviews is financial. Some traders publish or promote fake reviews to improve their own reputation or damage competitors', according to PMC. That competitive pressure, combined with the proven impact of reviews on purchasing decisions, creates a strong motivation for unethical businesses to manipulate online perception.

This manipulation is further fueled by consumer behavior: the average consumer reads around 10 reviews before forming an opinion of trust, according to Forbes. That depth of engagement means even a small share of fake reviews can significantly sway perception, making the investment in generating them appear worthwhile to bad actors. And while 80% of brands are maintaining or increasing their influencer marketing budgets, according to Forbes, that alternative trust-building strategy faces its own authenticity challenges, opening a parallel front in the battle for consumer confidence.
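The interaction between these two statistics can be made concrete. If roughly 14% of reviews are fake and a shopper reads 10 of them, then under the simplifying assumption that fakes are spread uniformly across listings (which they are not in practice; fakes cluster on targeted products), the chance of encountering at least one fake is high:

```python
fake_rate = 0.14      # share of reviews likely fake (AP News)
reviews_read = 10     # average reviews read before trusting (Forbes)

# Assuming fakes are uniformly distributed (a deliberate simplification),
# the probability that at least one of the 10 reviews read is fake:
p_at_least_one_fake = 1 - (1 - fake_rate) ** reviews_read
print(f"~{p_at_least_one_fake:.0%}")  # roughly 78%
```

Even under this generous assumption, close to four in five shoppers performing typical due diligence would encounter at least one fabricated review.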

The strong financial incentives for reputation manipulation, coupled with consumers' active search for review-based trust, create fertile ground for deliberately deceptive content and alternative influence strategies. The result is an authenticity arms race in which legitimate businesses must not only deliver quality products and services but also constantly prove genuine engagement against a backdrop of sophisticated, AI-powered deception.

The Erosion of Trust and Impact on Businesses

The pervasive presence of fake reviews fundamentally complicates the relationship between consumers and businesses. While 47% of customers spread the word about a positive experience and 95% do so about a negative one, according to Dixa, the sheer volume of fabricated content makes these genuine expressions harder to discern: authentic word-of-mouth, both positive and negative, risks being drowned out by artificial signals.

Brand responsiveness, a crucial component of trust, also faces devaluation. For example, 88% of consumers are more likely to buy from a company that replies to all its reviews, according to Forbes. But if consumers cannot differentiate authentic reviews from AI-generated ones, the effort legitimate businesses invest in engaging with customer feedback loses impact, trapping them in a cycle where proving authenticity becomes as critical as, if not more critical than, the quality of their interactions.

While genuine customer experiences and responsive brand engagement are powerful drivers of trust, the pervasive presence of fake reviews dilutes these authentic signals, making it harder for both consumers and legitimate businesses to navigate the marketplace. Legitimate businesses are forced to divert resources from innovation and customer service towards verifying their online presence, a cost that ultimately impacts consumers and stifles fair competition.

What's Next for Consumer Trust in 2026

The widespread proliferation of AI-generated reviews fundamentally compromises consumer diligence, forcing legitimate businesses into an unsustainable authenticity arms race.

  • Based on The Transparency Company's finding of 2.3 million confidently AI-generated reviews multiplying since mid-2023, businesses that fail to actively verify and showcase authentic customer feedback are ceding control of their online reputation to sophisticated AI deception; proactive measures are no longer optional.
  • With 98% of consumers reading reviews while nearly 14% are likely fake, consumer trust is not just eroding but is being actively weaponized against legitimate businesses, forcing them to invest in proving authenticity rather than solely delivering quality and redefining the competitive landscape around verification.

The digital economy is approaching a critical juncture where the integrity of online reviews, a cornerstone of consumer decision-making, is severely compromised, necessitating a fundamental re-evaluation of how brands build and maintain trust. Businesses must move beyond simply collecting reviews and instead focus on transparently verifying their origins, engaging with genuine feedback, and potentially shifting emphasis towards direct consumer relationships and verifiable third-party endorsements that are harder to fake. The continued rise of AI-generated content means platforms and businesses must innovate rapidly to implement robust detection mechanisms and foster environments where authenticity can thrive.

Key Takeaways for 2026

  • Consumer diligence in reading an average of 10 reviews before trusting is rendered futile by the 2.3 million confidently AI-generated reviews multiplying since mid-2023.
  • The Federal Trade Commission's lawsuit against the company behind Rytr highlights the industrialization of fake review generation, indicating a systemic challenge to market integrity.
  • Legitimate businesses must invest in proving authenticity, as 98% of consumers read reviews while nearly 14% are likely fake, weaponizing trust against them.

How do online reviews impact consumer purchasing decisions in 2026?

Online reviews hold significant sway, with 97% of participants indicating they factor into buying decisions, according to Dixa. This impact is profound, as consumers often weigh reviews as heavily as personal recommendations, making them a primary source of information before a purchase. However, the increasing presence of fake and AI-generated reviews means that this impact can be misleading, driving decisions based on manipulated information rather than genuine feedback.

What are the most effective influencer marketing strategies for building trust in 2026?

Building trust in influencer marketing in 2026 requires transparency and authentic alignment between influencers and brands. With 80% of brands maintaining or increasing their influencer marketing budgets, according to Forbes, strategies that prioritize genuine engagement and disclosure of partnerships are crucial. This approach helps consumers differentiate credible endorsements from purely commercial arrangements, fostering a more reliable connection.

How can businesses combat fake online reviews in 2026?

Businesses in 2026 can combat fake online reviews by actively verifying the authenticity of customer feedback and showcasing these efforts transparently. The Transparency Company's finding of 2.3 million confidently AI-generated reviews means businesses must adopt robust internal systems and potentially leverage AI-powered detection tools. Proactive engagement with all reviews, as 88% of consumers are more likely to buy from companies that reply, also signals legitimacy and responsiveness to genuine customer experiences.
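One signal such detection tools commonly exploit is templated repetition: batches of machine-generated reviews often reuse near-identical phrasing. The sketch below is a deliberately minimal illustration of that single heuristic using Python's standard library; the function name and threshold are my own choices, and production systems combine many more signals (account history, posting bursts, linguistic fingerprints) than text similarity alone.

```python
from difflib import SequenceMatcher

def flag_near_duplicates(reviews, threshold=0.9):
    """Flag pairs of reviews whose text similarity exceeds threshold.

    A crude heuristic: large batches of AI-generated reviews often reuse
    templated phrasing, so unusually similar pairs warrant a closer look.
    Illustrative only; real detection combines many additional signals.
    """
    flagged = []
    for i in range(len(reviews)):
        for j in range(i + 1, len(reviews)):
            ratio = SequenceMatcher(
                None, reviews[i].lower(), reviews[j].lower()
            ).ratio()
            if ratio >= threshold:
                flagged.append((i, j, round(ratio, 2)))
    return flagged

reviews = [
    "Absolutely love this product, exceeded my expectations in every way!",
    "Absolutely love this product, it exceeded my expectations in every way!",
    "Arrived late and the packaging was damaged. Disappointed.",
]
print(flag_near_duplicates(reviews))  # flags the pair (0, 1)
```

Pairwise comparison is quadratic in the number of reviews, so at platform scale the same idea is typically implemented with locality-sensitive hashing or embedding similarity rather than exhaustive matching.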

By Q3 2026, many legitimate businesses, particularly those in competitive e-commerce sectors, will likely be forced to adopt advanced AI-powered review verification systems to counter the escalating threat of AI-generated deception. This investment will be critical for maintaining consumer trust and ensuring that their genuine customer feedback can cut through the noise of fabricated content.