TheMurrow

The Only 6 Questions You Need to Answer Before Trusting Any Product Review

Reviews can be currency—and counterfeit is everywhere. Use this editor-grade framework to spot incentives, suppression, and credibility before you buy.

By TheMurrow Editorial
February 26, 2026

Key Points

  • Use incentives-first thinking: check affiliate links, free products, and hidden relationships before you treat any review as evidence.
  • Scan for suppression signals: too-clean profiles, sentiment-conditioned incentives, and suspicious rating patterns that suggest edited-out doubt.
  • Triangulate with a three-source rule: platform reviews, methodology-driven editorial testing, and community discussions with different incentives.

The first time you realize you’ve been played by reviews, it feels oddly personal. You didn’t buy a bad toaster or hire a mediocre contractor. You bought a story—one written in five-star fragments and enthusiastic blurbs that seemed to come from people like you.

The damage isn’t always small. BrightLocal’s Local Consumer Review Survey 2026 found that 70% of consumers have made a purchase they later regretted after reading reviews, and 14% say that regret cost them more than $1,000. That’s not just buyer’s remorse; it’s a market signal that’s misfiring at scale.

Meanwhile, almost nobody is opting out. The same BrightLocal research reports 97% of consumers read reviews for local businesses, and people now consult an average of six different review sites. Reviews remain the public’s favorite shortcut—and the industry’s favorite lever.

A quiet shift is underway, though. Regulators in the U.S. and UK have moved from scolding bad actors to writing rules meant to change behavior. For readers, that raises an obvious question: if the law is catching up, why does trusting reviews still feel so hard?

“Reviews are no longer just advice from strangers. They’re currency—and currency gets counterfeited.”

— TheMurrow

Why reviews feel less trustworthy than they used to

Review ecosystems have become a high-stakes marketplace where a single rating point can influence discovery, ranking, and sales. The incentive to manipulate is built into the system: star averages determine what you see first, and what you see first shapes what you buy.

That’s why “review fraud” is no longer confined to obvious spam. The more common problem is subtler: incentives, selective visibility, and marketing content dressed as testing. A business doesn’t need to invent hundreds of fake people if it can quietly tilt the playing field—prompting only happy customers to post, burying critical feedback, or pushing reviewers toward positivity.

BrightLocal’s data helps explain why this matters. If 97% of consumers read reviews for local businesses and use six review sites on average, the average person is doing serious homework. Yet 70% still report regret after relying on reviews. That gap is where manipulation thrives: the consumer invests effort, believes they’re being careful, and still loses.

One more complication: review formats have expanded. The “review” might be a Google listing, an Amazon page, a “best-of” SEO article, a TikTok recommendation, or a glossy roundup that looks editorial but functions like a storefront. The same core question applies across all of them: who benefits if you believe this?

What readers actually want (and rarely get)

Across categories—electronics, supplements, local services, software—readers tend to want the same things:

- A simple checklist that travels well from Amazon to contractors.
- Clear red flags that don’t require technical expertise.
- A way to distinguish editorial testing from marketing content.
- Practical tactics for triangulating information across sources.

A trustworthy review doesn’t just persuade. It leaves evidence in its wake.

Key Insight

The core question is consistent across every review format: who benefits if you believe this? If the incentives are hidden, credibility drops.

The new rules: what regulators are trying to fix (and why it matters)

Regulation won’t make every review honest. It does, however, clarify what counts as deception—and gives enforcement agencies sharper tools. Over the last 18 months, the rules of the road have changed in ways that readers should understand, even if they never read a single legal document.

In the United States, the Federal Trade Commission has been tightening standards around endorsements and reviews. The FTC Endorsement Guides (updated 2023) emphasize disclosure of material connections—payments, free products, affiliate commissions, employment ties, family relationships. The principle is simple: if the reviewer has a stake in your purchase, you deserve to know.

Then came the big hammer. In an August 2024 press release, the FTC announced a final rule banning fake reviews and testimonials, explicitly targeting AI-generated fakes, review suppression, and incentivized reviews conditioned on sentiment. The rule took effect 60 days after publication in the Federal Register.

The UK has moved even more aggressively in public messaging. A new consumer regime that took effect April 6, 2025 explicitly bans fake reviews and makes website hosts accountable for preventing and removing them. Under the DMCCA framework, penalties can include fines up to 10% of global turnover, a scale that tends to focus corporate attention.

Disclosures help—but they don’t solve the whole problem

A disclosure can be honest and still misleading in effect. “We may earn a commission” doesn’t tell you whether the reviewer tested competitors, how products were selected, or whether negative experiences were filtered out.

Regulators are targeting extreme abuses—fabrication, suppression, coercion. Readers still have to evaluate the gray zone: the content that’s technically compliant yet editorially thin.

“Legal compliance tells you what a reviewer can’t hide. It doesn’t tell you what they chose not to learn.”

— TheMurrow

The checklist: six questions that reveal whether a review is credible

Most “how to spot fake reviews” advice focuses on tone: too many exclamation points, vague praise, broken English. Those signals can help, but modern manipulation often looks polished. Better to use a framework that works across platforms.

Six credibility questions to run on any review

  1. Who benefits if I buy?
  2. What did they actually do with the product or service?
  3. What’s missing?
  4. Do the numbers look human?
  5. Is there review gating or suppression?
  6. Can I verify across independent places?

1) Who benefits if I buy?

Start with incentives. Look for affiliate links, sponsorships, free products, or “partner” badges. The FTC’s Endorsement Guides stress disclosure of material connections—and readers should treat missing disclosures as a meaningful negative signal.

Practical test: if a review is glowing and conveniently links to purchase pages, ask whether the site earns money when you click or buy. That doesn’t automatically disqualify it, but it changes how you weigh certainty.

2) What did they actually do with the product or service?

Trustworthy reviews describe real use: constraints, setup, tradeoffs, and context. Thin reviews lean on adjectives (“amazing,” “premium,” “life-changing”) without showing work.

For local services, credible reviews often include specifics: what work was done, timelines, communication, and how problems were handled. For products, look for details that would be hard to fake without handling the item.

3) What’s missing?

Absence can be louder than presence. Ask:

- Are there any meaningful negatives?
- Do they mention competitors at all?
- Do they acknowledge who the product is not for?

The best reviewers don’t just recommend. They exclude—and explain why.

4) Do the numbers look human?

You don’t need a spreadsheet to see suspicious patterns. Be wary of:

- A huge pile of five-star ratings with few mid-range reviews.
- A sudden burst of reviews in a short period.
- Ratings that read like copy variations of the same script.

None of these prove fraud. They tell you where to slow down.
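The first two heuristics above can be sketched as a quick script. This is a minimal illustration in Python, assuming you can export star ratings and post dates from a listing; the function name and the thresholds (85% five-star, more than half of all reviews in one week) are illustrative assumptions, not research-backed cutoffs.

```python
from collections import Counter
from datetime import date

def rating_red_flags(ratings, post_dates):
    """Flag suspicious patterns in a set of 1-5 star reviews.

    ratings: list of ints (1-5); post_dates: list of datetime.date.
    Returns human-readable warnings -- cues to slow down, not proof of fraud.
    """
    flags = []
    total = len(ratings)
    counts = Counter(ratings)

    # Heuristic 1: a huge pile of five-star ratings with few mid-range reviews.
    if counts[5] / total > 0.85 and (counts[2] + counts[3]) / total < 0.05:
        flags.append("top-heavy: mostly 5-star, almost no 2-3 star reviews")

    # Heuristic 2: a sudden burst of reviews in a short period.
    newest = max(post_dates)
    last_week = sum(1 for d in post_dates if (newest - d).days <= 7)
    if total > 20 and last_week / total > 0.5:
        flags.append("burst: over half of all reviews posted in one week")

    return flags
```

The third heuristic (reviews that read like copy variations of one script) would need text-similarity comparison and is omitted here; the point is that these checks are mechanical enough that you can run them by eye while scrolling.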

5) Is there review gating or suppression?

The FTC’s 2024 rule directly targets review suppression, including misrepresentations that displayed reviews represent “all or most” when negatives were suppressed. Readers can’t always see behind the curtain, but there are clues: a business with an implausibly pristine profile, or a platform where only “recommended” reviews show.

6) Can I verify across independent places?

BrightLocal reports consumers use six review sites on average—which is both a reality check and a strategy. A pattern that repeats across independent platforms is harder to manufacture.

Your goal isn’t certainty; it’s corroboration.

The biggest red flags: incentives, “social proof,” and review suppression

Modern review manipulation isn’t always fake reviews in the obvious sense. The more common pattern is real reviews shaped by pressure.

The FTC’s 2024 final rule draws a bright line around one especially telling abuse: buying reviews where compensation is conditioned on positive (or negative) sentiment. A discount offered for an honest review is one thing; a discount offered only for five stars is something else. Readers should treat “incentivized positivity” as a structural warning sign, even when the content sounds sincere.

Another core concern is insider reviews—employees, family, contractors, or other connected parties reviewing without disclosure. The FTC has been explicit that these material ties must be clearly disclosed. When the praise is intense and the relationship is hidden, the review becomes less an opinion than an undisclosed advertisement.

Then there’s the illusion of popularity. The FTC rule also addresses fake social indicators—followers, views, other metrics used commercially to imply legitimacy. This matters because review culture now blends into influencer culture: the endorsement might be a star rating, a “what I bought” video, or a “best-of” list that funnels you to a checkout page.

“The most convincing review fraud doesn’t invent trust—it edits out doubt.”

— TheMurrow

Editor’s Note

Treat incentives as context, not proof. A disclosed commission can coexist with honest testing—but it raises the bar for methodology and corroboration.

How to read a review page like an editor (not a customer)

Most people read reviews to be reassured. An editor reads them to find what’s being concealed.

Read the distribution, not the average

A 4.6-star average can hide two very different realities: consistent satisfaction or polarizing performance. Scroll past the highlighted blurbs and look at the one-, two-, and three-star reviews. Not to doomscroll—just to understand failure modes.

When you see a cluster of similar complaints, pay attention to how the business responds. A defensive response can matter. A specific, solution-oriented response can matter more.
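To make the point concrete, here is a hypothetical pair of rating sets (illustrative numbers, not real data) with the identical headline average but very different failure modes:

```python
# Two hypothetical products, both averaging exactly 4.0 stars.
consistent = [4] * 40 + [5] * 30 + [3] * 30   # clustered satisfaction
polarized  = [5] * 70 + [1] * 20 + [3] * 10   # love it or got burned

def average(ratings):
    return sum(ratings) / len(ratings)

def low_star_share(ratings):
    """Fraction of 1- and 2-star reviews: the failure-mode
    signal the average hides."""
    return sum(1 for r in ratings if r <= 2) / len(ratings)

# Same headline number...
assert average(consistent) == average(polarized) == 4.0
# ...but one product fails for 1 in 5 buyers.
assert low_star_share(consistent) == 0.0
assert low_star_share(polarized) == 0.2
```

Reading only the average, the two products are indistinguishable; reading the low-star share tells you which one has a failure mode worth investigating.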

Look for “independence theater”

The FTC’s 2024 rule targets company-controlled “independent” review sites that misrepresent independence. Readers should be skeptical of a review site that looks neutral but exists to drive sales for one brand, one franchise network, or one category of partner.

A practical tell: the site “reviews” only one company’s offerings, or it uses the language of consumer advocacy but routes every click to the same seller.

Separate “I like it” from “it’s good”

A useful review distinguishes between preference and performance. The writer might love a product because it matches their needs, but a serious review explains what happens outside their personal context.

This is where “best-of” SEO lists often fail. They answer “what should I buy?” with a single funnel. They rarely answer “what tradeoffs am I accepting?”—because tradeoffs reduce conversions.

Case studies you’ve probably seen (even if you didn’t label them that way)

Real-world review problems tend to repeat as patterns. Here are three familiar scenarios—across shopping, services, and content.

Case study 1: The “too clean” local business profile

You find a contractor with an immaculate rating and dozens of glowing testimonials. The language feels curiously uniform: “professional,” “great communication,” “highly recommend.” You hire them and discover they’re fine—until something goes wrong and accountability disappears.

What happened? In some cases, the public profile reflects selection, not reality: only the happiest clients were encouraged to post, while dissatisfied clients were funneled into private resolution or quietly discouraged. Regulators now have more to say here: the FTC has targeted review suppression as a deceptive practice when businesses misrepresent the completeness of displayed feedback.

Case study 2: The “best-of” article that reads like a catalog

A roundup promises “the 10 best” products, each with a buy button and confident claims. It looks like journalism. It behaves like a storefront.

Even when disclosures exist, they can be vague. The FTC’s standard—disclose material connections—is necessary. Readers still need to ask whether the list was built from testing, from consensus, or from commission rates.

Case study 3: The influencer recommendation with invisible strings

An influencer praises a supplement, gadget, or app in a way that feels personal. The audience hears authenticity; the platform sees performance marketing.

The FTC has been explicit: paid endorsements require clear disclosure, and the 2024 rule targets deceptive practices including fake testimonials and misrepresented experience. The problem is cultural as much as legal: audiences are trained to read intimacy as credibility. That’s a muscle worth retraining.

Practical tactics: how to triangulate without spending your weekend researching

No one has time to become a full-time fact-checker for every purchase. The goal is a repeatable method that catches most problems quickly.

Build a “three-source rule”

For anything expensive or consequential, use three different kinds of sources:

- A platform review page (Google, Amazon, app store)
- An independent editorial review (testing-focused, clear methodology)
- A community signal (forums, local groups, long-form discussions)

BrightLocal’s finding that consumers use six review sites on average suggests many people already do a version of this. The improvement is intentionality: choose sources that aren’t all downstream of the same incentive.

Use negative reviews as diagnostic tools

One-star reviews are often emotional and messy. They’re still useful because they reveal failure modes: shipping issues, warranty hassles, customer service patterns, durability problems. If the same issue appears repeatedly, treat it as data—even if you suspect some complaints are unfair.

Look for accountability, not perfection

A business with a few negative reviews isn’t necessarily worse. A business with no meaningful criticism might be curating visibility.

Regulators are pushing platforms to take responsibility. The UK’s April 2025 regime explicitly makes website hosts accountable for preventing and removing fake reviews, and the CMA highlighted the seriousness of enforcement in its June 6, 2025 announcement regarding Amazon undertakings to curb fake reviews, under a framework that can involve massive penalties.

That policy direction matters, but readers still need a personal standard: trust the places that show their work.

What “good” looks like: transparency, methodology, and honest limits

A credible review is not an attitude. It’s a set of choices.

Look for disclosures that are specific: affiliate relationships, free products, sponsorships. The FTC’s updated Endorsement Guides (2023) make clear that material connections should be disclosed in a way ordinary readers can notice and understand.

Look for methodology: how products were selected, what was tested, what wasn’t. Testing doesn’t have to be laboratory-grade to be valuable, but it should have constraints and comparisons.

Finally, look for intellectual honesty. A trustworthy reviewer will tell you where their confidence ends. Marketing avoids that sentence. Journalism earns it.

The law is increasingly aligned with the reader’s interest: less fakery, less intimidation, less hidden sponsorship. But the deeper fix remains cultural. We need to reward reviews that are careful, not just confident.

“A five-star rating can be a signal. It should never be the whole story.”

— TheMurrow
About the Author
TheMurrow Editorial covers reviews for TheMurrow.

Frequently Asked Questions

Are fake reviews actually illegal now?

In the U.S., the FTC announced a final rule in August 2024 banning fake reviews and testimonials, including certain AI-generated fakes, and targeting practices like review suppression and sentiment-conditioned incentives. The rule took effect 60 days after publication in the Federal Register. In the UK, fake reviews were explicitly banned under a regime effective April 6, 2025, with enforcement powers and major potential penalties.

What’s the difference between “incentivized reviews” and illegal review buying?

Not every incentive is automatically illegal, but regulators draw lines around deception. The FTC’s 2024 rule targets buying reviews where compensation is conditioned on positive (or negative) sentiment—for example, “we’ll pay you only if you leave five stars.” Even when incentives are disclosed, readers should treat them as a reason to seek corroboration elsewhere.

If a site discloses affiliate links, can I trust the reviews?

Disclosure helps you understand incentives, and the FTC’s Endorsement Guides (updated 2023) emphasize revealing material connections like affiliate commissions. Still, disclosure doesn’t prove testing quality or fairness. Use it as a prompt: check whether the outlet explains methodology, compares competitors, and includes meaningful negatives rather than functioning like a product catalog.

How can I tell if a business is suppressing negative reviews?

You often can’t prove it from the outside, but you can spot clues: an implausibly perfect profile, only glowing “recommended” reviews, or patterns where critical feedback seems absent despite high volume. The FTC’s 2024 rule specifically targets review suppression, including misrepresentations that displayed reviews represent “all or most” when negatives were withheld. Treat “too clean” as a cue to triangulate.

Why do I regret purchases even after reading lots of reviews?

BrightLocal’s Local Consumer Review Survey 2026 found 70% of consumers regretted a purchase after reading reviews, and 14% regretted spending over $1,000. Reviews can be abundant yet misleading when incentives shape what gets posted, when “best-of” content is optimized for sales rather than accuracy, or when you’re seeing curated positivity rather than a full distribution of experiences.

How many sources should I check before I buy something important?

A practical standard is three: a major platform’s review page, an independent editorial review with clear methodology, and a community discussion source. BrightLocal reports consumers use six review sites on average for local businesses—so the behavior is already common. The improvement is choosing sources with different incentives, so you’re not relying on the same pipeline of marketing.
