TheMurrow

Google’s AI Overviews Aren’t ‘Stealing Your Clicks’—They’re Quietly Rewriting What Counts as a ‘Visit’ (and publishers are optimizing the wrong metric)

AI Overviews shift reading and comparison onto Google’s results page—so “engagement” happens without sessions. Pew and Ahrefs suggest the funnel itself is being rewritten.

By TheMurrow Editorial
March 15, 2026

Key Points

  • Track the shift from clicks to on-Google consumption: Pew observed CTR falling to 8% with AI summaries, versus 15% without.
  • Treat rankings as less predictive: Ahrefs found AI Overviews correlate with ~34.5% lower CTR even for the #1 result.
  • Optimize for “presence inside answers”: citations, share of voice, and downstream direct actions may matter more than sessions alone.

The old bargain of search—and how it’s being rewritten

A decade ago, the bargain of the open web felt stable. Google sent you a reader. You gave that reader an answer, a story, a recipe, a review—plus a few ads, maybe a subscription pitch, maybe a newsletter sign-up. Everyone complained about the terms, but everyone understood them.

Now the bargain is being rewritten in plain sight, at the very top of the search results page.

Google’s AI Overviews—those AI-generated summaries that increasingly sit above the “10 blue links”—are not merely another search feature. They change where the “reading” happens, what counts as a “visit,” and which businesses get paid when a user feels informed. Publishers can argue about fairness. Marketers can argue about strategy. Users may simply notice that Google answers more questions without asking them to go anywhere else.

The striking part is how quickly behavior seems to adjust. Pew Research Center, analyzing observed browsing data from 900 U.S. adults in March 2025, found that when an AI-generated summary appeared, users clicked a traditional search result 8% of the time—compared with 15% when no AI summary appeared. That’s not a small tweak to the funnel. That’s a redefinition of the funnel.

“The web’s old exchange—search, click, read—breaks when the reading happens inside Google.”

— TheMurrow Editorial

8% vs. 15%
Pew (March 2025) observed users clicked traditional results 8% of the time with an AI summary present, versus 15% without one.

AI Overviews, according to Google: a summary, plus “links for deeper reading”

Google describes AI Overviews as AI-generated summaries in Search results that synthesize answers and offer links for people who want to go deeper. In a July 2024 explainer PDF, Google says the feature uses a customized Gemini model working “in tandem” with traditional Search systems like ranking and the Knowledge Graph. The company positions AI Overviews as most useful for “information journeys” and complex questions that might otherwise require multiple searches.

Google also emphasizes a familiar-sounding promise: AI Overviews aim to surface information “backed up by top web results,” including links to supporting pages, so users can explore “a range of perspectives.” In that same explainer, Google claims AI Overviews can lead to “greater diversity of websites” being visited—and that clicks from AI Overviews are “higher quality,” meaning users supposedly spend more time on the sites they do choose to visit.

The policy language matters, too. Google says it sets a higher bar for Your Money or Your Life topics, and it tries to avoid showing AI Overviews for hard news, where freshness and factuality are especially sensitive. It also describes restrictions around certain election-related queries.

All of that reads like product stewardship: give people a helpful summary, then send them out to the web. The uncomfortable question is whether the web is still the destination—or whether it’s becoming a citation layer underneath Google’s interface.

What Google’s framing leaves out

Google’s language centers on usefulness and “quality” clicks. Publishers, however, live on volume, predictability, and monetization—metrics that don’t always track with time-on-site. A click that arrives later, or not at all, can be “high quality” and still be economically lethal.

“Google calls them ‘higher quality’ clicks. Publishers call them fewer chances to earn.”

— TheMurrow Editorial

Key Insight

Google’s product language optimizes for user satisfaction inside Search; publishers’ economics optimize for predictable sessions, pageviews, and monetizable attention.

The real shift: AI Overviews don’t just reduce clicks—they redefine the “visit”

The loudest criticism of AI Overviews is simple: they “steal clicks.” That complaint captures the pain but misses the deeper change. AI Overviews shift a growing share of user activity—reading, comparing, synthesizing—into Google’s own interface.

Historically, search traffic followed a legible path: query → click → publisher page → ad impression or subscription pitch. With AI Overviews, the user often gets what they need without leaving Google. Even when the user does “engage,” that engagement may happen through expanding the overview, scanning cited sources, or typing follow-up prompts—actions that look like engagement to Google, but do not register as sessions in a publisher’s analytics.

That matters because publishers have spent two decades optimizing the measurable: click-through rate (CTR), sessions, pageviews, recirculation. AI Overviews push value upstream, toward visibility inside the answer itself.

A more realistic “unit of value” is emerging:

- Citation and mention visibility inside AI answers (brand and topic presence)
- Downstream actions that happen later—direct visits, newsletter sign-ups, app opens, subscriptions—after an “assist” from the AI layer
- Share of voice inside AI Overviews relative to competitors, rather than rank position alone

None of that is comforting if your business model depends on users arriving, reading multiple pages, and seeing multiple ads. Yet it may be the terrain that publishers are forced to compete on.

Why “zero-click” feels different now

Zero-click search is not new; featured snippets and knowledge panels trained users to accept answers on the results page. AI Overviews deepen the effect by delivering something more like a mini-article—coherent, confident, and often sufficient.

What’s changing, structurally

AI Overviews shift “reading” upstream into Google’s interface, so publishers can lose monetizable sessions even when their information is visibly used and cited.

What the best observed data says: AI summaries suppress clicks and end sessions

Plenty of debate about AI in search happens in the realm of anecdotes: a publisher’s traffic dip here, a marketer’s “we’re fine” there. The most useful evidence comes from observing real behavior at scale.

Pew Research Center’s analysis stands out for that reason. Pew examined browsing data from 900 U.S. adults using KnowledgePanel Digital, covering March 1–31, 2025. The dataset included roughly 2.5 million page visits and 1.1 million unique URLs. That scope helps ground the conversation in what people actually do—not what they say they do.

Pew found that 58% of respondents had at least one search in March 2025 that produced an AI-generated summary alongside traditional results. In other words, AI summaries were not an edge case; they were a common part of search behavior for a majority of users in that month.

The click behavior is where the tension sharpens. Summaries of Pew’s findings in industry coverage report that when an AI Overview appears, users click a traditional result 8% of the time, compared with 15% when no AI summary appears. That’s nearly a halving of the likelihood that a search ends with a click to the open web.
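
As a quick check on “nearly a halving,” the relative decline implied by the two Pew rates can be computed directly (the variable names are ours; the rates are Pew’s):

```python
# Relative decline in click likelihood when an AI summary is present,
# using the two Pew figures cited above.
ctr_without_summary = 0.15  # clicks on traditional results, no AI summary
ctr_with_summary = 0.08     # clicks when an AI summary appears

relative_decline = (ctr_without_summary - ctr_with_summary) / ctr_without_summary
print(f"Relative decline: {relative_decline:.1%}")  # → 46.7%
```

A roughly 47% drop in the chance that a search ends on the open web, which is why “nearly a halving” is the fair gloss.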

Another detail should worry anyone who believes AI Overviews will “send traffic” through citations. An eMarketer write-up highlights a reported 1% click rate on the source links within the summary. Even if you accept that number cautiously, the direction is clear: the links exist, but most users don’t treat them as the next step.

2.5M
Pew’s March 2025 dataset covered roughly 2.5 million page visits across observed browsing behavior.

58%
Pew found 58% of respondents encountered at least one AI-generated summary in Search in March 2025.

1%
An eMarketer write-up highlights a reported 1% click rate on source links inside AI Overviews—suggesting citations often don’t translate into visits.

Multiple perspectives: user convenience vs. web economics

From a user’s standpoint, fewer clicks can mean success: less friction, faster answers. From a publisher’s standpoint, fewer clicks mean fewer monetizable moments—fewer ad impressions, fewer subscription prompts, fewer opportunities to build a habit.

“AI Overviews don’t eliminate the web. They make visiting it optional.”

— TheMurrow Editorial

Even the #1 result can lose: what Ahrefs found about CTR compression

If Pew tells us how humans behave, Ahrefs offers a different lens: large-scale keyword analysis. Industry research comes with caveats, but it is often the only way to see broad patterns across many queries.

In an April 17, 2025 analysis by Ryan Law and Xibeijia Guan, Ahrefs examined 300,000 keywords. Their finding is bracing for anyone who has fought for years to rank first: AI Overviews were associated with a ~34.5% lower CTR for the top-ranking page compared with similar informational keywords without AI Overviews.

The headline here is not “SEO is dead.” The headline is that ranking is becoming less predictive of traffic—even when you win.

For publishers, that creates a planning problem. Editorial teams still need to decide what to cover. Revenue teams still need to forecast. When the top result can deliver materially fewer visits simply because an AI layer appears above it, search becomes a shakier foundation.
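
Forecasting under that uncertainty comes down to simple expected-value arithmetic. The sketch below is our own illustration, not a model either study proposes; the keyword volume, baseline CTR, and prevalence figures are hypothetical, and only the ~34.5% penalty comes from the Ahrefs analysis:

```python
def expected_sessions(searches, baseline_ctr, aio_share, aio_ctr_penalty=0.345):
    """Estimate monthly sessions from one keyword when an AI Overview
    appears on some share of searches. aio_ctr_penalty is the ~34.5%
    CTR reduction Ahrefs associated with AI Overviews at position #1."""
    ctr_with_aio = baseline_ctr * (1 - aio_ctr_penalty)
    return searches * ((1 - aio_share) * baseline_ctr + aio_share * ctr_with_aio)

# Hypothetical keyword: 10,000 monthly searches, 30% baseline CTR at #1.
base = expected_sessions(10_000, 0.30, aio_share=0.0)  # no Overviews shown
half = expected_sessions(10_000, 0.30, aio_share=0.5)  # Overview on half of searches
print(round(base), round(half))  # same #1 ranking, materially fewer visits
```

Under these assumed numbers, 50% Overview prevalence cuts expected visits by roughly 17% with no change in ranking: the page didn’t move, the interface above it did.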

For marketers, it complicates attribution. If a user reads an AI Overview and later visits a brand directly, the “assist” is real but hard to measure. Google may count that journey as success inside its ecosystem. The publisher sees only the missing session.

The strategic dilemma

Publishers can either:

- Double down on ranking and accept lower CTR as the new normal, or
- Treat visibility inside AI answers as a new battlefield—one that is harder to track and harder to sell to advertisers

Most will end up doing both, because they have to.

Editor’s Note

The article’s tension isn’t “SEO is dead,” but that rankings can remain strong while traffic becomes less predictable—and harder to forecast.

Google’s “quality clicks” claim—and why publishers dispute the math

Google’s explainer argues that clicks from AI Overviews are “higher quality”—for example, users spend more time on the site when they do click. That may be true in a narrow analytics sense. If AI Overviews filter out casual clicks, the remaining users might be more motivated.

The dispute is about what “quality” means when you run a newsroom or a content business.

A publisher-friendly definition of quality might include:

- Enough volume to support ad revenue and staff costs
- Repeat visitation and habit formation
- Newsletter sign-ups or subscriptions
- Predictable referral streams that can be modeled

Time-on-site alone does not pay reporters. It does not guarantee conversions. It often correlates with lower page depth if the user lands, reads, and leaves—an outcome that can look “high intent” while producing fewer ad impressions.

Google’s claim about greater diversity of websites being visited also deserves scrutiny. Even if AI Overviews cite a wider range of sources, citations do not automatically translate into visits. A site can be “represented” without being visited, the way a book can be quoted without being bought.

Where Google and publishers might both be right

AI Overviews can plausibly improve the user experience for complicated queries and reduce redundant searches. They can also reduce traffic to the very sites that create the information being summarized. Those two realities can coexist; neither cancels the other.

A new kind of competition: “share of voice” inside answers

If the visit is being redefined, publishers need new ways to think about presence. The old scoreboard—rank position and CTR—still matters, but it no longer captures what users see.

The emerging contest looks more like brand competition inside an answer layer:

- Are you cited in AI Overviews for your core topics?
- Are you associated with the right entities and questions?
- Do users recognize your brand when they scan the citations?
- Do you offer something the summary cannot: original reporting, distinctive analysis, tools, community, or authority?

Google says AI Overviews are backed by “top web results” and designed to link out for deeper reading. That suggests a practical reality: traditional ranking signals still matter because the AI system is anchored to the broader search system. In that sense, the AI layer may amplify the benefits of being a trusted, high-performing result—even if it reduces the number of clicks.

Practical takeaways for publishers and editors

No publisher can control whether an AI Overview appears. Publishers can control how legible and valuable their work is when it does.

A pragmatic checklist:

- Write for distinctiveness, not generic coverage. Summaries flatten sameness; they struggle with strong voice and original reporting.
- Make key facts and attribution clear. If AI systems cite “supporting pages,” clear sourcing and structure improve the odds of accurate representation.
- Build direct audience channels. Newsletters, apps, podcasts, and memberships reduce dependence on search sessions that may never come.
- Track visibility, not only clicks. Even if measurement is imperfect, publishers can monitor when their brand appears in AI Overviews for priority queries.
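
Tooling for that last item is immature, but the core metric is simple. As one hedged illustration (the function, the domain names, and the captured data below are all hypothetical), a publisher could compute a crude “share of voice” from citation lists scraped or logged for priority queries:

```python
from urllib.parse import urlparse

def share_of_voice(citation_lists, domain):
    """Fraction of captured AI Overviews (one citation list per query)
    that cite the given domain at least once."""
    hits = sum(
        any(urlparse(url).netloc.endswith(domain) for url in citations)
        for citations in citation_lists
    )
    return hits / len(citation_lists) if citation_lists else 0.0

# Hypothetical captures for three priority queries.
captured = [
    ["https://example-news.com/guide", "https://other.com/post"],
    ["https://other.com/faq"],
    ["https://example-news.com/review", "https://third.org/page"],
]
print(share_of_voice(captured, "example-news.com"))  # cited in 2 of 3 captures
```

Even a rough number like this, tracked over time against competitors, measures presence inside the answer layer in a way sessions never will.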

Case studies in miniature: what changes when the answer sits on Google?

Concrete examples help because the shift is experiential.

The “how do I…” query

Historically: a user searches, clicks a tutorial, scrolls past ads, maybe watches an embedded video, maybe clicks another related guide.

With AI Overviews: the user reads a step-by-step summary at the top of the results page. The user may never open the tutorial. If the user clicks, it’s often to confirm a detail, compare a method, or troubleshoot.

Implication: publishers that relied on high-volume, low-margin “how-to” traffic may see the biggest squeeze—especially when the AI summary satisfies intent.

The “compare options” query

Google positions AI Overviews as ideal for “information journeys” that would otherwise require multiple searches. That includes comparisons: products, approaches, pros and cons.

With AI Overviews: the comparison is pre-synthesized. The user’s next click—if it happens—may go to a smaller set of “decision” pages, or none at all.

Implication: traffic may concentrate on fewer late-stage clicks while early-stage exploration happens inside Google.

The “what’s the consensus?” query

For health, finance, and other YMYL topics, Google says it applies higher standards and may limit AI Overviews in sensitive areas. Even so, users increasingly look for consensus summaries.

Implication: authoritative publishers may still be cited, but they may receive fewer sessions. The reputational value of being cited rises; the business value becomes harder to capture.

What readers should watch next: the politics of measurement

The next fight is not only about traffic. It’s about measurement and bargaining power.

Publishers need to know:

- When their work is used to construct AI answers
- How often they are cited
- Whether those citations lead to meaningful downstream behavior

Google, meanwhile, controls the interface where most of the action now happens. If engagement shifts into Google’s own products, the company becomes the primary auditor of what “worked.”

Pew’s data already hints at the outcome: AI summaries appear frequently (58% of users encountered at least one in March 2025), and when they do, clicks fall (8% vs. 15%). Ahrefs adds another warning: even the top result can suffer a ~34.5% CTR drop when an AI Overview is present.

Readers should not romanticize the old web. Search has always been a gatekeeper business. Yet AI Overviews raise the stakes because they move the gatekeeper from “chooser of links” to “author of answers.”

The open web can survive that shift, but it will not survive it by pretending nothing changed.

The closing question: if fewer people visit, who pays to create what AI summarizes?

Google’s AI Overviews are framed as a tool for users: faster understanding, fewer searches, a more coherent path through complexity. The observed behavior suggests the tool works—at least in the narrow sense that people click less when the summary is present.

The harder issue is economic, not technical. Journalism, reviews, explainers, and research cost money. The web’s previous bargain funded that work through visits. AI Overviews make visits less necessary.

The question for the next few years is not whether AI summaries are “good” or “bad.” The question is whether the businesses that produce reliable information can still capture enough value to keep producing it—when the first reading happens somewhere else.

If the new unit of value is presence inside answers, publishers will adapt. They’ll chase citations, build direct audiences, and design reporting that cannot be flattened into a paragraph. But adaptation is not the same as consent. The bargain is being rewritten. The only uncertainty is who gets to negotiate.

About the Author
TheMurrow Editorial is a writer for TheMurrow covering trends.

Frequently Asked Questions

What are Google AI Overviews?

AI Overviews are AI-generated summaries that appear in Google Search results, often above traditional links. Google says they synthesize answers—especially for complex queries—and provide links to supporting sources for deeper reading, using a customized Gemini model alongside traditional search systems.

Do AI Overviews reduce clicks to publisher websites?

Observed browsing research suggests they can. Pew Research Center reported that when an AI-generated summary appeared, users clicked a traditional result 8% of the time, compared with 15% when no summary appeared (based on March 2025 browsing data from 900 U.S. adults).

If my site ranks #1, am I safe from traffic loss?

Not necessarily. An Ahrefs analysis of 300,000 keywords (April 2025) found AI Overviews were associated with a ~34.5% lower CTR for the top-ranking page versus similar informational keywords without AI Overviews. Ranking still matters, but it may be less predictive of visits.

Do people click the source links inside AI Overviews?

Evidence suggests source-link clicks are rare. An eMarketer write-up citing Pew-related coverage highlights a 1% click rate on source links within summaries. Even allowing for uncertainty in secondary reporting, the pattern points toward more “on-Google” consumption.

Why does Google say the clicks are “higher quality”?

Google argues that when users do click from AI Overviews, they may spend more time on the destination site—implying stronger intent. Publishers often dispute whether that definition of “quality” matches business realities like revenue, subscriptions, and predictable traffic volume.

What can publishers do right now?

Publishers can focus on what summaries struggle to replace: original reporting, distinctive analysis, and trust. They can also invest in direct channels (newsletters, apps, memberships) and monitor visibility in AI Overviews for priority queries—because presence inside the answer may matter even when the click never comes.
