TheMurrow

The Hidden Costs of Convenience

“Free” services rarely cost nothing. They’re often financed by your attention and a durable record of your behavior—time and data that compound quietly.

By TheMurrow Editorial
January 29, 2026

Key Points

  1. Recognize the real price of “free”: platforms monetize your minutes and behavioral data, making convenience a trade for attention and privacy.
  2. Spot engineered retention: infinite scroll, autoplay, and notifications turn ordinary design into an attention tax that compounds over time.
  3. Follow the regulatory shift: the FTC flags “vast surveillance” and indefinite retention, while enforcement actions target dark patterns and sensitive location data brokers.

A decade ago, “free” meant a trial period. Today it often means a lifelong meter running quietly in the background—counting not dollars, but minutes and data points.

Open a social app for a quick check. The feed keeps going. A video autoplays. A notification pulls you back later. You never hand over cash, yet you still pay. The bill arrives in two currencies modern platforms are built to extract: attention and personal information.

Regulators have begun describing this arrangement with unusual bluntness. In a September 2024 staff report, the U.S. Federal Trade Commission said large social media and video streaming companies have engaged in “vast surveillance,” and called many firms’ data minimization and retention practices “woefully inadequate.” The report highlighted the ability to retain data indefinitely and to collect information about users and non-users through tracking technologies. That is not the language of a mere lifestyle debate.

The bargain is no longer subtle. It is operational, measurable, and increasingly contested.

The ‘free’ internet isn’t free—it’s financed by your time and a detailed record of your behavior.

— TheMurrow Editorial


The “free” bargain: services funded by data and time

Most mainstream consumer platforms run on an ad-funded or hybrid subscription + ads model. The economics are straightforward: more time on the platform creates more ad inventory, and more interaction creates more behavioral signals that can be used to target and measure advertising.

Those incentives shape product decisions. Features that reduce friction—one-tap sign-ins, default-on personalization, seamless sharing—make growth and engagement easier. They can also make privacy harder. The same interface choices that feel “convenient” can function as quiet nudges toward more disclosure and more time spent.

Regulators are now describing the consequences as systemic rather than incidental. The FTC’s September 2024 staff report focused on large social media and video streaming firms and concluded that many companies’ data minimization/retention practices were “woefully inadequate.” The report also noted the potential to retain data indefinitely, as well as collection about non-users via tracking technologies and broad data sharing practices, including the use of tools like pixels. (FTC, Sept. 2024)

Platforms will argue—often credibly—that data collection also supports security, moderation, and product improvement. Fraud detection, for example, requires signals. Recommendation systems need feedback to get better. The problem is not that data is processed at all. The problem is how easily “necessary” processing bleeds into surveillance for behavioral advertising, and how rarely consumers get a clean, comprehensible choice.

What “paying with data” really means

The transaction rarely looks like a transaction. Instead, it appears as:

- Tracking across apps and sites to infer interests and intent
- Long retention periods that outlast the purpose users assume
- Sharing or sourcing data through third parties, including data brokers
- Consent flows designed to make acceptance the path of least resistance

The result is a marketplace where “free” functions less like a price and more like a business strategy.
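The cross-site tracking in the first bullet is worth making concrete. The sketch below shows, in rough mechanical terms, how an embedded third-party pixel can log visits by people who have no account anywhere. The host name, parameter names, and `pixel_url` helper are all hypothetical; real pixels differ in detail, but the basic shape of the request is the same.

```python
from urllib.parse import urlencode

def pixel_url(tracker_host: str, page_url: str, device_id: str) -> str:
    """Build the 1x1-image request a third-party tracking pixel fires.

    On every page load, the embedding page's URL and a device/cookie
    identifier travel to the tracker's server, whether or not the
    visitor has an account with the tracking company.
    """
    params = urlencode({"page": page_url, "did": device_id, "event": "pageview"})
    return f"https://{tracker_host}/px.gif?{params}"

# Two visits by the same browser to unrelated sites become linkable
# server-side through the shared device identifier.
hit1 = pixel_url("tracker.example", "https://news.example/article", "dev-42")
hit2 = pixel_url("tracker.example", "https://shop.example/cart", "dev-42")
```

The notable design fact is that the site owner, not the visitor, chooses to embed the pixel, which is why non-users end up in the dataset.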


The attention tax: time engineered, not merely spent

People talk about “screen time” as if it were a personal failing, a simple matter of discipline. That framing is convenient for companies whose revenue rises when people stay longer.

Measuring time spent is messy. “Screen time” varies depending on whether researchers include TV, count multitasking screens, or rely on self-reporting versus device-level measurement. Many widely circulated numbers are directionally useful but not primary evidence. Still, the policy conversation is shifting because governments increasingly treat manipulative design as a consumer protection issue, not just a wellness issue.

In October 2024, the OECD reported that “nine out of ten consumers” surveyed had been affected by “dark commercial patterns.” The survey included 35,000+ respondents across 20 countries—large enough to take seriously as a signal of broad exposure. (OECD, Oct. 9, 2024)

“Dark patterns” is a term that can sound academic. The experience is not. It is the countdown timer implying urgency, the subscription flow that is simple to start and irritating to end, the hidden fee revealed at the last step, the language that makes opting out feel like a mistake.

When a product is designed to be hard to leave, your attention stops being a choice and starts being a resource.

— TheMurrow Editorial

The convenience moments that hide the time cost

Engineered engagement often shows up in ordinary moments:

- Infinite scroll that removes natural stopping points
- Autoplay that turns a single clip into a session
- Push notifications tuned to trigger return visits
- Frictionless renewals that make ongoing payment feel like inertia

None of these tactics is inherently illegal. Many are defensible as usability improvements. The concern is cumulative: each small convenience can also be a small lever. Over time, the platform becomes less like a tool you use and more like a place you inhabit.


Manipulative design becomes a regulatory issue

Design choices used to be treated as taste: the color of a button, the placement of a link. Regulators are increasingly treating design as a form of commercial conduct—because design can steer decisions.

A concrete example arrived in September 2024, when the UK Advertising Standards Authority (ASA) banned ads by Nike and Sky (Now TV), citing misleading design and choice architecture, including obscured terms around auto-renewal. (The Guardian, Sept. 2024) The point is not that these companies are uniquely bad actors. The point is that a major regulator publicly treated interface tactics as potentially deceptive.

The OECD’s “nine out of ten consumers” finding helps explain why. Dark patterns are no longer edge cases. They have become common enough to feel like the default posture of online commerce: make the “yes” button loud, the “no” button quiet, and the costs legible only after commitment.

The ethical question hiding in plain sight

Some manipulation is blatant. Much of it is subtle: a pre-checked box, a confusing menu, a cancellation step moved three layers deep. Those tactics exploit predictable human behavior—impatience, trust in defaults, aversion to hassle.

Platforms and retailers respond that consumers still make the final choice. Critics counter that choice architecture is part of the choice. If a company invests heavily in making one outcome easy and the other outcome exhausting, “consent” becomes harder to defend as truly informed.

That debate is moving from op-ed pages into enforcement actions, which suggests a deeper shift: governments are not only asking whether companies lie, but whether companies nudge people into agreements they wouldn’t otherwise make.

Key Insight

Regulators are shifting from “Did the company lie?” to “Did the interface steer people into choices they wouldn’t otherwise make?”

“Vast surveillance”: what the FTC says companies collect and keep

The most striking language in the FTC’s September 2024 staff report is not technical—it is moral clarity. The agency described “vast surveillance” by large social media and video streaming firms. It criticized data minimization and retention practices as “woefully inadequate,” including the ability to retain data indefinitely. (FTC, Sept. 2024)

That report matters because it connects everyday product experiences—feeds, shares, likes—to the less visible machinery underneath: broad collection, extensive sharing, and tracking technologies that operate beyond the boundaries of a single app.

The staff report also noted collection about non-users. That detail deserves attention. People who never sign up for a service can still be pulled into its data ecosystem through embedded tools such as pixels and similar tracking technologies placed on third-party sites.

The debate isn’t whether platforms know you. The debate is how far that knowledge travels—and how long it lives.

— TheMurrow Editorial

Retention and deletion: the quiet battleground

Retention is where privacy becomes time-based. Data kept briefly for a specific purpose is one thing. Data retained indefinitely is another, because it increases the chances of:

- unexpected secondary uses
- sharing with additional partners
- security exposure over time
- difficulty honoring deletion requests

The FTC report also raised concerns that some companies failed to fully honor deletion requests. (FTC, Sept. 2024) That is not merely a paperwork issue. If deletion is unreliable, consumer control becomes performative.


Data brokers and sensitive location: a case study with real stakes

If the platform economy runs on surveillance, data brokers are among its most consequential intermediaries. They trade in information people rarely realize is for sale—especially location.

The FTC’s actions against X-Mode Social (later Outlogic) offer a concrete, documented example. On January 9, 2024, the FTC announced an order prohibiting the broker from selling or sharing sensitive location data. The agency described risks that included tracking visits to medical and reproductive health clinics, places of worship, and domestic abuse shelters. (FTC, Jan. 9, 2024)

On April 29, 2024, the FTC finalized the order. The requirements included deleting or destroying previously collected location data unless it was deidentified or consumers had provided consent, along with establishing a privacy program and retention schedule. (FTC, Apr. 29, 2024)

Those details matter because they move privacy talk away from abstractions. Location trails can reveal intimate life facts—health decisions, religious practice, personal safety planning—without anyone needing to read messages or hear calls.

How location data becomes identifiable

The FTC has noted that raw location data is often linked to mobile advertising IDs and can be used to infer sensitive visits and potentially match to individuals, especially when combined with other datasets. (FTC, Jan. 9, 2024)

Many consumers hear “deidentified” and assume “safe.” In practice, the risk depends on context, linkage, and how easily records can be connected back to real people. Location is unusually revealing because routines are distinctive: home at night, work by day, repeated visits to the same places. You do not need a name at first to narrow the search.
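Why routines defeat “deidentification” can be shown with a toy model. The sketch below is illustrative only, not any broker’s actual method, and the function name, grid-cell labels, and hour cutoffs are assumptions: for one advertising ID, the most common nighttime location and the most common daytime location together form a home/work signature that is often unique to a single person.

```python
from collections import Counter

def routine_anchors(pings):
    """Infer likely 'home' and 'work' grid cells for one advertising ID.

    `pings` is a list of (hour_of_day, grid_cell) tuples, where a grid
    cell stands in for coarsened latitude/longitude. No name is needed:
    the most common nighttime cell and daytime cell narrow the search
    to one household/workplace pair.
    """
    night = Counter(cell for hour, cell in pings if hour >= 21 or hour < 6)
    day = Counter(cell for hour, cell in pings if 9 <= hour < 17)
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work

pings = [(23, "cell-A"), (2, "cell-A"), (5, "cell-A"),
         (10, "cell-B"), (14, "cell-B"), (12, "cell-B"), (19, "cell-C")]
home, work = routine_anchors(pings)  # ("cell-A", "cell-B")
```

Joining that signature with any dataset that maps addresses to people (voter rolls, property records) is what turns “anonymous” pings into a named individual.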

The attention–privacy feedback loop: why the system reinforces itself

Treat attention and privacy as separate issues and the platform economy seems messy but manageable. Treat them as linked and a clearer picture emerges: the attention–privacy feedback loop.

More time on a platform generates more behavioral signals: clicks, dwell time, watch duration, shares, the contours of a social graph. Those signals make recommendations and ad targeting more profitable. Higher profit per minute increases pressure to keep people engaged longer, which generates still more data. The cycle repeats.

This dynamic is not inherently villainous. Better recommendations can genuinely help users find content they enjoy. More relevant ads can be less annoying than random ones. The question is what happens when optimization targets—engagement, retention, conversion—become the dominant values, overriding concerns like user autonomy, comprehension, and restraint.

Where the loop becomes a consumer harm question

The OECD’s findings about widespread exposure to dark commercial patterns, and the FTC’s language about “vast surveillance,” point toward the same underlying worry: consumers are being treated less as customers and more as raw material.

The loop also complicates individual solutions. Telling people to “use less” ignores the industrial design effort invested in making platforms hard to leave. Telling people to “read the privacy policy” ignores the reality that the policy is often unreadable by design and, in practice, may not reflect the full ecosystem of sharing and tracking described by regulators.

Editor's Note

Individual discipline helps, but the article’s core claim is structural: business incentives reward retention and surveillance, and interface design operationalizes them.

Practical takeaways: how to reduce the real costs of “free”

No single setting restores a simpler internet. Still, readers can make targeted moves that reduce exposure—especially around location and frictionless renewals, the two areas regulators have flagged with concrete actions and language.

Privacy moves that matter

Focus on categories with outsized consequences:

- Limit location sharing: turn off location access for apps that don’t need it to function. Sensitive location harms are well-documented in the FTC’s X-Mode/Outlogic action.
- Audit ad identifiers: mobile advertising IDs can link location and behavior. Reducing their use can lower cross-app tracking potential (though the exact controls vary by device).
- Be skeptical of “delete” promises: the FTC has raised concerns about deletion requests not being fully honored by some firms. Take screenshots of confirmations and consider reducing what you share upfront.

Attention moves that matter

Target the mechanics of compulsion rather than relying on willpower:

- Disable autoplay on video platforms where possible
- Silence non-essential notifications; keep only human-to-human messages or time-sensitive alerts
- Add friction back: log out after use, remove saved payment methods for services you don’t want to renew effortlessly, and resist “one-tap” defaults for purchases

None of these steps is a moral stance. They are budget controls—for time and data.


The broader implication: what readers should watch next

Regulatory attention is moving toward design and data retention. The OECD’s survey suggests dark patterns are widespread. The FTC’s report suggests retention and collection practices remain aggressive, including indefinite retention and collection about non-users. Those are not trends that resolve themselves without oversight—or without consumers demanding better terms.

The bill comes due—whether or not you ever pay a subscription

The modern internet’s defining contradiction is that it feels personal while operating at industrial scale. Your feed looks tailored, your recommendations feel intimate, your map knows where you parked. Behind that convenience sits a system the FTC has described as “vast surveillance,” with retention practices it called “woefully inadequate.”

The question is not whether advertising belongs online. Advertising has long subsidized media. The question is what kind of advertising economy we are willing to tolerate: one that requires expansive tracking, indefinite retention, and manipulative design—or one that can fund services without turning daily life into a dataset.

The OECD’s statistic—nine out of ten consumers affected by dark commercial patterns across 20 countries—suggests the problem is not rare. The FTC’s actions against a location data broker, and its critique of major platforms, suggest regulators are no longer content to treat these harms as theoretical.

People will keep using “free” products. Few can opt out entirely. The more pressing task is to renegotiate the bargain—through clearer rules, better defaults, and consumer pressure for services that earn loyalty without extracting a private tax.
About the Author
TheMurrow Editorial is a writer for TheMurrow covering technology.

Frequently Asked Questions

What does it mean to “pay with data” if I never type in personal details?

Behavioral data often comes from what you do, not what you enter—views, clicks, dwell time, device signals, and sometimes tracking across other sites via tools like pixels. The FTC’s September 2024 staff report described broad collection and sharing practices, including collecting data about non-users through tracking technologies. “Payment” can happen passively, without a form field.

Are “dark patterns” illegal?

Some are, depending on jurisdiction and context. The OECD reported in October 2024 that nine out of ten consumers surveyed had been affected by dark commercial patterns, showing how common the tactics are. Enforcement varies, but regulators increasingly treat manipulative design as a consumer protection issue—illustrated by the UK ASA banning certain Nike and Sky ads in September 2024.

Why does location data get singled out so often?

Location can reveal sensitive facts quickly: visits to medical clinics, places of worship, or shelters. The FTC’s January 9, 2024 action against X-Mode/Outlogic explicitly cited those risks when prohibiting the sale or sharing of sensitive location data. Because routines are distinctive, location trails can become identifiable, especially when linked with other datasets.

If companies say data helps with security and fraud prevention, isn’t data collection necessary?

Some data processing can be legitimate for security, fraud prevention, and product improvement. The harder question is proportionality: how much collection, how long retention lasts, and whether data is shared onward for behavioral advertising. The FTC’s September 2024 report criticized data minimization and retention practices as “woefully inadequate,” suggesting the current balance often favors extraction over restraint.

What does “indefinite retention” mean in practice?

Indefinite retention means information can be stored without a clear endpoint, even after the original purpose fades. The FTC’s September 2024 staff report highlighted the ability of major firms to retain data indefinitely. Longer retention increases risk: more chances for secondary uses, more sharing, more exposure in the event of security failures, and more difficulty ensuring deletion requests are fully honored.

Can data be collected about me even if I don’t use a platform?

Yes. The FTC’s September 2024 staff report noted the collection of data about non-users, often via tracking technologies embedded on other sites (such as pixels). That can allow platforms to learn about browsing behavior or device activity outside the platform itself, even without an account.
