When “Private” is a Marketing Claim Instead of System Design

The Privacy Promise Gap and The BetterHelp FTC Settlement


Millions of users. Sensitive mental health data. Shared with advertisers.

No breach. No hack. No ransomware.

Just a product decision—and a regulatory failure.

In 2023, the U.S. Federal Trade Commission (FTC) ordered BetterHelp to pay $7.8 million (USD) for sharing users’ sensitive health information with third parties like Facebook and Snapchat for advertising purposes. Although BetterHelp marketed itself as a mental health platform that respected user privacy and confidentiality laws, its data protection and data processing practices told a different story.

The Privacy Breakdown

Users came to BetterHelp seeking therapy, entrusting the platform with deeply sensitive information, including mental health conditions, personal struggles, email addresses, and other identifiers.

Behind the scenes, this data was shared with third-party advertising and social media platforms for ad targeting, analytics, and user acquisition optimization.

This happened via tracking pixels and embedded software development kits (SDKs), technical mechanisms that quietly transmitted information from the system to third-party advertising platforms. This wasn’t just data sharing. It was the Privacy Promise Gap: the distance between what users were told and how the system actually behaved.
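To make the mechanism concrete, here is a minimal, hypothetical sketch of the failure mode: a backend event helper that forwards user properties to an ad platform without distinguishing sensitive fields. All names (`build_ad_event`, the field names) are illustrative assumptions, not BetterHelp’s actual code.

```python
def build_ad_event(user: dict) -> dict:
    """Build the payload a tracking pixel or SDK would send to an ad network."""
    # Nothing here distinguishes sensitive health fields from ordinary
    # identifiers, so intake-form answers leak alongside the email address.
    return {
        "event": "signup",
        "email": user["email"],                   # routine identifier
        "answers": user["intake_questionnaire"],  # sensitive health data
    }

payload = build_ad_event({
    "email": "user@example.com",
    "intake_questionnaire": {"sought_therapy_before": True},
})
```

The point is structural: when the event builder has no concept of sensitivity, “sharing analytics” and “sharing health data” become the same operation.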

This Wasn’t Just Marketing. It Was Misaligned.

Under U.S. law, particularly Section 5 of the Federal Trade Commission Act (15 U.S.C. § 45), companies cannot engage in deceptive or unfair practices. In this context, BetterHelp’s public messaging emphasized confidentiality, privacy, and secure therapy.

At the same time:

  • User email addresses and other identifiers were shared with advertising platforms such as Facebook and Snapchat.
  • Tracking pixels and embedded SDKs transmitted this information as a routine part of the platform’s marketing operations.

This is not a disclosure issue. It is Trust Misalignment at the system level.

Where BetterHelp Went Wrong

Mistake #1: Designing for Data Sharing While Promising Privacy

BetterHelp positioned itself as a confidential therapy service. However, its architecture told a different story. The platform included third-party trackers, behavioral analytics tools, and marketing integrations.

When a system is built for data sharing, labeling it “private” creates Confidentiality Theater: an appearance of confidentiality and privacy without meaningful enforcement.

Mistake #2: Treating Sensitive Data Like Standard Data

Mental health data is not ordinary personal data; its sensitivity warrants heightened protection. In this case, BetterHelp failed to:

  • Obtain users’ affirmative express consent before disclosing their health information
  • Limit how third parties could use the data they received
  • Treat health-related fields differently from routine analytics identifiers

As a result, highly sensitive mental health data was processed as if it were standard analytics input.

Mistake #3: Relying on Disclosure Instead of Alignment

Although BetterHelp disclosed elements of its data practices, FTC standards require that disclosures be clear, prominent, and contextual.

A privacy policy cannot override user expectations shaped by product design. Where system behavior contradicts the stated assurances, those assurances are legally ineffective.

The Real Issue: Trust as Infrastructure

BetterHelp wasn’t just handling data; it was handling trust.

Users shared information under an assumption of confidentiality.

The system processed that information as marketing input.

That gap between expectation and system behavior is what triggered regulatory enforcement.

How to Fix This (If You’re Building Similar Systems)

Fix #1: Eliminate the Privacy Promise Gap

If a service claims to protect user privacy, its technical and organizational design must support that claim. In other words, a service must close the Privacy Promise Gap. In BetterHelp’s case, this requires concrete changes, including the following measures:

  • Remove third-party advertising trackers and SDKs from any flow that touches health information.
  • Obtain affirmative express consent before any data leaves the system for marketing purposes.
  • Audit outbound data flows regularly so that actual system behavior matches published privacy commitments.

The result is a system whose behavior is aligned with the expectations of users who were promised confidentiality.
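These measures can be sketched as a consent-and-sensitivity gate on outbound events. This is a minimal illustration under assumed names (`release_to_marketing`, `SENSITIVE_FIELDS`), not a production design.

```python
# Fields that must never reach marketing destinations (illustrative list).
SENSITIVE_FIELDS = {"diagnosis", "intake_questionnaire", "therapist_notes"}

def release_to_marketing(payload: dict, user_consented: bool) -> bool:
    """Allow an event out for advertising purposes only when the user gave
    affirmative consent AND the payload contains no sensitive fields."""
    if not user_consented:
        return False  # no consent, nothing leaves the system
    # dict.keys() is a set-like view, so set intersection works directly.
    return not (SENSITIVE_FIELDS & payload.keys())
```

Because the gate sits on the outbound path itself, a privacy promise made in the UI is enforced by the system rather than merely documented.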

Fix #2: Segment and Protect Sensitive Data

Data that is particularly sensitive requires greater privacy protections. To do this, services should create and maintain technical separation for categories of high-risk data, including mental health, biometric, and financial data.

Enforce appropriate safeguards through the implementation of:

  • Access controls restricting which people and systems can read high-risk records
  • Encryption of sensitive data at rest and in transit
  • Data-flow rules that block sensitive fields from reaching analytics and advertising pipelines

The result is that sensitive data is treated in accordance with its heightened risk profile.
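One way to implement that separation is field-level classification that partitions each record before storage or processing. A sketch, assuming a hard-coded classification map (a real system would drive this from a data catalog):

```python
from enum import Enum

class Sensitivity(Enum):
    STANDARD = "standard"
    HIGH_RISK = "high_risk"  # e.g. mental health, biometric, financial data

# Hypothetical field-level classification map (illustrative field names).
FIELD_CLASSIFICATION = {
    "email": Sensitivity.STANDARD,
    "page_viewed": Sensitivity.STANDARD,
    "mental_health_history": Sensitivity.HIGH_RISK,
    "payment_card": Sensitivity.HIGH_RISK,
}

def split_record(record: dict) -> tuple[dict, dict]:
    """Partition a record into standard and high-risk buckets so each can
    live under different storage, access, and processing safeguards."""
    standard, high_risk = {}, {}
    for field, value in record.items():
        # Unknown fields default to HIGH_RISK: the system fails closed.
        bucket = FIELD_CLASSIFICATION.get(field, Sensitivity.HIGH_RISK)
        (high_risk if bucket is Sensitivity.HIGH_RISK else standard)[field] = value
    return standard, high_risk
```

Defaulting unknown fields to high-risk is the important design choice: a newly added field cannot silently flow into the lower-protection path.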

Fix #3: Replace Disclosure with Alignment

User interfaces must accurately reflect system reality by:

  • Stating plainly, at the point of collection, what data is shared and with whom
  • Requesting consent before any sharing occurs, not after
  • Removing confidentiality claims the system cannot technically enforce

Privacy cannot rely on documentation alone. It must be enforced through system design.
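One way to enforce that alignment is to generate the point-of-collection disclosure from the system’s actual outbound destinations, so the UI text cannot drift from real behavior. A hypothetical sketch (destination names and structure are invented for illustration):

```python
# Single source of truth for where data actually goes (illustrative).
OUTBOUND_DESTINATIONS = {
    "analytics": ["internal-metrics"],
    "advertising": [],  # empty once ad trackers are removed
}

def disclosure_text() -> str:
    """Derive the user-facing disclosure from real outbound destinations,
    so the statement is true by construction."""
    ad = OUTBOUND_DESTINATIONS["advertising"]
    if not ad:
        return "Your data is never shared with advertising platforms."
    return "Your data is shared for advertising with: " + ", ".join(ad)
```

If an engineer later wires in an ad destination, the disclosure changes with it; the documentation and the system cannot contradict each other.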

The Principle That Matters

The BetterHelp case reinforces a core principle in U.S. privacy enforcement: Your promises are enforceable.

Three hard lessons emerge:

  • Privacy claims are enforceable promises, not marketing copy.
  • Sensitive data demands protection proportionate to its risk.
  • Disclosure cannot substitute for alignment between what is promised and what the system does.

You can write a privacy policy. You cannot engineer trust after you’ve broken it.

About the Author

Priya Balakrishnan is a privacy and Governance, Risk, and Compliance (GRC) leader with deep expertise in GDPR, U.S. state privacy laws, and global data protection regulations. She designs and leads scalable compliance frameworks aligned with SOC 2, ISO 27001, NIST, and other leading standards, integrating privacy, security, and business strategy to build resilient, audit-ready organizations. Holding CISA, CISM, CIPM, and CDPSE certifications, Priya brings a strategic, forward-looking approach to governance and AI-era risk management. She currently works as the AI Governance, GRC, and Privacy Manager at ExtraHop. She is also the creator of Privacy Byte-Size, where she translates complex privacy and data protection issues into clear, actionable insights for professionals and consumers worldwide.


Study the Smart Way With Privacy Bootcamp

  • Comprehensive, all-in-one training source
  • Pass on your first attempt — or your money back*
  • Gain real exam experience with a live testing environment