When Product Defaults Became a Legal Failure

In September 2023, the Irish Data Protection Commission fined TikTok €345 million. There was no cyberattack. No data breach. No ransomware. Instead, the violation centered on how children’s accounts were configured by default.

Under GDPR, default settings are not product preferences. They are legal obligations.

The Privacy Breach

The Irish Data Protection Commission (DPC) investigated TikTok’s processing of personal data belonging to users aged 13–17 between 31 July and 31 December 2020. On 15 September 2023, the DPC issued its final decision, concluding that TikTok had infringed several provisions of the General Data Protection Regulation (GDPR), including Articles 12, 13, and 25.

During the period under review, accounts created by minors were set to public by default. As a result, content posted by teenagers could be viewed by anyone unless privacy settings were manually changed, a configuration the DPC found incompatible with the GDPR’s data-protection-by-default obligation. The regulator also found deficiencies in how transparency information was presented to children, particularly regarding visibility settings and data processing practices.

Although TikTok subsequently changed many of these configurations, GDPR enforcement assesses compliance during the relevant period. Later remediation does not eliminate liability for past violations.

What Went Wrong

This enforcement action was not about malicious intent; it was about system design and regulatory accountability.

Article 25(2) GDPR requires controllers to ensure that, by default, only personal data necessary for each specific purpose is processed. For children, this obligation carries heightened importance. Public-by-default settings required minors to take active steps to protect their privacy, rather than protecting them automatically.

Transparency—or a lack thereof—was also central to this case. Articles 12 and 13 of the GDPR require that information be provided in a concise, transparent, intelligible, and easily accessible form, using clear and plain language. This is particularly true where children are involved. The DPC concluded that the information provided did not sufficiently meet that standard.

The deeper structural issue was that product configuration decisions were treated as user experience choices rather than compliance controls. Under the GDPR, those defaults are legally consequential.

How Can These GDPR Compliance Failures Be Rectified?

Technical Fix #1: Enforce Private-by-Default Controls at the Infrastructure Level

The first corrective measure is architectural enforcement.

Accounts identified as belonging to minors should be configured as private by default, with restricted public discovery, messaging, and commenting capabilities. These controls should be embedded directly into account creation workflows and enforced through role-based configuration rules.

Audit logs should confirm that privacy-protective defaults are consistently applied. Compliance cannot depend on user action. It must be enforced by system design.

This approach operationalizes Article 25’s requirement for data protection by default.
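As a minimal sketch of what that enforcement could look like (the service, field names, and age threshold below are illustrative assumptions, not TikTok’s actual implementation), an account-provisioning function might apply protective defaults and write an audit record in a single step:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
import json
import logging

logger = logging.getLogger("privacy_defaults")

# Hypothetical threshold: Article 8 GDPR lets member states set the age of
# digital consent between 13 and 16; here, any user under 18 gets
# privacy-protective defaults.
MINOR_AGE_THRESHOLD = 18

@dataclass
class PrivacySettings:
    profile_public: bool
    discoverable_in_search: bool
    direct_messages_enabled: bool
    comments_allowed_from: str  # "everyone" | "friends" | "no_one"

def default_settings_for(age: int) -> PrivacySettings:
    """Return defaults by age; minors are private by default."""
    if age < MINOR_AGE_THRESHOLD:
        return PrivacySettings(
            profile_public=False,
            discoverable_in_search=False,
            direct_messages_enabled=False,
            comments_allowed_from="friends",
        )
    # Adult defaults shown as public here only for contrast.
    return PrivacySettings(
        profile_public=True,
        discoverable_in_search=True,
        direct_messages_enabled=True,
        comments_allowed_from="everyone",
    )

def provision_account(user_id: str, age: int) -> PrivacySettings:
    """Apply defaults at account creation and emit an audit record."""
    settings = default_settings_for(age)
    # Audit log entry proving protective defaults were applied by the
    # system, not opted into by the user.
    logger.info(json.dumps({
        "event": "privacy_defaults_applied",
        "user_id": user_id,
        "is_minor": age < MINOR_AGE_THRESHOLD,
        "settings": settings.__dict__,
        "applied_at": datetime.now(timezone.utc).isoformat(),
    }))
    return settings
```

Because the defaults are computed server-side at creation time, a minor’s account is never public unless an explicit, logged change is made afterward.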

Technical Fix #2: Integrate Article 25 Review into Product Governance

The second measure is procedural integration.

Any feature affecting visibility, sharing, or audience reach should trigger a mandatory privacy architecture review before deployment. Child-specific impact assessments should be conducted for features affecting minors. Automated validation testing should confirm that live system configurations align with declared privacy defaults.

Product release pipelines should include a privacy approval checkpoint. Deployment should not proceed unless default protections meet regulatory requirements.
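As one possible shape for that checkpoint (a hedged sketch in pytest style; the staging endpoint and response fields are hypothetical), a CI job can query the defaults a new minor account would receive and fail the build on any drift from declared policy:

```python
# ci/test_privacy_defaults.py — illustrative pipeline gate (pytest style).
# The /internal/defaults endpoint and its response shape are assumptions.
import requests

DECLARED_MINOR_DEFAULTS = {
    "profile_public": False,
    "discoverable_in_search": False,
    "direct_messages_enabled": False,
}

def fetch_live_defaults(age: int) -> dict:
    """Ask the staging environment what defaults a new account would get."""
    resp = requests.get(
        "https://staging.example.com/internal/defaults",
        params={"age": age},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

def test_minor_accounts_are_private_by_default():
    live = fetch_live_defaults(age=15)
    for setting, declared_value in DECLARED_MINOR_DEFAULTS.items():
        # Any drift between declared policy and live behavior blocks release.
        assert live.get(setting) == declared_value, (
            f"Default '{setting}' drifted: declared {declared_value}, "
            f"live {live.get(setting)}"
        )
```

Wiring a test like this into the release pipeline turns the privacy approval checkpoint into an executable gate rather than a manual sign-off.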

Privacy by design is not documentation. It is governance embedded in engineering.

Conclusion

The €345 million fine imposed on TikTok demonstrates a broader regulatory shift. Regulators are no longer focusing solely on privacy notices. They are examining how systems behave in practice.

If a platform makes minors’ data publicly visible by default, that is not a trivial configuration detail. This decision makes clear that it is a compliance failure under Article 25 GDPR. Under modern enforcement standards, defaults define responsibility. And responsibility carries financial consequences.


This is the first installment of our Privacy Disasters series, created in collaboration with Priya Balakrishnan. In this series, we examine real-world privacy failures to draw lessons about data protection and risk management.

Priya Balakrishnan is a privacy and Governance, Risk, and Compliance (GRC) leader with deep expertise in GDPR, U.S. state privacy laws, and global data protection regulations. She designs and leads scalable compliance frameworks aligned with SOC 2, ISO 27001, NIST, and other leading standards, integrating privacy, security, and business strategy to build resilient, audit-ready organizations. Holding CISA, CISM, CIPM, and CDPSE certifications, Priya brings a strategic, forward-looking approach to governance and AI-era risk management. She currently works as the AI Governance, GRC, and Privacy Manager at ExtraHop. She is also the creator of Privacy Byte-Size, where she translates complex privacy and data protection issues into clear, actionable insights for professionals and consumers worldwide.
