TikTok Faces $29M Fine In The UK For Failing To Protect Children’s Privacy


Pointers at a Glance

  • TikTok is facing a £27 million ($29 million) fine after the UK’s Information Commissioner’s Office (ICO) provisionally found that the company breached child data protection laws for two years.
  • The alleged law breach occurred from May 2018 through July 2020, with the ICO noting that the company may have processed data of children under the age of 13 without parental consent.

The UK’s ICO said the company may have failed to provide proper information to its users in a concise, transparent and easily understood way, and may have processed special category data without legal grounds.

Special category data refers to sensitive personal data such as sexual orientation, religious beliefs, ethnic and racial origin, political opinions, and genetic and biometric data.

The ByteDance-owned video social network has come under growing scrutiny over its data privacy practices. The US Federal Trade Commission (FTC) fined ByteDance $5.7 million back in 2019 for violating the Children’s Online Privacy Protection Act (COPPA).

More recently, TikTok was forced to halt a planned privacy policy change in Europe under which it would have stopped asking users for consent to targeted advertising. Sandwiched in between all that, a UK High Court judge recently greenlighted a class-action lawsuit against TikTok over its handling of children’s data, initially filed on behalf of a 12-year-old back in 2020.

In response to increasing concerns over its data privacy practices, TikTok has tried to appease regulators. In 2019, it began restricting virtual gifting to users over 18 and later opened a “trust and safety hub” in Europe. TikTok has also disabled direct messaging for under-16s and introduced features like family safety mode and screen time management.

Today’s revelation stems from an investigation the UK’s ICO initiated in 2019, when the regulatory body revealed that it would be looking into how TikTok collects private data. The investigation sought to determine whether the company’s practices constitute a breach of the General Data Protection Regulation (GDPR), which requires companies to implement robust measures to protect underage users, including addressing how the platform allows children to interact with adults.
