TikTok faces mass child safety claims in Italy

Updated as of: 26 November 2025

A Dutch NGO has launched collective action proceedings against TikTok in Italy over alleged violations of data protection and other digital regulations.


SOMI yesterday said it has sued TikTok on behalf of children using the platform in Italy, claiming it fails to protect underage users’ personal data and has not implemented effective safeguards to counter harmful content.

The NGO said it is targeting alleged violations of the GDPR, Digital Services Act (DSA) and AI Act, and seeks to protect children from “unlawful exploitation of their data for commercial purposes” and obtain compensation for people who have used the app since 25 June 2023.

The NGO currently has six other collective proceedings open against TikTok, Meta and X in the Netherlands, Belgium, Denmark and Germany.

SOMI’s open proceedings against big tech

  • Meta (Facebook and Instagram) – Collective claims in the Netherlands, Germany, Denmark and Belgium over alleged unlawful tracking and profiling, weak consent, and manipulative, addictive design features.

  • X – Actions in Germany, Belgium and the Netherlands challenging data use, manipulative practices and a lack of transparency around algorithms and ads.

  • TikTok – Proceedings in countries including the Netherlands and Germany focus on extensive data collection and the monetisation of children’s and young people’s data.

SOMI’s DSA case alleges TikTok serves profiling-based advertisements to 13 to 17-year-olds, uses an opaque and addictive personalised feed, and fails to assess, mitigate and properly moderate systemic risks to children.

TikTok has also allegedly breached the GDPR by processing children’s data without valid or age-appropriate consent, allowing children under the age of 13 to bypass age verification measures, and conducting algorithmic profiling to infer sensitive psychological traits and behavioural information. SOMI also said child accounts violate privacy-by-design requirements because high-protection settings are not enabled by default, while tracking and personalisation are automatically active.

The NGO also claims TikTok breached the AI Act by using a recommender system that relies on addictive techniques and by providing inadequate risk management and oversight of its allegedly opaque algorithms.

The claim follows two enforcement actions against TikTok by Italian regulators over ineffective child safety measures. In January 2021 the Italian data protection authority, known as the Garante, ordered TikTok to stop processing data from users whose ages could not be verified, after a 10-year-old girl’s death was linked to her participation in an asphyxiation challenge on the platform; the order remained in place for nearly a month. As a result, TikTok deleted more than 500,000 accounts belonging to users under 13.

Italy’s Competition and Market Authority in March 2024 fined TikTok €10 million, the maximum amount permitted under consumer law, for failing to protect children and vulnerable users from harmful content on the platform. 

TikTok did not respond to a request for comment.