Preliminary conclusions: TikTok is in breach of the EU's Digital Services Act
Published Sunday 8 February 2026 at 22:09

#DigitalEU
According to the Commission's preliminary findings, TikTok is in breach of the Digital Services Act due to its addictive design, including endless scrolling, autoplay, push notifications, and a highly personalized recommendation system. The investigation, opened in February 2024, was the first launched into TikTok under the DSA.
TikTok has not adequately assessed how these addictive features could harm the physical and mental well-being of users, including minors and vulnerable adults. By constantly "rewarding" users with new content, certain design features of TikTok fuel the desire to keep scrolling. Scientific research shows that this can lead to compulsive behavior and reduce users' self-control.
TikTok ignores important indicators of compulsive use, such as the amount of time minors spend on TikTok at night and the frequency with which users open the app, and appears to have failed to implement reasonable, proportionate, and effective measures to mitigate the risks arising from its addictive design. The Commission found that TikTok's time-management tools were "easy to dismiss," including for young users, while parental controls required "additional time and skills from parents to introduce" them.
At this stage, the Commission considers that TikTok should change the fundamental design of its service, for example by deactivating key addictive features such as "endless scrolling," implementing effective "breaks," including at night, and adapting the recommendation system.
These preliminary findings do not prejudge the outcome of the investigation. If they are confirmed, the Commission can impose a fine of up to six percent of the company's annual turnover. TikTok has vowed to challenge the findings "through every means available."
At the end of 2025, BROD presented its in-house monitoring framework for assessing the compliance of Very Large Online Platforms (VLOPs) like TikTok and Very Large Online Search Engines (VLOSEs) with the EU's strengthened Code of Conduct on Disinformation. The first report, with the platforms' scores and evidence-based conclusions, can be found here.
Meanwhile, the lower chamber of the French parliament approved legislation that would effectively ban social media access for children under 15, with age-verification and platform compliance provisions pending Senate approval. The Spanish government announced plans to ban social media access for those under 16, requiring mandatory age verification, and Slovenian government officials stated that a law is being drafted to prohibit social media use by children under 15. The prime minister of the Czech Republic publicly expressed support for banning social media for children under 15, with potential legislation under consideration within the cabinet. Both Greece and Denmark signaled plans for similar measures to restrict access for under-15s or under-16s.
Last year, the European Parliament adopted a non-binding resolution recommending a minimum age of 16 for social media access (with limited exceptions) as part of wider digital safety rules.
