Brussels – Concerns are mounting across Europe regarding the addictive nature of TikTok and its potential harm to users, particularly young people. The European Commission has issued preliminary findings indicating that TikTok’s design actively encourages compulsive use, violating the Digital Services Act (DSA), a landmark EU law designed to create a safer digital space. This comes as reports surface detailing a troubling trend within TikTok “Lives” – the deliberate humiliation of users, often fueled by virtual gifting and gamification, creating a volatile and potentially damaging online environment.
The DSA, enacted to regulate online platforms and protect fundamental rights, is now at the center of a push to hold TikTok accountable. Amnesty International is urging the Commission to urgently enforce the law, arguing that TikTok has, for years, prioritized capturing children’s attention “at all costs.” Lisa Dittmer, Amnesty International Researcher on Children and Young People’s Digital Rights, stated that the Commission must demonstrate its willingness and capability to stop this abuse and provide a safer online environment for all.
TikTok’s Addictive Design Under Scrutiny
The European Commission’s assessment points to specific design features within TikTok as being particularly problematic. These include the constant stream of novel content, the “infinite scroll” function, autoplay, push notifications, and a highly personalized recommendation system. These elements, the Commission argues, work in concert to keep users scrolling for extended periods, effectively shifting their brains into “autopilot mode” and diminishing their control over their own engagement with the platform. Radio France reported that the EU is demanding changes to TikTok’s interface to address these addictive qualities.
TikTok’s recommender system, driven by an algorithm that analyzes user interactions – likes, follows, watch time – is a key component of this design. The Commission believes this system actively encourages compulsive behavior. This concern is amplified by recent reporting from Le Monde, which details how humiliation has become a disturbing trend within TikTok Lives, fueled by the platform’s mechanics.
The Rise of Humiliation in TikTok Lives
According to Le Monde, TikTok Lives have become a breeding ground for humiliation, where users are subjected to degrading treatment, often encouraged by viewers through virtual gifts. These gifts, purchased with real money, are sent to streamers and can be used to trigger specific actions, including public shaming or ridicule of other users. The report highlights how this dynamic creates a toxic environment where humiliation is not only tolerated but actively incentivized.
The gamification of TikTok Lives, coupled with the pressure to maintain viewership and receive gifts, exacerbates the problem. Streamers may feel compelled to engage in increasingly provocative or harmful behavior to attract attention and retain their audience. This creates a cycle of escalating humiliation, with potentially devastating consequences for those targeted.
Virtual Gifts and Social Pressure
The economic incentives within TikTok Lives play a significant role in this dynamic. Boursorama details how these virtual gifts create a sense of social hierarchy and pressure within the platform. Users who receive more gifts are often granted preferential treatment, while those who receive fewer may be subjected to ridicule or exclusion. This dynamic can be particularly harmful to vulnerable individuals, who may be more susceptible to manipulation and abuse.
EU Action and Future Steps
The European Commission’s preliminary findings are a significant step towards holding TikTok accountable for its potentially harmful design. The Commission is expected to publish its full assessment in the coming weeks, outlining specific measures that TikTok must take to address these concerns. Amnesty International is calling for swift and decisive action, emphasizing the need to protect children and young people from the risks associated with the platform. The Commission’s investigation was initiated following a report on February 9, 2026, highlighting the growing concerns surrounding TikTok’s impact on user wellbeing.
The ongoing scrutiny from European regulators underscores the increasing pressure on social media companies to prioritize user safety and address the addictive nature of their platforms. The outcome of the Commission’s investigation will likely have far-reaching implications for TikTok and the broader social media landscape, and all eyes now turn to the publication of the full assessment and the compliance measures it will impose.
What do you think about the potential regulation of TikTok? Share your thoughts in the comments below.
