TikTok: EU Opens Formal Proceedings Over Content Rule Breaches

By Mark Thompson, Business Editor

Brussels, February 28, 2024 — TikTok is facing formal proceedings from the European Union over concerns that it is not doing enough to protect users, particularly children, from harmful content. The EU alleges that the social media giant violated the Digital Services Act (DSA), a landmark law designed to regulate online platforms.

EU Cracks Down on TikTok’s Content Moderation

The European Commission accuses TikTok of failing to adequately safeguard users against illegal and inappropriate material.

  • The European Commission launched formal proceedings against TikTok on February 27, 2024.
  • Concerns center on TikTok’s content moderation systems and their effectiveness in protecting users.
  • The DSA requires large online platforms to take greater responsibility for the content hosted on their services.
  • TikTok could face substantial fines—up to 6% of its global turnover—if found in violation.
  • This action signals a more assertive approach by the EU in regulating Big Tech.

Is TikTok a safe space for younger users? That’s the question at the heart of this EU investigation. The Commission believes TikTok hasn’t done enough to ensure a secure online environment, particularly for minors, and is now taking formal action.

What Are the Specific Allegations?

The EU’s concerns revolve around several key areas. According to a statement released on February 27, 2024, the Commission believes TikTok failed to comply with DSA requirements related to risk assessments, content moderation, and transparency. Specifically, the EU is questioning the effectiveness of TikTok’s algorithms in recommending content to users, and whether these algorithms adequately protect younger audiences from harmful material.

The Digital Services Act, which became fully applicable on February 17, 2024, imposes strict obligations on very large online platforms (VLOPs) like TikTok, including annual risk assessments and independent audits.

The Commission also raised concerns about TikTok’s transparency regarding advertising and the traceability of sponsored content. The DSA mandates clear labeling of advertisements and requires platforms to give users information about why they are being shown specific content.

TikTok’s Response

TikTok has acknowledged the Commission’s concerns and stated it is cooperating with the investigation. A spokesperson for the company said TikTok is committed to protecting its users and is actively working to address the issues raised by the EU. However, the company maintains that it has already implemented significant measures to enhance safety on its platform.

The formal proceedings initiated by the EU mark a significant escalation in regulatory scrutiny of TikTok. If found in violation of the DSA, TikTok could face fines of up to 6% of its global turnover, a potentially substantial penalty for the hugely popular platform.

What’s Next?

The Commission will now conduct a thorough investigation into TikTok’s practices. TikTok will have an opportunity to respond to the allegations and present evidence to demonstrate its compliance with the DSA. The investigation could take several months to complete, and the outcome will likely set a precedent for how other large online platforms are regulated in the EU.

This case underscores the growing pressure on social media companies to prioritize user safety and address the potential harms associated with their platforms. The EU’s assertive approach signals a willingness to enforce the DSA and hold tech giants accountable for their actions.

What are your thoughts on TikTok’s content moderation policies? Share your opinions in the comments below.
