EU Investigates Snapchat Over Child Safety & DSA Violations

By Priyanka Patel, Tech Editor

Brussels is intensifying its scrutiny of social media platforms, and Snapchat is now squarely in the crosshairs. The European Commission announced Thursday it has formally opened proceedings against Snap Inc., the parent company of Snapchat, alleging violations of the Digital Services Act (DSA). The investigation centers on concerns that Snapchat isn’t doing enough to protect children and teenagers from harmful content and predatory behavior, raising questions about the platform’s age verification systems and content moderation practices.

The DSA, which came into force in February 2024, aims to create a safer digital space for users across the European Union. It places significant obligations on large online platforms to address illegal content, protect fundamental rights, and be transparent about their algorithms. This formal investigation marks a significant escalation in the EU’s enforcement of the DSA, signaling a willingness to hold major tech companies accountable for user safety. The stakes are high for Snapchat, potentially facing substantial fines – up to 6% of its global annual revenue – or even a temporary ban on operations within the EU if found in violation.

At the heart of the Commission’s concerns is Snapchat’s current age verification process. While the platform requires users to state they are at least 13 years old, the system relies solely on self-reporting. EU regulators argue this is insufficient, failing to prevent access for younger children and inadequately identifying users under 17 who are entitled to additional protections under the DSA. The Commission is also investigating whether Snapchat’s privacy settings adequately protect young users’ data and whether adults with malicious intent can easily circumvent safety measures to contact minors.

EU Investigation Focuses on Snapchat’s Safety Measures

The formal proceedings will examine five key areas of Snapchat’s operations. Beyond age verification, the Commission is scrutinizing the platform’s content moderation policies, specifically regarding the prevalence of advertisements for harmful products like drugs, e-cigarettes, and alcohol targeted at young people. Regulators are questioning whether Snapchat’s systems are robust enough to effectively identify and remove this type of content. The investigation will also assess whether Snapchat’s design choices – including its ephemeral messaging feature, where content disappears after being viewed – hinder effective content moderation.

This isn’t the first time Snapchat’s practices have come under scrutiny. The EU’s investigation builds upon a preliminary inquiry launched by Dutch authorities in September 2023. The Dutch Authority for Consumers and Markets (ACM) investigated reports of illegal sales of e-cigarettes to minors through Snapchat. The ACM continues to be involved in the EU-wide investigation, sharing data, and expertise. Germany’s Federal Network Agency (BNetzA) is also contributing information, highlighting the collaborative approach to DSA enforcement across member states.

Snapchat’s Stock Price Drops Following EU Announcement

News of the formal investigation sent shockwaves through the financial markets. Shares of Snap Inc. plummeted approximately 10% on Thursday, reflecting investor concerns about potential fines and the costs associated with upgrading Snapchat’s safety systems. The regulatory pressure adds to an already challenging year for the company, which has been working to improve its financial performance.

Snapchat has publicly stated its commitment to user safety and its willingness to cooperate with regulators. “Safety is a top priority, and we work closely with experts and regulators to understand and address evolving challenges,” a Snap Inc. spokesperson said in a statement. However, the European Commission appears skeptical, with Henna Virkkunen, Vice-President for Technology Sovereignty, suggesting that Snapchat may have underestimated the stringent security requirements of the DSA.

Broader EU Crackdown on Online Safety

The action against Snapchat is part of a broader EU offensive to enhance online safety, particularly for young users. The Commission simultaneously announced preliminary findings in investigations against several adult content platforms, including Pornhub and XVideos, also focusing on inadequate age verification measures. This coordinated effort underscores the EU’s determination to create a safer digital environment for all citizens.

The potential consequences for Snapchat are significant. Beyond substantial financial penalties, repeated violations of the DSA could lead to a temporary suspension of the platform’s operations across the EU. Experts view this case as a potential precedent, particularly given Snapchat’s unique “ephemeral” nature, which presents specific challenges for content moderation. The outcome of this investigation will likely shape the regulatory landscape for other messaging-focused platforms.

What’s Next for Snapchat and the DSA?

The investigation is expected to be a lengthy process, involving detailed technical audits and requests for internal documentation. Snap Inc. has the opportunity to address the Commission’s concerns by offering “commitments” – concrete steps to improve its safety measures – which could lead to a negotiated settlement. However, if the Commission finds evidence of intentional or negligent disregard for user safety, it could issue a non-compliance decision and impose significant penalties.

In the coming weeks, the Commission will seek detailed information from Snapchat regarding its algorithms, reporting systems, and potential use of “dark patterns” – deceptive design techniques that manipulate users into making choices they might not otherwise make. Snapchat will need to demonstrate that its safety technologies are effective in practice, reliably distinguishing between adults and minors and providing accessible reporting mechanisms for illegal content. The outcome of this process will have a lasting impact on safety standards for social media users across Europe.

Disclaimer: This article provides information about an ongoing legal and regulatory matter. It is not intended as legal advice. Readers should consult with qualified legal professionals for guidance on specific legal issues.

The European Commission is expected to provide an update on the investigation in the coming months. Stay tuned to time.news for further developments on this story.
