Australian regulators have launched an investigation into Meta (Facebook and Instagram), TikTok, Snapchat, and Google’s YouTube over concerns they are failing to adequately protect young users from harmful online content. The probe, announced Tuesday by the eSafety Commissioner, centers on potential breaches of age verification requirements and the exposure of children to inappropriate material. The safety of young users on social media platforms is a growing global concern.
The eSafety Commissioner’s office stated the investigation will examine how these platforms assess the age of users and the measures they take to prevent children from accessing content that is harmful or age-inappropriate. The inquiry comes amid increasing scrutiny of the impact of social media on children’s mental health and well-being, and follows similar actions taken by other international regulators. The focus is on whether the platforms are complying with Australian laws designed to protect children online.
According to a press release from the eSafety Commissioner, the investigation will specifically look at the platforms’ compliance with the Online Safety Act 2021, which requires social media services to provide a safe online environment for Australian users, including children. The Act grants the eSafety Commissioner broad powers to investigate and enforce online safety standards.
What Prompted the Investigation?
The investigation was triggered by a range of concerns, including reports of children being exposed to content promoting self-harm, eating disorders, and cyberbullying. There has also been growing criticism of the platforms’ ability to effectively verify users’ ages, allowing children to create accounts despite age restrictions. The eSafety Commissioner has previously issued warnings to social media companies about their responsibilities to protect young users, but these warnings appear to have been insufficient to prompt meaningful change.
The Australian government has been increasingly vocal about the need for greater accountability from social media companies. Minister for Communications, Michelle Rowland, has emphasized the importance of protecting children online and has signaled her support for the eSafety Commissioner’s investigation. “The safety of our children is paramount, and we will not hesitate to take action against companies that fail to meet their obligations,” she said in a statement.
Which Platforms Are Under Scrutiny?
The investigation encompasses four major social media platforms: Meta’s Facebook and Instagram, TikTok, Snapchat, and Google’s YouTube. Each platform faces similar questions regarding their age verification processes and content moderation policies. The eSafety Commissioner will be seeking information from each company about how they identify and remove harmful content, and how they prevent children from accessing it.
Meta (Facebook & Instagram): These platforms have faced repeated criticism for allowing harmful content to proliferate, particularly content targeting young users. Concerns include exposure to unrealistic beauty standards, cyberbullying, and predatory behavior. Meta has outlined several initiatives aimed at protecting teens, but regulators are questioning their effectiveness.
TikTok: The short-form video platform has exploded in popularity among young people, but it has also been linked to concerns about harmful challenges, inappropriate content, and data privacy. TikTok has implemented age restrictions and content moderation policies, but critics argue they are not robust enough.
Snapchat: Known for its disappearing messages, Snapchat has raised concerns about its potential to facilitate cyberbullying and the sharing of inappropriate content. The platform has introduced features designed to enhance privacy and safety, but regulators are examining whether these measures are sufficient.
YouTube: As the world’s largest video-sharing platform, YouTube hosts a vast amount of content, some of which is unsuitable for children. YouTube Kids is a separate app designed for younger audiences, but concerns remain about children accessing inappropriate content on the main YouTube platform.
What Happens Next?
The eSafety Commissioner has the power to compel the social media companies to provide information and documents related to the investigation. If the investigation finds that the platforms have breached the Online Safety Act, the eSafety Commissioner can issue enforceable undertakings, require them to implement specific safety measures, or even impose financial penalties. The maximum penalty for a breach of the Act is A$10 million (approximately US$6.5 million).
The investigation is expected to take several months to complete. The eSafety Commissioner has not provided a specific timeline, but has indicated that she will provide regular updates on the progress of the inquiry. The findings of the investigation could have significant implications for the way social media platforms operate in Australia and could set a precedent for other countries grappling with the challenges of online safety.
Stakeholders, including parents, child safety advocates, and the social media companies themselves, will be closely watching the outcome of this investigation. The results could lead to stricter regulations and increased accountability for social media platforms, ultimately aiming to create a safer online environment for young people. Updates on the investigation will be posted on the eSafety Commissioner’s website: https://www.esafety.gov.au/.
Disclaimer: This article provides information for general knowledge and informational purposes only, and does not constitute legal advice.
What do you think about the Australian investigation into social media platforms? Share your thoughts in the comments below, and please share this article with anyone who might find it useful.
