On TikTok, videos relating to suicide and eating disorders highlighted by the algorithm

by time news

The algorithm of the social network TikTok is called into question in a report published Wednesday, December 14, by the Center for Countering Digital Hate (CCDH) in the United States. The center’s research demonstrates how harmful content, including videos relating to self-harm and eating disorders, is recommended by the social network’s algorithm to its young users.

To highlight the risks the social network poses to young people’s mental health, the American non-profit organization conducted a real-world experiment. CCDH researchers opened fake profiles of thirteen-year-old teenagers in the United States, the United Kingdom, Canada and Australia, using profile details suggesting a particular vulnerability of these adolescents to eating disorders, including, for example, the words “losing weight”.

The researchers then brought these accounts to life by “liking” videos on these harmful subjects, to see whether TikTok’s algorithm would respond in kind. Within two minutes and six seconds of joining the platform, TikTok’s algorithm was recommending suicide-related videos to them (razor blades, discussions of suicide and self-harm, and so on). Within eight minutes, it also suggested content about weight loss and eating disorders.


Opacity around how the algorithms operate

“It’s like being stuck in a room of distorted mirrors where you’re constantly being told you’re ugly, you’re not good enough, maybe you should kill yourself,” said the president of the center, Imran Ahmed, whose organization has offices in the United States and the United Kingdom. He added that the social network “literally sends the most dangerous possible messages to young people.”

The problem highlighted by the experiment lies mainly in how the algorithms that govern social networks operate. They work by identifying the subjects and content that interest a user: the more often a user views such content, the more similar suggestions they receive.

The Chinese social network is normally off-limits to users under the age of thirteen, and its official rules prohibit videos that encourage eating disorders or suicide. The study exposes failures on both counts.

Faced with this circular and opaque way the algorithms operate, adolescents and children are the most vulnerable, because they spend more time on social networks, face very strong peer pressure and see harmful content proliferate there, according to Josh Golin, executive director of the NGO Fairplay, which advocates more regulation of online content in order to protect children. “All of these harms are linked to the business model” of social networks, according to Mr. Golin, regardless of the platform.


TikTok challenges study results and methodology

TikTok disputed the study’s findings and questioned its methodology in a statement released shortly after the report was published. In its statement, the platform notes that the researchers did not use it the way typical users would, and therefore claims that the results were skewed. The company also said that a user’s account name does not affect the type of content they receive.

“We regularly consult with health experts, remove violations of our policies and provide access to support resources for anyone who needs them,” says the statement from TikTok, which is owned by ByteDance Ltd, a Chinese company now headquartered in Singapore.

US users who search for eating disorder content on TikTok normally receive a message offering mental health resources and contact information for the National Eating Disorder Association.

Despite these efforts by the platform, however, CCDH researchers found that eating-disorder content had been viewed billions of times: fifty-five hashtags on the subject accounted for more than 13 billion views. They also noticed that young users were using coded language about eating disorders in order to evade content moderation.


A bill for more regulation under consideration in the US Congress

“The amount of harmful content offered to teenagers on TikTok shows that self-regulation has failed,” said Mr. Ahmed, calling for federal rules to be introduced in the United States to force platforms to do more to protect children.
He also noted that the version of TikTok offered to Chinese audiences is more tightly regulated: it is designed to promote math and science content to younger users, and it limits the time 13- and 14-year-olds can spend on the social network each day.


In the United States, a bill has been introduced in Congress to impose new rules limiting the data that social media platforms can collect about young users. It would create a new office within the Federal Trade Commission tasked with protecting the privacy of young social media users.

One of the bill’s sponsors, Massachusetts Democratic Sen. Edward Markey, said Wednesday he was hopeful lawmakers from both parties could agree on the need for tougher regulations. “Data is the raw material big tech uses to track, manipulate and traumatize our nation’s youth every day,” he said.

Le Monde with AP
