TikTok, 320 million videos blocked in 2021: what worked and what didn’t

by time news

Time.news – TikTok has to contend with videos that violate its rules. Across illegal activity, nudity, harassment and bullying, in 2021 it blocked more than 320 million pieces of content, 85.8 million of them between October and December alone. Compared with the same period of the previous year, blocked videos nearly doubled and today account for around 1% of all uploads.

The data, which comes from TikTok’s “Community Guidelines Enforcement Report”, does not necessarily mean that there is more prohibited content on the platform. In all likelihood, the ability to intercept it has increased: the company attributes the growth “to continuous improvements in recognition and in audio and voice detection” and to “proactive detection”.

One in three videos blocked automatically

Between October and December 2021, 90% of the videos were in fact deleted before they were viewed even once (it was just over 80% the year before), and 94% were removed within 24 hours of publication. In particular, TikTok’s Head of Trust & Safety, Cormac Keenan, explains that, compared to the beginning of the year, “content removals with zero views grew by 14.7% for harassment and bullying, by 10.9% for hateful behavior, by 16.2% for violent extremism and by 7.7% for suicide, self-harm and dangerous acts”.

Like other platforms, TikTok mixes human moderation and automated controls. The former remains fundamental, but “proactivity” largely depends on automation. The data show this too. Between January and March of last year, software removed about 14% of the videos that violated the platform’s rules; between October and December, the figure reached 33%. One in three.

How often TikTok gets it wrong

Beyond the overall numbers, however, accuracy must also be measured. There are the so-called “false positives”, i.e. videos removed by mistake. In TikTok’s case, about one in twenty removed videos is reinstated after an appeal. The share is a minority, but it remains significant. And most importantly, it has not shown the same progress made elsewhere: the error rate was 4.5% in the first quarter of 2021 and 5.5% in the fourth. It is one of the parameters to watch in the coming months, because it measures the reliability of moderation and, in particular, the evolution of the automated kind.

Propaganda and hatred: the gray areas

There is another challenge, perhaps the most complex. TikTok’s report states that 45% of the videos were removed because they put the safety of minors at risk, 19.5% because they related to illegal activities, 11% for nudity or sexual activities, 8.5% for violent or graphic content, 7.4% because they related to suicide and self-harm, and 5.7% for bullying or harassment. The vast majority of removals therefore concern topics where it is clear what is permitted and what is not, where it is relatively easy to draw the line between the allowed and the forbidden. It is much more complicated (for everyone, from Facebook to Twitter to YouTube) to do so when the boundary between rules and freedom of expression is blurrier.

And this is where TikTok will have to push harder: only 1.5% of the removed content was blocked for “hateful behavior”, 0.8% for “violent extremism” and 0.6% for “integrity and authenticity” problems. As the war in Ukraine is demonstrating, TikTok is no longer (only) the social network for kids and frivolous content: it is an all-round communication platform, where information and propaganda, news and hoaxes, documents and fakes, analysis and rants all circulate. Its integrity will depend on its ability to distinguish between them.
