“White attacks Black” and YouTube blocks the channel. Too bad the topic was chess. It happened to the Croatian player Antonio Radic, known by the online pseudonym Agadmator, owner of the most popular chess channel on YouTube, with over one million subscribers. The block lasted only 24 hours, and according to a study reported by the Independent, the “black” and “white” references in the videos may have confused AI algorithms trained to detect racism and other hate speech.
Investigating the AI glitch were computer scientists Ashique R. KhudaBukhsh and Rupak Sarkar of Carnegie Mellon University. “We don’t know what tool YouTube is using, but if they rely on artificial intelligence to detect racist language, this kind of incident can happen,” KhudaBukhsh explained.
According to the study, any AI moderation system like YouTube’s can misinterpret comments if it is not properly trained to understand their context.
To test this theory, the two researchers used state-of-the-art speech classifiers to screen more than 680,000 comments collected from five popular chess YouTube channels. After manually reviewing a selection of 1,000 comments that the AI had classified as hate speech, they found that 82% had been misclassified because of words like “black”, “white”, “attack” and “threat”, all common in the language of chess. The paper was presented this month at the annual conference of the Association for the Advancement of Artificial Intelligence.
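The failure mode the researchers describe, a classifier reacting to isolated words such as “black”, “white”, “attack” and “threat” without regard to context, can be illustrated with a deliberately naive, hypothetical keyword filter. This sketch is not YouTube’s actual system, and the trigger list and function names are invented for illustration:

```python
# Hypothetical, deliberately naive keyword-based filter: it flags any
# comment containing a trigger word, with no understanding of context.
FLAGGED_TERMS = {"black", "white", "attack", "threat"}  # invented trigger list

def naive_flag(comment: str) -> bool:
    """Return True if the comment contains any trigger word."""
    words = set(comment.lower().split())
    return bool(words & FLAGGED_TERMS)

chess_comments = [
    "white attacks the black king on g8",   # ordinary chess commentary
    "the threat against black is decisive", # ordinary chess commentary
    "great endgame technique in this video",
]

for c in chess_comments:
    print(c, "->", "flagged" if naive_flag(c) else "ok")
# The first two innocuous chess comments are flagged; only the third passes.
```

A context-aware model would instead weigh the surrounding words (kings, pawns, endgames) before deciding, which is precisely the training-data gap the Carnegie Mellon study points to.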