The artificial intelligence (AI) systems that Facebook uses to moderate content on its social network cannot effectively detect and remove hate speech and graphic violence, The Wall Street Journal (WSJ) reported, citing internal documents reflecting the views of the company's engineers. Nevertheless, as early as 2019 the company cut the time human moderators could spend reviewing complaints about such content, which inflated the apparent role of AI in Facebook's public statistics, the WSJ notes.
According to the WSJ, Facebook's AI cannot reliably identify racist statements or videos filmed by the perpetrators of shootings. In one case, the AI could not tell videos of car crashes apart from cockfighting footage, a problem that puzzled engineers for weeks. "The problem is that we cannot, and probably never will be able to, build a model that would identify even a majority of violations, especially in sensitive areas," the newspaper quotes a Facebook employee as writing in a mid-2019 report.
Facebook employees also noted problems with recognizing hate speech across different languages. In particular, at the end of 2020 the company considered building such a filter for Arabic, but it lacked sufficient data on the language's various dialects and struggled even with Standard Arabic.
Notably, in October, Dmitry Khokhlov, head coach of the Volgograd football club Rotor, said that Facebook interprets his surname as a slur for Ukrainians and blocks posts that mention it. Mr. Khokhlov filed a 150 million ruble lawsuit against Facebook in which, among other things, he asked the court to prohibit Facebook from "committing certain actions against Khokhlov". Facebook users have reported posts containing the word "khokhol" being blocked since at least 2015.
For the WSJ's other investigations into Facebook's problems, see Kommersant's article "Facebook has been torn off the cover."