“Facebook has played a central role in the rise of the climate of hatred” in Burma

by time news

In a report published Thursday, September 29, Amnesty International dissects the catalytic role that Facebook played in the 2017 massacres of the Rohingya in Burma, abuses that drove 700,000 of them to flee to Bangladesh. In this damning document, the British NGO describes a company that disregards human rights. It therefore asks Meta, Facebook's parent company, to compensate the refugees, many of whom still live in Bangladeshi camps, and launched an online petition on Thursday to put pressure on the American company.

By tracing the thread of events that led to the disaster, and by interviewing many Rohingya, Amnesty demonstrates Facebook's inaction in the face of the surge of racist and violent posts targeting the Muslim minority, well before 2017. It points to the social network's weak moderation and denounces its tendency to promote hateful messages.

Lawyer and researcher Patrick de Brun, author of the report, depicts a company fully aware of the harm it caused to public debate, drawing in particular on the documents brought to light by the whistleblower Frances Haugen.


The 2017 massacres were orchestrated by the Burmese military. Why do you consider that Facebook bears a share of the responsibility?

Our investigation shows that Facebook played a central role in the rise of the climate of hatred from 2012 onwards. For many Burmese, the social network was the only accessible Internet service, and Facebook considerably amplified violent messages there. Some actors sought to dehumanize the Rohingya, which had the effect of disinhibiting the soldiers on the ground who carried out the ethnic cleansing and mass rapes.

How does Facebook’s algorithm actually favor hate messages?

By choosing which posts to display in each user's feed, it highlights content that generates many reactions, whether those reactions are positive or negative, which is precisely what hateful content provokes. In Burma, Facebook gave such messages an unprecedented audience.

How could Facebook's moderators let clear calls for murder circulate, even after these posts were reported to them?

This is symptomatic of Facebook's approach in countries of the Global South: the staff dedicated to moderation is very limited there, as are the resources invested in the artificial intelligence and linguistic tools responsible for identifying hate speech. Yet even if Facebook had devoted as many resources to Burma as it does in the West, we doubt it would have been enough. Moderation is an exercise Facebook struggles with everywhere in the world.


Was Facebook aware of the risks in Burma?

The company received dozens of warnings. Local actors were quick to report campaigns of hate messages orchestrated by the military and its allies, such as Ma Ba Tha, the Association for the Protection of Race and Religion. Year after year, human rights activists and researchers from Western universities issued new alerts, going so far as to visit Facebook's headquarters. Their message was the following: if nothing is done, Burma could become the next Rwanda. According to several Facebook employees we interviewed, it is impossible that these warnings did not reach the upper echelons of the company.

How, then, can the company’s inaction be explained?

During those years, Facebook became an incredibly rich company whose interest lay in prioritizing its growth, even at the expense of human rights. Its culture was "move fast and break things". When it arrived in a country riven by particularly sensitive ethnic conflicts, the company neither studied the risks induced by its practices nor tested its algorithms.

The only time it acted was in 2014. After several days of deadly riots and several official calls for more effective moderation, the government ended up shutting down Facebook in Burma. With the social network's revenue under threat, the company suddenly woke up and sought to do something, or at least to show it was doing something. At the suggestion of local activists, it created a pack of virtual flower stickers. Peace activists could slip them into hateful conversations in the hope of calming them down. But, according to Rohingya we interviewed, this had the opposite effect: Facebook's algorithm increased the circulation of these violent messages.


What should be done to prevent these excesses of the algorithm?

The company argues that improving moderation is enough to correct the problem, but, from what we have observed, that is far from sufficient: it is the algorithm that must change, since it is not neutral, by Facebook's own admission. Yet its recipe lies at the heart of the company's economic model.

To force Facebook to change it, we see only one solution: state regulation. In Facebook's app, for example, the feed of viral content should be optional, not displayed by default. This is all the more crucial as hate speech continues to thrive on the platform, in Burma as well as in India and Ethiopia. The future worries us, since Facebook is currently accelerating the circulation of the most viral videos in order to compete with TikTok.


Is there a chance that Facebook will agree to compensate the Rohingya?

Its legal responsibility seems clear to us. Several Rohingya groups have filed well-founded lawsuits in the United Kingdom and the United States. The majority of the refugees remain confined in camps in Bangladesh, in deplorable conditions; they are even denied access to education. Facebook's money could be used to remedy this. But when I speak with them, these refugees also tell me that they want things to change fundamentally, so that such a tragedy is never repeated.

