Issued by CEMO Center - Paris

‘Your profit fuelled genocide’: Rohingya sue Facebook for £150bn

Monday 06/December/2021 - 05:37 PM
The Reference

Facebook contributed to the 2017 genocide of Rohingya Muslims by allowing hate speech against the persecuted minority to be propagated in Myanmar, according to a legal action launched this morning in Britain and the United States.

Lawyers for victims of the genocide are demanding more than £150 billion in compensation in one of the largest group claims for victims of a crime against humanity brought before a domestic court anywhere in the world.

They allege that Facebook’s algorithms promoted and amplified hate speech against the Rohingya, who live in the far west of Myanmar and are regarded with racist contempt by many among the majority Buddhist population.

They say that, despite admitting to shortcomings in its monitoring of anti-Rohingya content, Facebook failed to employ moderators capable of reading the Burmese or Rohingya languages or of understanding Myanmar’s fraught political landscape.

They accuse the company of failing to remove posts inciting violence or to shut down pages that propagated hate speech, despite repeated warnings since 2013 from human rights groups and media reports that such content was adding to the explosive situation in Rakhine state.

“At the core of this complaint is the realisation that Facebook was willing to trade the lives of the Rohingya people for better market penetration in a small country in southeast Asia,” a complaint submitted to the Northern District Court in San Francisco says.

“Facebook is like a robot programmed with a singular mission: to grow. And the undeniable reality is that Facebook’s growth, fuelled by hate, division, and misinformation, has left hundreds of thousands of devastated Rohingya lives in its wake.”

A letter of notice submitted to Facebook UK on Monday in advance of a formal claim this month says: “Our clients . . . have been subject to acts of serious violence, murder and/or other grave human rights abuses perpetrated as part of the campaign of genocide and crimes against humanity conducted in Myanmar … As has been widely recognised and reported, this campaign was fomented by extensive material published on and amplified by the Facebook platform.”

In August 2017 the Myanmar security forces responded to small-scale violence by Rohingya militants with deadly attacks on villages in Rakhine state. Witnesses consistently described shocking violence as Rohingya were forced into Bangladesh, where they now live in wretched conditions in the world’s biggest refugee camps.

A UN report in 2018 described how mothers were gang raped in front of young children, and girls as young as 13, as well as pregnant women, were raped, some of them with sticks and knives.

Survivors have recounted how Myanmar soldiers and local Buddhist civilians shot, stabbed and burnt men, women and children, burying them in mass graves. Some used acid to dissolve the faces of the dead in what appears to have been deliberate attempts to prevent identification.

The campaign of violence was described by the UN human rights chief as “a textbook example of ethnic cleansing” and the organisation’s special envoy on human rights in Myanmar has said that the expulsion of the Rohingya bears “the hallmarks of a genocide”. A UN report concluded that an estimate of 10,000 dead was conservative.

Tension between the Rohingya and other ethnic groups in Myanmar goes back decades, but it began to worsen at around the time that Facebook launched there in 2011. Facebook's near monopoly in the country has allowed it to be used to spread disinformation, lies and fake images, including a large volume of material targeting the Rohingya.

“We must fight them the way Hitler did the Jews, damn kalars!” one user wrote, in a post that was highlighted by the Reuters news agency and was still accessible in August. “Kalar” is a racially offensive term applied to Rohingya.

Another post read: “These non-human kalar dogs, the Bengalis, are killing and destroying our land, our water and our ethnic people. We need to destroy their race.” A third urged: “Pour fuel and set fire so that they can meet Allah faster.”

The US claim filed in court says: “The scope and violent nature of . . . persecution changed dramatically in the last decade, turning from human rights abuses and sporadic violence into terrorism and mass genocide. A key inflection for that change was the introduction of Facebook into Burma in 2011, which materially contributed to the development and widespread dissemination of anti-Rohingya hate speech, misinformation, and incitement of violence — which together amounted to a substantial cause, and perpetuation of, the eventual genocide.”

In 2018 Facebook’s founder, Mark Zuckerberg, told American senators that the company was hiring dozens more Burmese speakers. But even today, many expressions of hate are still openly posted. “Kalars are kalars and they will never become Burmese”, the Facebook user Aung Myo Naing wrote last week.

Facebook has not yet responded to a request for comment on the legal action.
