Myanmar: Facebook’s business model profits from ‘echo chamber of hatred’ which fuelled Rohingya atrocities - new report

Published By Amnesty International UK [English], Wed, Sep 28, 2022 9:09 AM


Inflammatory content keeps Facebook users on the platform longer, meaning more targeted advertising can be sold, creating a perverse incentive to spread hate

The multibillion-dollar giant needs to pay compensation to the Rohingya

Amnesty calls on Meta to fundamentally change its business model and algorithms

‘While the Myanmar military was committing crimes against humanity against the Rohingya, Meta was profiting from the echo chamber of hatred created by its hate-spiralling algorithms’ - Agnès Callamard

Facebook owner Meta’s dangerous algorithms and reckless pursuit of profit substantially contributed to the atrocities perpetrated by the Myanmar military against the Rohingya people in 2017, Amnesty International said in a new report published today.

The report, The Social Atrocity: Meta and the right to remedy for the Rohingya, details how Meta knew that Facebook’s algorithmic systems were supercharging the spread of harmful anti-Rohingya content in Myanmar, but the company still failed to act.

The Rohingya are a predominantly Muslim ethnic minority based in Myanmar’s northern Rakhine State. In August 2017, more than 700,000 Rohingya fled Rakhine when the Myanmar security forces launched a targeted campaign of widespread and systematic murder, rape and burning of homes. The violence followed decades of state-sponsored discrimination, persecution, and oppression against the Rohingya that amounts to apartheid.

Meta uses engagement-based algorithmic systems to power Facebook’s news feed, ranking, recommendation and groups features, shaping what is seen on the platform. Meta profits when Facebook users stay on the platform as long as possible, because it can sell more targeted advertising. The display of inflammatory content – including content that advocates hatred, constituting incitement to violence, hostility and discrimination – is an effective way of keeping people on the platform longer. The promotion and amplification of this type of content is key to Facebook’s surveillance-based business model.
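To make the mechanism concrete, the minimal sketch below illustrates, in Python, how a purely engagement-based ranker ends up promoting whatever content draws the most reactions, shares and comments. It is a hypothetical toy example, not Meta’s actual ranking code: the Post fields, the scoring weights and the sample posts are all illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    reactions: int   # likes, anger reacts, sticker replies: all counted alike
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: every interaction raises the score; the ranker
    # has no notion of whether the content advocates hatred.
    return 1.0 * post.reactions + 3.0 * post.shares + 2.0 * post.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest engagement first: the content people react to most is
    # exactly the content that gets amplified further.
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("Local weather update", reactions=40, shares=2, comments=5),
        Post("Inflammatory rumour about a minority group", reactions=900, shares=300, comments=450),
        Post("Community fundraiser", reactions=120, shares=15, comments=30),
    ])
    for post in feed:
        print(f"{engagement_score(post):>7.1f}  {post.text}")

Because the score rewards any interaction, outrage and approval are indistinguishable to a ranker of this kind, which is the dynamic the report describes.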

Agnès Callamard, Amnesty International’s Secretary General, said:

“In 2017, the Rohingya were killed, tortured, raped, and displaced in the thousands as part of the Myanmar security forces’ campaign of ethnic cleansing. In the months and years leading up to the atrocities, Facebook’s algorithms were intensifying a storm of hatred against the Rohingya which contributed to real-world violence.

“While the Myanmar military was committing crimes against humanity against the Rohingya, Meta was profiting from the echo chamber of hatred created by its hate-spiralling algorithms.

“Meta must be held to account. The company now has a responsibility to provide reparations to all those who suffered the violent consequences of their reckless actions. Meta risks contributing to further serious human rights abuses, unless it makes fundamental changes to its business model and algorithms.”

In the months and years before the crackdown, Facebook in Myanmar became an echo chamber of anti-Rohingya content. Actors linked to the Myanmar military and radical Buddhist nationalist groups flooded the platform with anti-Muslim content, posting disinformation that claimed an impending Muslim takeover and portraying the Rohingya as “invaders”.

In one post shared more than 1,000 times, a Muslim human rights defender was pictured and described as a “national traitor”. The comments included such threatening and racist messages as ‘He is a Muslim. Muslims are dogs and need to be shot’, and ‘Don't leave him alive. Remove his whole race. Time is ticking’.

Content inciting violence and discrimination went to the very top of Myanmar’s military and civilian leadership. Senior General Min Aung Hlaing, the leader of Myanmar’s military, posted on his Facebook page in 2017: “We openly declare that absolutely, our country has no Rohingya race.” He went on to seize power in a coup in February 2021.

In 2014, Meta attempted to support an anti-hate initiative known as ‘Panzagar’ or ‘flower speech’. It created a sticker pack for Facebook users to post in response to content advocating violence or discrimination. The stickers had messages such as, ‘Think before you share’ and ‘Don’t be the cause of violence’.

But activists soon noticed that Facebook’s algorithms interpreted the stickers as a sign that people were enjoying a post and began promoting those posts, increasing the number of people who saw them.

The UN’s Independent International Fact-Finding Mission on Myanmar ultimately concluded that the “role of social media [was] significant” in the atrocities in a country where “Facebook is the Internet”.

Mohamed Showife, a Rohingya activist, said: “The Rohingya just dream of living in the same way as other people in this world… but you, Facebook, you destroyed our dream.”

The report details how Meta repeatedly failed to carry out appropriate human rights due diligence on its operations in Myanmar, despite its responsibility under international standards.

Internal studies dating back to 2012 indicated that Meta knew its algorithms could result in serious real-world harm. In 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.

Between 2012 and 2017, Myanmar civil society activists repeatedly warned Meta, through communications and visits, that the company risked contributing to extreme violence. In 2014, the Myanmar authorities even temporarily blocked Facebook because of the platform’s role in triggering an outbreak of ethnic violence in Mandalay. Meta repeatedly failed to heed these warnings, and consistently failed to enforce its own policies on hate speech.

Amnesty’s investigation includes analysis of new evidence from the ‘Facebook Papers’ – a cache of internal documents leaked by whistleblower Frances Haugen.

Amnesty is today launching a new campaign calling for Meta Platforms, Inc. to meet the Rohingya’s demands for remediation.

Rohingya refugee groups have made direct requests to Meta to provide compensation by funding a US $1 million education project in the refugee camp in Cox’s Bazar, Bangladesh. This represents just 0.002% of Meta’s $46.7 billion profit from 2021. In February 2021, Meta rejected the Rohingya community’s request, stating: “Facebook doesn’t directly engage in philanthropic activities.”

Showkutara, a 22-year-old Rohingya woman and youth activist, told Amnesty International: “Facebook must pay. If they do not, we will go to every court in the world. We will never give up in our struggle.”

There are at least three active complaints against Meta seeking remediation for the Rohingya. Civil legal proceedings were filed against the company in December 2021 in both the United Kingdom and the USA.

Press release distributed by Media Pigeon on behalf of Amnesty International UK on Sep 28, 2022.


Press Office

[email protected]
+44 (0) 20 7413 5566