Facebook helped fuel hate campaign against Rohingyas

Saleem Samad
Published : 17:56, Nov 27, 2019 | Updated : 18:04, Nov 27, 2019

Here is how Facebook, the platform that dominates social media, has failed vulnerable communities, particularly through the hate campaigns and fake news directed at the Rohingyas and at the so-called illegal Muslim Bangalees in Assam.
A study claimed that Facebook's moderation team failed to delete 93 percent of reported posts containing speech that violated the platform's own “Community Standards.”
The most serious allegation comes from India, where Facebook failed to delete hundreds of memes, images, and posts targeting caste, LGBT, and religious minorities.
The posts demonized Rohingya Muslims, the minority group targeted for persecution in Myanmar. Bangla-speaking Muslims in the neighbouring Indian state of Assam also fell victim to the wrath of ruling-party henchmen.
Social media research unit Equality Labs found that 93 percent of the posts reported to Facebook that contained speech violating the organisation's own rules remained on the platform.
Facebook has failed to halt the persecution of Rohingyas fleeing Myanmar to Bangladesh. Analysis by BuzzFeed News sheds new light on Facebook's failures on the Rohingya issue.
The United Nations calls Myanmar’s treatment of the Muslim Rohingya minority a genocide and says Facebook has done little to tackle hate speech.
Lawmakers from Myanmar’s Arakan (Rakhine) state regularly posted hateful content about the persecuted Rohingya minority on Facebook. In some cases, these posts explicitly called for violence in the lead-up to the atrocities of the military campaign of ethnic cleansing that began in August 2017.
Posts by members of Rakhine state’s parliament compared Rohingya to dogs, claimed that Muslim women were too ugly to rape, falsely stated that Rohingyas torched their own houses for payouts from NGOs, and accused Muslims of seeking to become the dominant group in the state by having too many children.
Bangladesh now hosts over 1.1 million Rohingyas after more than 700,000 fled Myanmar following an insurgent attack in August 2017 that triggered a military crackdown, which the UN says constituted ethnic cleansing. File Photo/Reuters
As the Rohingya crisis worsened in Arakan, the analysis showed that Facebook took no action for months, and in some cases years. The platform finally removed many of the posts only after BuzzFeed News sent the links to the company.
UN investigators in a damning report also took Facebook to task, describing it as a “useful instrument for those seeking to spread hate,” adding that the company’s response to concerns about its role had been “slow and ineffective.”
In Myanmar, as elsewhere, Facebook admitted shortcomings after its policies were cited as exacerbating ethnic cleansing, and promised to reform its processes by hiring more content moderators.
A hate campaign targeting Bangla-speaking Muslims in the Indian state of Assam also went viral on Facebook, even as India’s government launched a controversial programme to crack down on people it said had immigrated illegally from Bangladesh.
The report, titled “Megaphone for Hate” and released by Avaaz, a nonprofit social media activist network, found serious failures by the popular social media platform.
The Avaaz review found comments and posts that called Bangalee Muslims “pigs,” “terrorists,” “dogs,” “rapists,” and “criminals,” seemingly in violation of Facebook’s standards on hate speech.
Facebook once prided itself on being a largely neutral platform for content. But that stance has come under pressure amid calls from the UN and other groups for the company to take greater responsibility for what users post, especially calls for violence.
In response to BuzzFeed News, a Facebook spokesperson said the company respects and seeks to protect the rights of marginalized communities in India and elsewhere, and pointed to its rules against hate speech.
Facebook said that it proactively deletes almost all problematic content on its platform before it is reported, but that hate speech is tougher to recognize because of linguistic and cultural context. Many fear, however, that its efforts are too little, too late.
Equality Labs also says Facebook’s staff lacks the diversity that would enable it to moderate hate speech targeting minority groups.
Meanwhile, Facebook founder Mark Zuckerberg has publicly stated that he hopes to automate a substantial part of the company’s content moderation process using artificial intelligence tools, a statement echoed by company officials in interviews with BuzzFeed News.
NGOs and free speech defenders say it is hard to imagine that intelligent machines will achieve the kind of linguistic and cultural understanding needed to combat these forms of speech.
Saleem Samad is an independent journalist and media rights defender, an Ashoka Fellow (USA) and a recipient of the Hellman-Hammett Award. He can be reached at [email protected].

***The opinions, beliefs and viewpoints expressed in this article are those of the author and do not reflect the opinions and views of Bangla Tribune.