Is Facebook contributing to genocide in Myanmar?
Social media giant under scrutiny for spreading hate speech that has fueled persecution of nation's Rohingya Muslim minority
It’s not every day that a country of over 51 million people goes online virtually overnight.
But that’s been the case in Myanmar, which until recently had one of the world’s lowest internet penetration rates but is now largely plugged into the digital age. And as many will attest, Facebook is the internet in Myanmar.
Yet Facebook is also fanning the flames of communal conflict and of violence perpetrated by state security forces and others, violence that United Nations officials say qualifies as “crimes against humanity”, “ethnic cleansing” and potentially even “genocide” against the nation’s Muslim Rohingya minority.
Marzuki Darusman, chair of the United Nations’ Independent International Fact-Finding Mission on Myanmar, recently said that Facebook has played a “determining role” in a humanitarian crisis that has seen over 600,000 Rohingya flee across the border into Bangladesh.
“[Facebook] has … substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public. Hate speech is certainly of course a part of that,” he said.
Facebook’s news feed chief Adam Mosseri recently said on a Slate podcast that Facebook is still working toward finding the right approach to the situation in Myanmar. “Connecting the world isn’t always going to be a good thing … We lose some sleep over this.”
Facebook has provided a platform where rumor, disinformation and misinformation run rife. While the dissemination of hate speech in Myanmar is not limited to social media, with traditional media, movies, cartoons, television and even songs playing a role, Facebook’s reach outstrips traditional distribution models by an order of magnitude.
The recent Cambridge Analytica scandal – in which over 50 million Facebook users’ private data was compromised – signifies a big-picture problem, namely a failure to regulate the business practices of social media and technology companies that have big data at their core.
But in Myanmar, the bigger question is: if the company is profiting from a market, what responsibility does it have to ensure that it doesn’t somehow contribute to bloodshed on a mass scale?
Ten years ago, a SIM card in Myanmar cost somewhere in the vicinity of US$2,500, limiting cell phone access to a narrow elite. But after the state relinquished its effective monopoly on mobile networks, market competition has brought that price down to around US$1.50.
With the subsequent explosion of new mobile users, the state’s previously tight censorship of the digital sphere has been completely eroded.
Facebook accounts can be bought ready-made in mobile phone stores. There are now 30 million accounts registered in Myanmar, although the figure may be slightly overblown as many people are known to have more than one account. Facebook and its Messenger service were previously provided free-of-charge upon entry to the Myanmar market.
Facebook is now positioned to glean deeper insights into the at-large Myanmar population’s cognitive whirring than Special Branch or any bungled census ever was. In a country where hard data is scant after decades of authoritarian military rule, that’s gold dust.
In Myanmar’s resource-constrained setting, Facebook has also been a boon for the government, which uses the platform to issue official statements and disseminate its counter-narratives to mainstream, mostly foreign, media accounts of recent events.
For every conflict that takes place on the ground anywhere in the world these days, there is likely another one playing out in tandem in cyberspace.
To view Facebook in Myanmar (depending, of course, on who your “friends” are) is to wade into a newsfeed where selfies and photos of someone’s lunch might well sit alongside the occasional mutilated corpse.
Whether you believe the portrayed victim was killed by state security forces or an ethnic armed group trying to frame government troops, again, depends on the individual and their circle of online friends.
Much has recently been made of the effects of cognitive bias and the echo chamber on social media worldwide. In Myanmar, there is a strong case to be made that this deeply polarized and highly charged binary narrative has helped, at least in part, to manufacture tacit consent for the expulsion of the Rohingya.
Content on social media, as well as in state and private media, has served to reinforce essentialist ideas about the “other.” Images of death and violence have spread like wildfire over social media in Myanmar. And most users have limited ability to gauge the veracity and significance of what they’re seeing.
Social media’s echo chamber effect, meanwhile, serves to harden views, amplify divisions, and exacerbate siege mentalities. In this way, it’s not simply hate speech that’s the issue: it’s the feedback loop of confirmation bias.
In Myanmar, as elsewhere, people are vulnerable to believing demonstrable falsehoods simply because they fit with preconceived notions of what they think might be true.
Needless to say, this has provided a new frontier for propagandists and spin doctors. A matter not discussed anywhere near enough in the context of the fast spread of hate speech in Myanmar is that it is not always coming from real people.
The use of astroturfing by malignant actors – that is, creating the appearance of a social consensus or reality through posting what appear to be grassroots comments and content – has arguably played at least some role in inciting recent violence, or at least amplifying the level of perceived threat in specific populations.
Just where this is coming from, however, is still anyone’s guess.
During communal riots in Mandalay in 2014, access to Facebook was shut down (by mobile networks, not Facebook itself) in an apparent bid to curb the spread of inflammatory rhetoric, rumors and fake news at a time of heightened tensions and roaming violent mobs.
To some extent, monitoring groups have been able to identify potential flash-points and counter incitement mobilized by hardline elements who use social media.
Facebook recently took the unusual step of deleting the account of Wirathu, a prominent anti-Muslim Myanmar monk. However, account closures haven’t stopped him in the past; he can simply open a new account under a different name or use someone else’s as a proxy.
The monk has said he plans to use YouTube and Twitter to continue disseminating his “nationalist” message.
This is a band-aid solution for a veritable Hydra of a problem. For a private corporation to act as global arbiter of what constitutes free speech and what constitutes hate speech presents obvious concerns – particularly given the company’s track record of opacity. It’s also based on a system that is largely reactive.
Users have the option to flag offensive or objectionable content for assessment. In Myanmar, there is the option to report Facebook posts for nudity, harassment, violent imagery, suicide/self-harm, spam, unwanted sales and hate speech.
Reported posts then enter a murky system whereby deskbound workers – often in the Philippines – sift through them to ascertain whether or not they are in breach of the platform’s terms and conditions.
It’s unclear whether Myanmar, as a country of concern, has been given an elevated focus or is now the subject of more proactive monitoring at Facebook. The company is currently in the process of hiring a Myanmar community operations and market specialist in Dublin, Ireland, and does not have an in-country presence.
In theory, one thing Facebook has the capacity to do is crack down on the creation of fake accounts. It would appear there has been some effort to this end by requiring ID from its users.
It is unclear if the company’s efforts to obtain photo ID from users have been carried out on exactly the same scale everywhere in the world.
However, this presents an obvious problem for activists, whistleblowers, journalists and anyone who thinks this is dangerous overreach for a private company, not to mention those in Myanmar who’ve had their documentation voided, seized or incinerated as they fled their burned-out villages for refugee camps in Bangladesh.
Leaked internal guidance from Facebook has shown that “enthusiasm” and “sadism” form part of the criteria by which content is judged. Hate speech stipulations are made on material being specific and actionable. Video content is considered differently to photos.
Speaking with Slate, Facebook executive Mosseri indicated that building fact-checking into the company’s Myanmar operation was something now under consideration.
How this will play out in practice, however, is an intriguing question. Take, for example, a recent Facebook post from military commander-in-chief Senior General Min Aung Hlaing’s office.
It contained images of a beheaded Rohingya man, hacked-up members of ethnic Rakhine minorities, and murdered Hindu children – all supposedly victims of “bloodthirsty Bengalis” according to a caption on one of the gorier photos.
The government refuses to acknowledge the term Rohingya and refers to the ethnic group instead as “Bengalis” to indicate their supposed origins in neighboring Bangladesh.
The commander-in-chief’s album also contains photos captioned: “Mujahidin conflicts between the two communities in 1942. 30,000 of ethnic Rakhine were killed and thousands of people fled from their native villages.”
A simple search, however, reveals the photos are not from 1942 but from 1971, during Bangladesh’s bloody liberation war. In its original form, one photo’s caption identifies the dead as Bengalis killed by Pakistani forces.
The Myanmar government has been quick to point out supposed “fake news” reported by media outlets that have highlighted state atrocities against the Rohingya, but in this instance it has propagated fake news itself.
If Facebook were to fact-check content proactively, what implications would this have?
Myanmar’s parliament recently approved a US$4.8 million budget to monitor the internet for people who use it to “harm the stability” of the country. If the budget is honestly used to crack down on online incitement, then that could be a positive development.
However, given the case of two Reuters journalists now jailed for over 100 days under the draconian Official Secrets Act for their reporting on atrocities in Rakhine state, there are obvious concerns over how authorities will choose to define “harming stability.”
At the same time, how Facebook handles the myriad issues it faces in Myanmar also deserves greater transparency. Without doubt, Myanmar’s people need to be better-informed digital citizens.
But in an environment where digital literacy is low and critical thinking was not fostered by the education system under decades of military rule, many people lack the inclination or ability to fact-check. This is not, to be sure, just a problem in Myanmar.
People the world over need to learn how to distil the barrage of noise they face daily over social media, and understand how it affects them physically, mentally and emotionally. Facebook users need to question why they’re seeing what they’re seeing.
Supplying Facebook’s users with the means to critically assess information – whether it comes from activists, rebel groups or the government – is what will ultimately make social media a tool for spreading democracy rather than hate, death and destruction in Myanmar.