A US federal judge has ordered Facebook to release records of accounts connected to anti-Rohingya violence in Myanmar that the social media giant had shut down, rejecting its argument about protecting privacy as "rich with irony".
The judge in Washington, DC, on Wednesday criticised Facebook for failing to hand over information to investigators seeking to prosecute the country for international crimes against the Rohingya Muslim minority, according to a copy of the ruling, Reuters reports.
Facebook had refused to release the data, saying it would violate a US law barring electronic communication services from disclosing users' communications.
But the judge said the posts, which were deleted, would not be covered under the law and not sharing the content would "compound the tragedy that has befallen the Rohingya".
"Facebook taking up the mantle of privacy rights is rich with irony. News sites have entire sections dedicated to Facebook's sordid history of privacy scandals," he wrote.
A spokesperson for Facebook said the company was reviewing the decision and that it had already made "voluntary, lawful disclosures" to another UN body, the Independent Investigative Mechanism for Myanmar.
More than 730,000 Rohingya Muslims fled Myanmar's Rakhine state in August 2017 after a military crackdown that refugees said included mass killings and rape. Rights groups documented killings of civilians and the burning of villages.
Myanmar authorities say they were battling an insurgency and deny carrying out systematic atrocities.
The crackdown by the army, during the rule of Nobel laureate Aung San Suu Kyi's civilian government, did not generate much outcry in the Buddhist-majority nation.
The Gambia wants the data for a case against Myanmar it is pursuing at the International Court of Justice (ICJ) in The Hague, accusing Myanmar of violating the 1948 UN Convention on Genocide.
In 2018, UN human rights investigators said Facebook had played a key role in spreading hate speech that fuelled the violence.
An investigation that year found more than 1,000 examples of hate speech on Facebook, including calling Rohingya and other Muslims dogs, maggots and rapists, suggesting they be fed to pigs, and urging they be shot or exterminated.
Facebook said at the time it had been "too slow to prevent misinformation and hate" in Myanmar.
In Wednesday's ruling, US magistrate judge Zia M. Faruqui said Facebook had taken the first step by deleting "the content that fueled a genocide" but had "stumbled" by not sharing it.
"A surgeon that excises a tumour does not merely throw it in the trash. She seeks a pathology report to identify the disease," he said.
"Locking away the requested content would be throwing away the opportunity to understand how disinformation begat genocide of the Rohingya and would foreclose a reckoning at the ICJ."
Shannon Raj Singh, human rights counsel at Twitter, called the decision "momentous" and "one of the foremost examples of the relevance of social media to modern atrocity prevention & response".