For years, activists in Southeast Asia warned Facebook that content on the platform could lead to real-life violence. Then it did.

A hand holds a mobile phone with a Facebook logo and the message "this site cannot be reached."

Last month, when violent mobs attacked members of a Muslim minority group in Sri Lanka, killing at least one person, the government responded by shutting down access to social media and messaging networks.

The government's contention at the time was a now-familiar criticism: inflammatory hate speech and propaganda spread unchecked through services like Facebook and fed violence.

But as Mark Zuckerberg defends his social media network, activists in Sri Lanka say they've been warning about this for years.

“We first tracked it as a problem in 2013,” said Sanjana Hattotuwa, a researcher with the Center for Policy Alternatives (CPA), an advocacy group based in Colombo, Sri Lanka.

That’s when the organization began monitoring and flagging what Hattotuwa calls “dangerous content: content that incites violence and … hate against specific communities, identity groups, [and] genders.”

In recent weeks, Facebook CEO Mark Zuckerberg has admitted his platform hasn’t done enough.

"For most of our existence, we focused on all the good that connecting people can bring,” Zuckerberg said during his company’s quarterly earnings call on Wednesday. “But it’s clear now we didn’t do enough to prevent these tools from being used for harm as well.”

Zuckerberg delivered the same line when he appeared before Congress earlier this month, following revelations that political consulting firm Cambridge Analytica harvested the personal data of as many as 87 million users. During that same hearing, the CEO also apologized for being “too slow to spot and respond to Russian interference” during the 2016 US presidential election.

As the company faces intense scrutiny in the US, it is also confronting mounting questions about how its platforms are being used to “cause harm” to users abroad, particularly in developing countries.

Facebook’s operations in Sri Lanka have become a focal point in recent days, following a New York Times report detailing how Facebook and WhatsApp, a Facebook-owned messaging platform, were used to spread rumors that escalated into real-world violence against the country’s minority Muslim population.

The CPA published its first report on how Facebook was being used to spread hate speech in Sri Lanka in 2014.

Calls for Facebook to promptly remove content that violated its own community standards and to take steps to address the issue went unanswered for years, Hattotuwa said.

“We had made it explicitly clear to Facebook that this [is] a problem. All our pleas, our requests … had been met with silence,” Hattotuwa said.

Facebook broke its silence earlier this month when, in the midst of the Cambridge Analytica scandal, it responded to an open letter signed by 13 organizations in Sri Lanka.

In its response, Facebook pledged to increase the number of Sinhala-speaking content reviewers and to continue to work with civil society groups on the ground.

Facebook appears to employ few Sinhala-speaking moderators, though it has not provided a specific number, according to the New York Times. The company also has no office or staff in Sri Lanka, a gap Hattotuwa says he would like to see addressed.

He also wants to see Facebook make a commitment to respond to content that’s been flagged by users within 24 hours, a step the company has pledged to take in Myanmar, a country plagued by similar problems.

“What we really want is something that is fast, efficient and effective, [and] that Facebook doesn’t do two things it has historically done,” Hattotuwa said. “One, it takes days for a response to be generated. And No. 2, it allows that content which was reported to stay on the platform.”

Despite the chaos the platform has — at least in part — caused in his country, Hattotuwa said he doesn’t believe Facebook is “all evil.”

“Facebook has been integral in strengthening democracy, in strengthening dissent, in creating the opportunity for critical conversations under authoritarian rule to take place,” Hattotuwa said. “But it seems to be the case that actors are using it more and more for the promoting of the worst of society and the worst of what we can be. Can Facebook help us become our better angels? That’s what we want to see.”
