Exclusive: U.N. Human Rights Experts Meet With Facebook on “Overly Broad” Definitions of Terrorist Content

United Nations Special Rapporteur Fionnuala Ní Aoláin has asked Facebook Chief Executive Mark Zuckerberg to add precision and rigor to the social network’s guidelines on terrorism-related content. In a letter to Zuckerberg and a significant meeting last week with Facebook executives, Ní Aoláin said the existing definitions risk catching others, such as legitimate opponents of oppressive authorities, in a dangerous net. The rapporteur told Just Security her office will take a similar approach to “other platforms whose practices mirror Facebook.”

Ní Aoláin, a professor at the University of Minnesota Law School and at the University of Ulster’s Transitional Justice Institute in Belfast, Northern Ireland, is the U.N. Special Rapporteur on the Promotion and Protection of Human Rights and Fundamental Freedoms While Countering Terrorism. (Full disclosure: She also serves as an executive editor of Just Security). The special rapporteur is appointed by the U.N. Human Rights Council, and reports to the council and the General Assembly on alleged violations of human rights and other freedoms in the global drive to prevent terrorism. The rapporteur also identifies and promotes good policies and practices.

In a July 24 letter to Zuckerberg, Ní Aoláin expressed concern with “the overly broad definition of terrorism and terrorist organizations used by Facebook as well as the seeming lack of a human rights approach to content-moderation policies.” She cited Facebook’s community standards and an April 23 blog post by two top company officials on how the network uses technology and its “counterterrorism team” to find and remove terrorist propaganda.

Ní Aoláin takes particular issue with Facebook’s definition of terrorist organizations. According to the community standards, Facebook bars individuals and organizations involved in terrorist activity, organized hate, mass or serial murder, human trafficking, organized violence, or criminal activity. It also removes content that “expresses support or praise for groups, leaders, or individuals” involved in those activities.

The rapporteur specifically cited Facebook’s definition of a terrorist organization as “any non-governmental organization that engages in premeditated acts of violence against persons or property to intimidate a civilian population, government, or international organization in order to achieve a political, religious, or ideological aim.”

“The use of such a sweeping definition is particularly worrying in light of a number of governments seeking to stigmatize diverse forms of dissent and opposition (whether peaceful or violent) as terrorism,” Ní Aoláin wrote in her letter to Zuckerberg.

Several recent instances illustrate the problem. In one example, The Guardian reported in May 2017 that a Facebook manual listing terrorist leaders and organizations included the Free Syrian Army, the Western-backed opposition force to Bashar al-Assad’s regime in Syria. Moderators were instructed to take down supportive references and symbols of the group.

Similarly, The Daily Beast reported in September that activists in Myanmar had their posts blocked and accounts locked for documenting the ongoing persecution of the Rohingya minority. Some of the removed posts had Facebook messages generally citing its “community standards.” In developing countries like Myanmar, where Facebook is a vital communication tool, losing one’s account can be devastating. An Amnesty International researcher told the news organization that the takedowns appeared to be part of a concerted effort by opponents of the Rohingya to report posts to Facebook to get them removed. Facebook has recently introduced an “appeal” feature, but it is limited to certain contexts and does not appear to provide any transparency into the decision-making process.

George Washington a Terrorist?

Facebook’s broad definition of terrorism does not comport with common or expert understanding of the term. Under Facebook’s definition, the Continental Congress and Washington’s Army might have been censored as terrorist organizations during the American Revolution, just as today’s authoritarian leaders seek to brand opponents of their regimes as “terrorists.”

Ní Aoláin’s letter highlights the dangers that arise as social media networks start to take more responsibility for the content on their sites. While Facebook says it draws on academic literature and various experts and feedback, it may not always have – or draw on – expertise in complex social, political, or national security phenomena.

Facebook spokeswoman Ruchika Budhraja told Just Security after the meeting with Ní Aoláin that the company wants the public to “better understand our thinking and the frameworks we use to make decisions” about content and users on the site.

“By transparently sharing our definition of terrorism, our goal is to create opportunities for increased dialogue with important stakeholders, as was the case here,” Budhraja said in an email. “We welcome this dialogue and hope to continue our conversations with the Special Rapporteur and others who are thinking deeply and working tirelessly on these issues.”

The role of companies like Facebook in setting their own community standards and enforcing them for billions of users worldwide also illustrates the increasing influence wielded by the private sector in setting social and security norms.

“Companies like Facebook are increasingly engaged in forms of regulation traditionally ascribed to States,” Ní Aoláin writes in her letter. She cites the U.N. Guiding Principles on Business and Human Rights (UNGPs) as “an authoritative global standard for preventing and addressing adverse human rights impacts linked to business activity.” These principles encourage businesses to limit their impact on human rights like free speech, and to develop effective, user-friendly, and transparent methods to mitigate those impacts. While they are not legally binding, she writes, they represent “an important step towards matching the impact of businesses on human rights with corresponding levels of corporate responsibility.”

Instead of charting its own course, Ní Aoláin encourages Facebook to align its practices with the framework articulated by the U.N. Human Rights Council. Definitions should be “compatible with standards set by international law, including international human rights law and international humanitarian law,” she writes.

Precision in Definitions

Ní Aoláin suggests Facebook look to the more specific definition developed by her predecessor, Martin Scheinin, that reflects best practices in the field. It defines terrorism as an action or attempted action where:

1. The action:
(a) Constituted the intentional taking of hostages; or
(b) Is intended to cause death or serious bodily injury to one or more members of the general population or segments of it; or
(c) Involved lethal or serious physical violence against one or more members of the general population or segments of it; and

2. The action is done or attempted with the intention of:
(a) Provoking a state of terror in the general public or a segment of it; or
(b) Compelling a Government or international organization to do or abstain from doing something; and

3. The action corresponds to:
(a) The definition of a serious offence in national law, enacted for the purpose of complying with international conventions and protocols relating to terrorism or with resolutions of the Security Council relating to terrorism; or
(b) All elements of a serious crime defined by national law.

The rapporteur further defines terrorist incitement as “an offence to intentionally and unlawfully distribute or otherwise make available a message to the public with the intent to incite the commission of a terrorist offence, where such conduct, whether or not expressly advocating terrorist offences, causes a danger that one or more such offences may be committed.”

Most notably, the rapporteur’s definition of terrorism is limited to violence against the general population, instead of including military targets. Under Facebook’s definition, a non-state organization engaged in a non-international armed conflict with a state would be deemed a “terrorist organization,” even if the group complied with international humanitarian law. It is unclear where the Facebook definition originated, whether experts were consulted, or if it is grounded in any legal principles, Ní Aoláin wrote.

Ní Aoláin notes that other offenses covered by Facebook’s broad definition but not fitting the special rapporteur’s precise definition still “may amount to advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence.” Such posts may well be appropriately censored, but an over-expansive definition of terrorism is not needed to do so.

Beyond the broad standards articulated in the community standards, Facebook provides little insight into how it sets, evaluates, and enforces its rules, the rapporteur wrote.

“Detailed information on such procedures and the criteria that determine which incidents will be dealt with by AI, human moderators, or both is … not publicly available,” Ní Aoláin writes. As a result, it is difficult to predict what will and will not be removed. This compounds the broad definition’s chilling effect, as posters cannot predict in what manner their posts will be reviewed.

The Need for International Engagement

In an April 24 blog post publishing internal enforcement guidelines and expanding Facebook’s appeals process, the company announced it would conduct a listening tour across the globe to gather feedback on the community standards and current enforcement practice. The writer of the post, Vice President of Global Policy Management Monika Bickert, noted that she’d previously worked as a criminal prosecutor on issues ranging “from child safety to counter terrorism.”

There are some signs that Facebook is incorporating lessons from the international community. On Aug. 9, Facebook published a blog post discussing how it regulates speech generally. In that post, a senior Facebook executive wrote that the company “look[s] for guidance in documents like Article 19 of the International Covenant on Civil and Political Rights.”

Ní Aoláin said in an email to Just Security that, to her knowledge, this is Facebook’s first reference to Article 19 as a guiding document in its policy development. It’s still unclear whether Facebook will publicly incorporate other international documents and principles, including the special rapporteur’s definition of terrorism.

The Facebook post also says, “We moderate content shared by billions of people and we do so in a way that gives free expression maximum possible range,” and that “the core concept here is whether a particular restriction of speech is necessary to prevent harm.”

In an email after her meeting with Facebook executives last week, Ní Aoláin called the discussion “very productive and open,” and said it would be the first of a series of meetings on issues identified in her letter. U.N. Special Rapporteur on Freedom of Expression David Kaye also is participating in the meetings.

“Facebook has indicated a willingness to discuss these issues in a productive way, engaging with the human rights and humanitarian law compliance concerns raised by the mandate,” Ní Aoláin said. “The mandate [of her U.N. office] will also be reaching out to other platforms whose practices mirror Facebook and who have not been as transparent and open about their working methods and criteria.”

In an email to Just Security following the meeting with Facebook, Kaye stated, “This is not a Facebook only issue – it’s something that all social and search platforms need to address.”

In a recent report to the United Nations on the subject, Kaye raised concerns that companies trying to regulate their users’ content, including Twitter and Facebook, have adopted “excessively vague” definitions of terrorism and dangerous organizations.

Ní Aoláin’s letter “highlights how the Facebook definition diverges from international practice,” Kaye told Just Security. He explained his concerns about the effects of that divergence.

“It undermines the ability of Facebook to push back against the efforts of governments to impose ever broader definitions of terrorism,” he said, “when the standard is not tied closely to standards in human rights law and other international instruments.”

Like Ní Aoláin, Kaye also told Just Security that Facebook’s representatives were “eager to engage” the U.N. rapporteurs on these issues.

The challenge Facebook faces in taking responsibility for its content is no simple task, and its efforts so far are to be encouraged. But there is no need to reinvent the wheel. By developing definitions and practices in consultation with relevant international bodies that have already wrestled these issues to the ground, Facebook could gain not only greater rigor in making its crucial judgments, but also important international legitimacy as it grapples with some of the world’s thorniest societal quandaries.

Just Security Washington Editor Viola Gienger and Co-Editor-in-Chief Ryan Goodman contributed to this report.

Photo: Mark Zuckerberg, CEO of Facebook Inc., addresses the interactive dialogue of the Summit for the adoption of the post-2015 development agenda, at U.N. Headquarters, Sept. 26, 2015.

About the Author(s)

Isa Qasim

Legal Researcher at Just Security. He is a law student at Yale Law School and former Investment Associate at Bridgewater Associates.