In February 2023, Meta posed a question to its board regarding the ongoing removal of content that uses the Arabic term ‘shaheed’ (commonly translated as ‘martyr’) to describe individuals identified under its Dangerous Organizations and Individuals policy.
The US-based company has faced significant criticism from human rights organizations, particularly following the conflict in Gaza. Activists have accused Meta of stifling dissenting pro-Palestinian voices and curtailing freedom of speech.
Meta indicated that ‘shaheed’ leads to more content removals under its guidelines than any other word or phrase on its platforms.
The advisory panel’s research was nearing completion when Hamas attacked Israel in October 2023, prompting Israel to launch a full-scale military operation in Gaza and exacerbating tensions in the Middle East.
Due to these developments, publication of the policy advisory opinion was paused so the board could verify that its recommendations still held given how ‘shaheed’ was being used on Meta’s platforms in this new context.
The board noted that additional research confirmed its recommendations to Meta on moderating ‘shaheed,’ even amidst extreme events, to uphold human rights standards during crises.
Furthermore, Meta’s approach failed to account for the diverse meanings of ‘shaheed,’ many of which carry no connotation of glorification or approval, resulting in the removal of posts by Arabic and other language speakers (including many Muslims) that did not violate the Dangerous Organizations and Individuals policy.
The independent Meta Oversight Board emphasized that while “shaheed” is sometimes used by extremist organizations to glorify people who die while committing violent and heinous acts of terrorism, Meta’s response should prioritize respect for all human rights, including freedom of expression.
The board urged the social media company to lift its blanket ban on using the term to describe designated dangerous individuals and to adopt a more nuanced approach to content analysis that considers context when moderating posts containing the word.