In recent years, social media platforms have become the epicenter of global communication, enabling users to share their thoughts, experiences, and perspectives. With this power comes responsibility, however, and a recent report by Business for Social Responsibility (BSR) on Meta, the parent company of Facebook, Instagram, and WhatsApp, sheds light on concerns surrounding content moderation and freedom of expression, particularly in the context of Palestinian social media posts.
On September 22nd, BSR released a comprehensive report examining how Meta handled Palestinian social media content during Israel's military actions in Gaza and at the Al-Aqsa Mosque in May 2021. The report's findings underscore the challenges and implications of content moderation practices, raising important questions about the role of social media giants in shaping narratives and limiting voices.
The Unjust Removal of Palestinian Content: The central focus of the BSR report is Meta’s alleged unjust removal of Palestinian social media posts. During the May 2021 escalation of violence, Palestinian social media users reported cases of censorship, removal, and suppression of their content, specifically content critical of Israeli actions. This raised concerns about the fairness of content moderation policies and their impact on freedom of expression, especially for marginalized communities.
Content Moderation Challenges: The report highlights the stringent content moderation policies employed by Facebook and Instagram, platforms under Meta’s umbrella. Palestinian users claim that these policies disproportionately censor critics of Israeli repression, hindering their ability to express their perspectives on the ongoing conflict. Digital rights groups documented numerous instances where Palestinian content was either removed, hidden, or suppressed, adversely affecting the documentation of Israeli human rights abuses.
One key finding of the report is Meta's over-enforcement against content in Palestinian Arabic. The report identifies three likely causes for this over-enforcement: the inclusion of major Palestinian political factions, such as Hamas, on Facebook's blacklist; the lack of review by speakers of the Palestinian dialect of Arabic; and shortcomings in the linguistic and cultural competence of the human reviewers who train Meta's Arabic speech algorithm.
Contrasting Approaches: The report draws attention to the disparity in Meta's treatment of Palestinian content compared to Hebrew-language posts. It suggests that Meta's deployment of an algorithmic "hostile speech classifier" for Arabic, but not for Hebrew, contributes to this discrepancy. The findings raise questions about the impartiality and neutrality of content moderation algorithms and their uneven impact on different language communities.
Digital Rights Advocacy and Validation: Digital rights advocates welcomed the BSR report as it validated the experiences of Palestinians who have long contended with content removal and censorship. The report serves as a crucial documentation of lived experiences, dispelling any notion that these actions are merely the result of system glitches. Marwa Fatafta, the Middle East and North Africa policy manager for digital rights group Access Now, emphasized that the report exposes the root causes of the issue, providing a foundation for advocacy and change.
However, despite the positive reception, some critics argue that the report falls short in acknowledging Meta’s intentional actions. While BSR concluded that Meta did not intentionally seek to harm Palestinian users’ expression, digital rights defenders contend that the documented actions and inactions by the company indicate otherwise. This disagreement emphasizes the complexity of evaluating the intent behind content moderation decisions.
The Israeli Cyber Unit’s Role: One notable omission from the report is a discussion of the role played by the Israeli Cyber Unit in Meta’s content suppression. Established in 2015 within Israel’s Ministry of Justice, the unit issues thousands of removal requests to Facebook, claiming violations of Israeli law or the social media companies’ terms of service. This highlights the broader issue of external pressures and government requests influencing content moderation decisions on major social media platforms.
Digital rights groups have consistently urged Meta to provide transparency regarding its evaluation of requests from the Israeli Cyber Unit. Users deserve clarity on whether their content removal is a result of governmental requests, emphasizing the need for transparency in the interactions between tech companies and state entities.
Meta’s Response and Recommendations: BSR’s report concludes with 21 recommendations to Meta, addressing improvements in assessing Arabic-language content and advocating for increased transparency in content moderation policies. Meta responded by committing to implement some recommendations fully and others partially, while assessing the feasibility of six more. However, Meta explicitly rejected one recommendation suggesting that it fund research into how social media platforms interpret counter-terrorism laws.
Despite Meta’s response, advocates find it underwhelming and not sufficiently encouraging. The vague language used by Meta in outlining how it would implement the recommendations raises concerns about the company’s commitment to meaningful change. Advocates stress the need for a concrete action plan with clear timelines and full transparency to ensure accountability.
Conclusion: The BSR report on Meta’s content moderation practices offers a critical examination of the challenges faced by Palestinian users in expressing their perspectives on social media. It highlights the broader issues of biased algorithms, external pressures, and the need for transparency in content moderation decisions. As social media continues to play a pivotal role in shaping global narratives, the report serves as a call to action for tech companies to reevaluate their content moderation policies, prioritize transparency, and safeguard freedom of expression for all users.

Ansab Hussain Qureshi