Meta has frequently been condemned for restricting Palestinian voices and narratives across its platforms.
Instagram has disabled the account Paldream48, which was known for documenting solidarity with the Palestinian cause at the FIFA World Cup.
The campaign aims to raise awareness and gain global support and solidarity for Palestine throughout the World Cup and has been the driving force behind Palestinian visibility in the tournament.
Now, activists are calling on people to write to Meta objecting to its decision and to raise awareness about the censorship.
They also labelled the move as a “crime” and said that it serves as a reminder of the bias many major organisations around the world have against the Palestinian cause and people.
“‘Palestinian Dream’ campaign condemns the arbitrary closure of its account on Instagram without prior warning. The Palestinian dream will remain present in everyone’s souls and Palestine will remain present in the heart of the World Cup. Post this poster with a mention to the attached account @paldream48 and follow our new account with a mention to it #الحلم_بيموتش (the dream doesn’t die).”
Even though the FIFA World Cup in Qatar has not yet ended, Palestine has already been crowned the early victor.
One might think Palestine was one of the 32 nations whose teams competed in this World Cup, given the abundance of Palestinian flags, armbands and bracelets, and the "free Palestine" slogans heard in stadiums, fan zones, on the streets, and on social media. In fact, various Latin American media outlets have referred to it as the tournament's "33rd country".
The World Cup this year is being staged for the first time ever in an Arab nation. As a result, it has been more accessible to individuals from the region than any prior World Cup from a geographical, logistical, and cultural standpoint.
Meta’s repression of Palestinian voices
According to a report commissioned by Meta, the parent company of Facebook and Instagram, the platforms' speech enforcement violated the fundamental human rights of Palestinian users during the frequent Israeli assaults on the Gaza Strip in May 2021.
The research, commissioned by Meta last year and produced by the independent consultancy Business for Social Responsibility, or BSR, focused on the company's censorship procedures and on claims of bias amid Israeli forces' brutal violence against Palestinians last spring.
The findings of BSR's analysis reflect long-standing allegations of unequal speech enforcement in the Palestinian struggle against the Israeli occupation: Meta removed Arabic content about the violence at a much higher rate than Hebrew-language posts. The investigation found that the discrepancy persisted in posts examined by human reviewers as well as those screened by automated software.
“Meta’s actions in May 2021 appear to have had an adverse human rights impact … on the rights of Palestinian users to freedom of expression, freedom of assembly, political participation, and non-discrimination, and therefore on the ability of Palestinians to share information and insights about their experiences as they occurred,” said the long-awaited report.
Israeli police cracked down on protesters in the West Bank after the forcible eviction of Palestinian families from the Sheikh Jarrah neighbourhood in occupied East Jerusalem. Israel also launched airstrikes against Gaza that injured thousands of Palestinians and killed 256 people, including 66 children, according to the UN.
Many Palestinians who tried to use Facebook and Instagram to document and condemn the violence discovered that their posts abruptly vanished without warning.
More than a dozen civil society and human rights organisations published an open letter in August criticising Meta for delaying the publication of the report, which the company had initially promised to release in the "first quarter" of the year.
While acknowledging that Meta has made improvements to its regulations, BSR also criticises “a lack of oversight at Meta that allowed content policy errors with significant consequences to occur”.
The same institutional issues that rights organisations, whistleblowers, and scholars have all blamed for the company’s prior humanitarian failures were cited by BSR as the reason for the starkly different treatment of Palestinian and Israeli posts.
The BSR investigation concluded that Meta, a firm with over $24 billion in cash on hand, lacks personnel with knowledge of different cultures, languages, and histories and is utilising subpar algorithmic technology to control speech globally.
The report highlighted an "Arabic hostile speech classifier" that employs machine learning to detect potential policy breaches and has no Hebrew equivalent. Not only did it subject Palestinian users to algorithmic screening that Israeli users were spared, but the Arabic system was also ineffective.
“Arabic classifiers are likely less accurate for Palestinian Arabic than other dialects, both because the dialect is less common, and because the training data — which is based on the assessments of human reviewers — likely reproduces the errors of human reviewers due to lack of linguistic and cultural competence,” said the report.
Beyond Meta’s shortcomings in categorising posts about Sheikh Jarrah, BSR also draws attention to the company’s “Dangerous Individuals and Organizations” policy, or “DOI,” which lists thousands of individuals and organisations that its billion-plus users are not allowed to “praise,” “support,” or “represent.”
The full list, which The Intercept obtained and published earlier this year, revealed that the policy places a heavy emphasis on Muslim and Middle Eastern organisations, which opponents characterised as a formula for obvious racial and religious bias.
Legal experts disagree with Meta’s interpretation of federal anti-terrorism laws, despite the company’s assertion that it is required by law to suppress mention of organisations that have been sanctioned by the United States government.