Earlier this month, an Egyptian software engineer at Facebook wrote an open note to colleagues with a warning: Facebook is losing the trust of Arab users.
He said Facebook had helped activists communicate during 2011, but that censorship during the Israeli-Palestinian conflict had made Arab and Muslim users suspicious of the platform.
As evidence, the engineer included a screenshot of the page of Gaza Now, a news outlet with nearly 4 million followers.
When a user likes the Gaza Now page, Facebook now displays a message that says: You may want to check with the news outlet to see what types of content it usually shares.
The engineer wrote: I did an experiment and tried to like as many Israeli news pages as possible, and not once did I receive a similar message.
He concluded that the company’s enforcement was biased against Arabic content. The post sparked a series of comments from colleagues.
One of them asked why an Instagram post from actor Mark Ruffalo about Palestinian displacement received a warning about sensitive content.
Another said that advertisements from Muslim organizations raising money during Ramadan had been suspended by the company’s automated systems.
Another Facebook employee wrote about the mistrust of Arab and Muslim users.
“The next mistake could be the straw that breaks the camel’s back and we could see our communities migrate to other platforms,” he said.
A large segment of the staff is debating whether the world’s largest social network exhibits anti-Muslim and anti-Arab bias.
Some are concerned that Facebook is selectively enforcing its moderation policies on content related to the conflict.
Others believe the company moderates too heavily, and fear it may favor one side or the other.
One thing they have in common is the belief that Facebook is once again botching enforcement decisions around a politically charged event.
One employee wrote: We are trying to get the company as a whole to acknowledge the real grievances of the Arab and Muslim communities and make a real effort to address them, rather than offering empty platitudes.
The situation inside Facebook:
The situation has become so tense within the company that a group of about 30 employees came together earlier this month.
The group has filed internal appeals to restore content they believe was improperly blocked or removed.
The group wrote: This content is very important, because people all over the world depend on us to be their eyes on what is going on around them.
The perception of prejudice against Arabs and Muslims is hurting the company’s brands as well.
Facebook’s apps were recently hit by a campaign of negative reviews in the Google and Apple app stores.
The campaign was driven by a decline in user trust amid the recent escalation between Israel and Palestine.
Some Facebook employees contacted both Apple and Google to try to get the negative reviews removed.
One person wrote in response to the post: This is the result of years and years of policy enforcement in which large parts of some populations are considered terrorists by our internal definitions. The corollary is that our manual and automated enforcement systems are biased.
A Facebook spokesperson acknowledged that the company made mistakes. He noted that the company has a team on the ground with Arabic and Hebrew speakers to monitor the situation.
“We know there were several issues that affected people’s ability to share through our apps, and we’ve fixed them. They should never have happened,” he said in a statement.
He added: We are sorry to anyone who felt they could not draw attention to important events, or who felt this was a deliberate suppression of their voices. That was never our intention, and we never want to silence a particular community or point of view.
Social media companies, including Facebook, have long cited their role in 2011 as evidence that their platforms democratize information.
Mai al-Mahdi, a former Facebook employee who worked in content moderation and crisis management from 2012 to 2017, said the social network’s role in revolutionary movements was the main reason she joined the company.
But the time that Mahdi spent in the company changed her views.
While she oversaw the training of content moderators for the Middle East from her post in Dublin, the company was criticized for being US-centric and for failing to recruit enough people with moderation experience in the region.
Facebook’s haphazard approach:
Jillian York, director of international freedom of expression at the Electronic Frontier Foundation, has studied content moderation on the world’s largest social network.
She said the company’s approach to enforcement around Palestinian-related content has always been haphazard, which has led users to promote the hashtag #FBCensorsPalestine.
The people who handle government relations on the public policy team also influence Facebook’s rules and what is or is not allowed across the platform.
This creates a conflict of interest, as lobbyists responsible for keeping governments happy can weigh in on how content is moderated.
Facebook hired Jordana Cutler, a former adviser to Israeli Prime Minister Benjamin Netanyahu, to oversee public policy in a country of nearly 9 million people.
As head of public policy for the Middle East and North Africa, Ashraf Zeitoun was responsible for the interests of more than 220 million people in 25 Arab countries and regions, including the Palestinian territories.
Facebook employees have raised concerns about Cutler’s role and whose interests she prioritizes.
“You’re an employee of Facebook, not an employee of the Israeli government,” said Zeitoun, who recalls arguing with Cutler over whether the West Bank should be considered occupied territory under Facebook’s rules.
The United Nations defines the West Bank and Gaza Strip as being occupied by Israel.
Zeitoun added that Facebook’s allocation of resources to Israel has shifted its internal policymaking.
Israeli members of the public policy team often press for content removals and policy decisions without any real counterpart directly representing Palestinian interests.
“The public policy team helps make sure that governments and civil society understand Facebook’s policies, and that we understand the countries in which we operate,” a company spokesperson said.
He noted that the company now has a member of the policy team focused on Palestine and Jordan.
Facebook and Israel:
Facebook’s rules provide special protections for referring to Jews and other religious groups, allowing the company to remove hate speech targeting people because of their religion.
Members of the policy team pushed for the word “Zionist” to be treated as equivalent to “Jewish”, and enforcement guidelines gave special protection to the term “settler”.
Facebook’s internal rules created an environment that could stifle discussion and criticism of the Israeli settler movement.
While Facebook users around the world complained about the blocking or removal of Palestinian content, the company’s growth team submitted a document on May 17 evaluating how the conflict in Gaza affected user sentiment.
The team found that Israel was the top country in the world for reporting content, with nearly 155,000 complaints within a week.
It ranked third in reports of violence and hate violations under Facebook’s policies, surpassing more populous countries such as the United States, India and Brazil.
One of the Facebook employees questioned whether the requests from Israel had any effect on the company’s censorship of Arabic and Islamic content.
Israel has just over twice as many Facebook users as the Palestinian Territories,
but its users reported 10 times as much content and filed more than eight times as many complaints of hate violations as Palestinian users.
Activists have questioned for years whether pressure from the Israeli government affected decision-making about content on Facebook.
The Arab Center for Social Media Development tracked 500 removals of content across major social platforms during the conflict.
The data suggest that the efforts of the Internet Unit of the Israeli Ministry of Justice are also behind many of these reported violations.
People’s Fears of Censorship:
As external pressure mounted, the informal team of about 30 employees within Facebook filed internal complaints.
They submitted more than 80 appeals about the removal of content on the Israeli-Palestinian conflict.
They found that the vast majority of decision reversals were due to false positives from automated systems.
The engineer wrote: “This created further distrust of our platform and reaffirmed people’s fears of censorship.”
Facebook executives appear satisfied with the company’s handling of Arabic and Islamic content during the escalating tension in the Middle East.
“Facebook has not identified any ongoing systemic issues,” said James Mitchell, who oversees content moderation.
He also noted that the company uses precise terminology to flag potential hate speech, allowing such content to be removed automatically.
He said his team is committed to conducting a review of what the company can do better in the future. But he admitted to only one mistake: the blocking of content that included the phrase “Al-Aqsa”.
Internal documents show that over five days, Facebook’s automated systems deleted about 470 posts referring to Al-Aqsa.
Facebook attributed the removals to terrorism and hate speech. Some employees were unhappy with Mitchell’s comments.
One wrote: We have precise terminology, yet we told nearly two billion Muslims that we confused their third-holiest site with a dangerous organization.
“We sent a message to a large group of our audience that we don’t care enough to learn the things that are so basic and important to them,” he added, saying it helped reinforce the stereotype that Muslims are terrorists and the idea that freedom of expression is restricted for some people.