(JTA) – The independent panel that rules on disputes at the world’s largest social media company is taking up two challenged posts about the ongoing Israel-Hamas war, in cases with potential ramifications for how users will be able to talk about the war online.
The cases are the first to be taken up under a new expedited process that the oversight board at Meta, which owns Facebook and Instagram, adopted earlier this year.
The two appeals relate to posts about the war from both sides of the conflict. One concerns a Facebook video appearing to show a Hamas militant kidnapping a woman during the terror group’s Oct. 7 attacks in Israel; the other concerns an Instagram video appearing to show the aftermath of a strike outside Al-Shifa Hospital in the Gaza Strip.
Meta had initially removed both of the videos, claiming they violated the company’s terms of service, which specifically prohibit sharing violent content. But the company has since changed its mind, restoring both with tags warning of graphic content. The oversight board said it would issue a decision on the matter within 30 days, and the company is bound to follow the board’s decisions.
Social media has become the primary way most of the world has engaged with the war, as more and more images and videos purporting to be from Israel and Gaza circulate online, joining a stew of content that includes a large amount of misinformation and doctored or mislabeled images. Meta and other social media companies, including X (formerly Twitter) and TikTok, have struggled to balance freedom of expression on their platforms with curbing violent imagery and the spread of terror propaganda.
In the case of the disputed posts, Meta initially claimed both violated its rules on sharing violent and graphic content. The company has also designated Oct. 7 as a terrorist attack, subjecting it to rules that bar any content showing “identifiable victims” of such an attack from its platforms. It has updated its own rules frequently since Oct. 7, most recently determining that hostage footage shown “in order to raise awareness and condemn the attacks” is permissible.
“Meta’s goal is to allow people to express themselves while still removing harmful content,” the company wrote in an update to its policies Tuesday. “If the user’s intent in sharing the content is unclear, we err on the side of safety and remove it.”
The disputed Oct. 7 Facebook video was posted by a user who appeared to be condemning Hamas and “urge[d] people to watch the video to gain a ‘deeper understanding’ of the horror that Israel woke up to on October 7, 2023,” according to the oversight board’s description of the post. The board did not share the post itself, but the description suggests that the video showed Noa Argamani, who became an early symbol of the hostage crisis after being abducted with her boyfriend from the Nova music festival. She remains a hostage in Gaza.
The video purporting to show the hospital bombing, meanwhile, was posted by a user who referred to the Israeli army as the “usurping occupation” and tagged various human rights organizations, the board said. The Al-Shifa hospital has become an epicenter of both Israel’s military operation and the larger information war, as Israel targeted the hospital while claiming that Hamas was using it as a command center — a claim that Israel later backed up by taking media outlets on a tour of a tunnel network connected to the hospital.
Even though both posts were restored, the oversight board’s rulings on them could affect how Meta moderates content about the war, and how permissive the company will be about images depicting the war’s victims. Meta’s oversight board has taken up other Jewish issues in the past, including the company’s failure to remove a Holocaust-denying post and its decision to remove a post by a journalist criticizing Kanye West’s praise of Hitler.
“Crisis situations are not an excuse for social media platforms to suspend rules or default to censorship, they’re a reminder to double down on efforts to protect voice and safety,” Thomas Hughes, director of the Oversight Board Administration, said in a statement. “The Israel-Hamas conflict underscores the many challenges to content moderation during crisis situations. The Board looks forward to reviewing how Meta is following through on its human rights commitments, as well as past recommendations from the Board on how to manage crises.”
Jewish former Meta executive Sheryl Sandberg has also become ensnared in an information war related to the conflict, as her claim that Hamas raped female Israeli victims of Oct. 7 has been disputed on the very platforms she used to oversee.
Both TikTok and X have faced intense criticism for allowing antisemitic content to spread on their platforms, and — in X’s case — for owner Elon Musk’s own engagement with antisemitic content.