Oversight Board: Meta Inappropriately Removed Content On Gaza War

By Arutz-7
Posted on 12/20/23 | News Source: Arutz-7

Automated content moderation tools deployed amid “a surge in violent and graphic content” on Facebook and Instagram after Oct. 7 went too far in removing posts that social media users should have been able to see, an independent oversight panel at the platforms’ parent company, Meta, ruled on Tuesday.

The finding came in a review of two cases in which human moderators had restored content that automated moderation had removed. One concerned a Facebook video appearing to show a Hamas terrorist kidnapping a woman on Oct. 7; the other, an Instagram video appearing to show the aftermath of a strike near Al-Shifa hospital in Gaza.

The cases were the first taken up by Meta’s Oversight Board under a new expedited process meant to allow for speedier responses to pressing issues.

In both cases, the posts were removed because Meta had lowered the threshold at which its automated systems would flag content related to Israel and Gaza as violating the company’s policies on violent and graphic content, hate speech, violence and incitement, and bullying and harassment.

“This meant that Meta used its automated tools more aggressively to remove content that might violate its policies,” the board said in its decision. “While this reduced the likelihood that Meta would fail to remove violating content that might otherwise evade detection or where capacity for human review was limited, it also increased the likelihood of Meta mistakenly removing non-violating content related to the conflict.”

As of last week, the board wrote, the company had still not restored the “confidence thresholds” to their pre-Oct. 7 levels, meaning the risk of inappropriate content removal remains higher than before the attack.