NEW YORK (Reuters) - Meta Platforms' Oversight Board on Wednesday told the company to keep up a Facebook post superimposing the faces of U.S. Democratic presidential candidate Kamala Harris and her running mate Tim Walz onto a parody "Dumb and Dumber" movie poster showing the characters pinching each other's nipples through their clothing.
The board said the company had acted too aggressively against an obvious political parody. The original movie poster depicts two male characters known for their gross-out, bawdy antics.
Meta originally took the post down for violating a rule against "derogatory sexualized photoshop" manipulations of images but restored it once the board informed Meta it was examining the case, the board said in a blog post.
The company told the board it "does not consider that pinching a person’s nipple through their clothing qualifies as sexual activity," the board said.
A Meta spokesperson told Reuters its initial decision was a mistake resulting from human error.
The board, which is funded by Meta but operates independently, did not comment directly on Meta's rationale but said the post in question showed a "non-sexualized derogatory depiction" of political figures and therefore was not in violation of Facebook rules.
Meta's initial removal was a worrying sign of the company's tendency toward "overenforcement" of its policies against satire and political speech, it added.
"This post is nothing more than a commonplace satirical image of prominent politicians and is instantly recognizable as such," the board wrote.
It said that the company's failure to handle the post appropriately on its first try "raises serious concerns about the systems and resources Meta has in place to effectively make content determinations in such electoral contexts."
The case highlights the fine line the world's biggest social media platform must walk in moderating posts pertaining to the highly charged U.S. political environment. Conservatives frequently complain that the platform removes too much of their content, while progressives generally say it does too little to police misinformation and abuse across its properties, which include Facebook, Instagram, WhatsApp and Threads.
Meta relies heavily on automated, AI-powered enforcement of its rules. This keeps down the cost of moderating the posts of its more than 3 billion users, but elicits gripes from users who say the systems often fail to recognize parody or account for context.
Beyond its automated systems, Meta's content moderation is generally performed by workers at third-party contractors around the world.
(Reporting by Katie Paul in New York; Editing by David Gregorio and Mark Porter)