
Why Facebook, Instagram may have a ‘problem’ over this Arabic word


Meta, the parent company of Facebook and Instagram, is facing criticism from its own Oversight Board over its moderation of the Arabic word “shaheed.” The term, often translated as “martyr,” has been flagged and removed more frequently than any other word or phrase across Meta’s platforms, according to a report by Engadget.

The issue lies in Meta’s current approach, which treats “shaheed” solely as a reference to violence or praise of extremism. However, the Oversight Board argues that “shaheed” carries multiple meanings and is often used neutrally in reporting, academic discussions, and human rights contexts. This “blanket ban” on the word in association with “dangerous individuals” identified by Meta significantly impacts Arabic-speaking users and stifles legitimate discourse, as per the report.

The Oversight Board recommends Meta move away from the automatic removal of content containing “shaheed” and instead focus on identifying clear signals of violence or violations of other established policies. Additionally, the board urges Meta to improve transparency regarding its use of automated systems in content moderation.

This decision carries significant weight given that “shaheed” is likely the most censored term on Meta’s platforms. Oversight Board co-chair Helle Thorning-Schmidt expressed concern that Meta’s current strategy prioritises censorship over safety, potentially marginalising entire user groups while failing to achieve its intended goals. The policy could also restrict media and public discourse by discouraging reporting on sensitive topics. “The Board is especially concerned that Meta’s approach impacts journalism and civic discourse because media organizations and commentators might shy away from reporting on designated entities to avoid content removals,” said Thorning-Schmidt.

This is not the first time Meta has been criticised for biased moderation affecting Arabic-speaking users. A previous report found that content moderation was less accurate for Palestinian Arabic, leading to wrongful account suspensions. Meta also apologised in 2023 after automated translations inserted the word “terrorist” into Palestinian user profiles on Instagram.

The Oversight Board’s ruling highlights the slow pace of policy change within Meta. Although Meta requested the board’s input over a year ago, the decision was delayed while the board paused to assess how its guidance would apply to situations like the Gaza conflict. Meta now has two months to respond to the recommendations, and actual policy changes will likely take longer to implement.

“We prioritise user safety and strive for fair policy application,” stated a Meta spokesperson. “Global challenges exist in scaling moderation efforts. We sought the board’s guidance on ‘shaheed’ and will respond to their feedback within 60 days.”


