The Unfair Consequences of Global South Content Moderation

Category Computer Science

tldr #

Cornell University research has found that content moderation systems based on Western norms disproportionately punish users in the Global South. These systems often misinterpret context, flag swear words in some languages while permitting the same words in English, and remove content that is acceptable in users' own cultures. This leads to real-life consequences for those affected, such as lost photos, messages, and business income, and a feeling of being harassed. A different kind of content moderation is needed to avoid such unfair penalties.

content #

Social media companies need content moderation systems to keep users safe and prevent the spread of misinformation, but these systems are often based on Western norms and unfairly penalize users in the Global South, according to new research from Cornell University.

Farhana Shahid, lead researcher and a doctoral student in information science, interviewed people from Bangladesh who had received penalties for violating Facebook's community standards. Users said the content moderation system misinterpreted their posts, removed content that was acceptable in their culture, and operated in ways they felt were unfair, opaque and arbitrary.

Facebook is the most highly used social media platform in Bangladesh

"Pick any social media platform and their biggest market will be somewhere in the East," said co-author Aditya Vashistha, assistant professor of information science. "Facebook is profiting immensely from the labor of these users and the content and data they are generating. This is very exploitative in nature, when they are not designing for the users, and at the same time, they're penalizing them and not giving them any explanations of why they are penalized."

Content moderation policies are more often based on Western values than on those of the rest of the world

Shahid will present their work in April at the Association for Computing Machinery (ACM) CHI Conference on Human Factors in Computing Systems.

Even though Bengali is the sixth most common language worldwide, Shahid and Vashistha found that content moderation algorithms performed poorly on Bengali posts. The moderation system flagged certain swear words in Bengali, while the same words were allowed in English. The system also repeatedly missed important context. When one student joked, "Who is willing to burn effigies of the semester?" after final exams, his post was removed on the grounds that it might incite violence.

Content moderators are often employed locally to remove problematic content

Another common complaint was the removal of posts that were acceptable in the local community but violated Western values. When a grandmother affectionately called a child with dark skin a "black diamond," the post was flagged for racism, even though Bangladeshis do not share the American concept of race. In another instance, Facebook deleted a 90,000-member group that provides support during medical emergencies because it shared personal information: phone numbers and blood types in emergency blood donation requests posted by group members.

Facebook profits immensely from the content and data that users generate

The restrictions imposed by Facebook had real-life consequences. Several users were barred from their accounts—sometimes permanently—resulting in lost photos, messages and online connections. People who relied on Facebook to run their businesses lost income during the restrictions, and some activists were silenced when opponents maliciously and incorrectly reported their posts.

Participants reported feeling "harassed," and frequently did not know which post violated the community guidelines, or why it was offensive. Facebook does employ some local human moderators to remove problematic content, but the arbitrary flagging led many users to assume that moderation was entirely automatic. Several users were embarrassed by the public punishment and angry that they could not appeal, or that their appeal was ignored.

Penalties of violating Facebook community standards can lead to real-life consequences for the user

"Obviously, moderation is needed, given the amount of bad content out there, but the effect isn't equally distributed for all users," Shahid said. "We envision a different type of content moderation system that doesn't penalize people, and maybe a system that recognizes users from different areas, and modifies the set of rules based on who they are and where they're coming from."
