Content Governance

We work at the intersection of freedom of expression and online safety, safeguarding vulnerable groups in low- and middle-income regions.

Featured

Reimagining ‘Tech Accountability’ in the Global Majority

Submission: Oversight Board on Altered Video of President Biden

Marginalized and underrepresented communities often face disproportionate harm on internet platforms, including online harassment, hate speech, and targeted misinformation. Content moderation policies can unintentionally amplify these challenges by failing to adequately address systemic biases, cultural nuances, and contextual understanding. Given the enormous volume of content on social media platforms, reviewing each piece manually is practically impossible. Platforms therefore over-rely on automation and algorithms, which struggle with nuance, leading to both over- and under-enforcement. Further, the policies themselves favor dominant voices and fail to address region- or culture-specific challenges. A diverse regulatory landscape, with approaches that vary across jurisdictions, adds to the complexity. There is an urgent need to balance mitigating online harms with safeguarding freedom of expression, which will require widespread and meaningful collaboration, particularly by putting underrepresented voices in the driver's seat.