Starting in August, we will release a biweekly newsletter rounding up what's happening in tech and the Internet in the Global South, why it matters, and what you or we can do about it. We'll post about opportunities to collaborate or participate in policy consultations. If you're interested, sign up to join the waitlist.
Our Response to Meta’s Updated Manipulated Media Policy
We welcome Meta’s decision to revise its Manipulated Media policy to include labeling a broader range of audio, video and image content. In addition, Meta announced that it will provide additional context through a more prominent label when digitally altered media poses a material risk of misleading the public but does not otherwise violate its policies.
In October last year, we submitted comments to the Oversight Board, indicating that the existing definitions under Meta’s Manipulated Media policy are too narrow and need to include a broader range of audio, video and image content. We further highlighted that Meta needs to develop a unified Manipulated Media policy that bases action on harm, as opposed to the technicalities of how digitally manipulated content was created. In our opinion piece for Context and comments published in the Financial Times, we emphasized that cheapfakes—content created using basic editing tools—pose significant risks to communities in the Global Majority, in addition to content created with more sophisticated or generative AI tools.
While we appreciate Meta taking a positive step towards a more unified, risk-based and broader policy on digitally altered media, these measures should be treated as a floor, not a ceiling. Digitally altered media continues to pose serious risks of exacerbating mis- and disinformation, with harmful offline consequences for vulnerable and low-literacy groups in the Global Majority.
Labeling alone is unlikely to mitigate the risks faced by marginalized communities in the Global Majority, and even prominently labeled digitally manipulated media can invite violence or stoke riots in fraught democracies. Meta is aligned with the Oversight Board’s recommendation not to remove manipulated content that does not violate other policies. Read in a broader context, however, there is substantive documentation of disparities in how existing policies are applied that lead to disproportionate harms, and these are only likely to be exacerbated by digitally altered media.
We look forward to Meta’s continued investment in detecting digitally altered media, enforcing and revising its rules, and adopting a harm-based approach.