Submission: Oversight Board on Altered Video of President Biden

Shahzeb Mahmood

Senior Researcher

Theodora Skeadas

Community Member

In October, the Oversight Board announced a case concerning an altered video of U.S. President Biden. Despite a user reporting the content as altered media, Meta decided to leave it on the platform, stating that its Manipulated Media policy applies only to content generated by artificial intelligence or to content in which a subject is shown saying words they did not say. Available news coverage also indicates that the video was altered.

The Oversight Board selected this case to assess whether Meta’s policies adequately cover altered videos that could mislead people into believing that politicians have taken actions, outside of speech, that they have not. Tech Global Institute members Theodora Skeadas and Shahzeb Mahmood responded to the Board’s call for public comments, indicating that we disagreed with Meta’s decision and recommending removal of the altered video because of its potential impact on civic participation and political processes.

Our key takeaways are highlighted below. The full comments are available here.

  • Altered media content exists along a spectrum of sophistication and significantly influences public perceptions of political figures. This spectrum ranges from advanced deepfakes and sophisticated AI-driven manipulation, which produce hyper-realistic but fabricated videos, to simpler cheapfakes. Misleading edits, such as context alterations, video splicing, audio and video modification, and image manipulation, often contribute to the creation of false narratives about such figures.
  • Meta’s existing Misinformation and Manipulated Media policies are inadequate to address altered video content featuring political figures, as these policies primarily target text- or caption-based video content created using advanced AI/ML tools, leaving out content manipulated through simpler means or conduct-based methods. This poses a significant threat to the integrity of democratic institutions and processes, underscoring the need for more comprehensive and inclusive policies to address misinformation and altered media.
  • In this case, the digitally altered content represented neither a genuine political critique nor a valid expression of dissent, but instead constituted a false portrayal of a political figure with the deliberate intent to discredit the individual, and should therefore be restricted. Such action should follow an assessment based on the three-part test outlined in Article 19 of the ICCPR. Furthermore, this content potentially amounts to manipulative interference capable of dissuading voters from making free and independent choices, and the failure to remove it contradicts Meta’s human rights obligations under Article 25 of the ICCPR. The content could also have been removed under Meta’s Bullying and Harassment policy, which disallows severe sexualized commentary and derogatory attacks.
Shahzeb Mahmood

Senior Researcher

Shahzeb Mahmood is a senior researcher at Tech Global Institute, specializing in Internet and technology laws. He previously served as legal counsel to telecom, Internet and FAANG companies.
