A viral video shows a young woman leading an exercise class at a roundabout in the Burmese capital, Naypyidaw. Behind her, a military convoy approaches a checkpoint on its way to make arrests at the Parliament building. Did she inadvertently film a coup? She keeps dancing.
The video later became a viral meme, but for the first few days, amateur sleuths online debated whether it had been staged with a green screen or otherwise manipulated, often deploying auditing and forensics jargon.
For many viewers online, the video captured the absurdity of 2021. Yet claims of audiovisual manipulation are increasingly being used to make people wonder whether what is real is fake.
At Witness, in addition to our ongoing work helping people film the reality of human rights violations, we led a global effort to better prepare for increasingly sophisticated audiovisual manipulation, including so-called deepfakes. These technologies provide tools to make it look as if someone said or did something they never did, to create an event or person that never existed, or to edit a video far more seamlessly.
The hype, however, misses the mark. The political and electoral threat of deepfakes lends itself well to headlines, but the reality is more nuanced. The real grounds for concern became clear during the expert meetings Witness convened in Brazil, South Africa, and Malaysia, as well as in the US and Europe, with people who had experienced harm to their reputations and credibility, and with professionals such as journalists and fact-checkers tasked with fighting lies. They highlighted the current harms of non-consensual manipulated sexual images targeting ordinary women, journalists, and politicians. This is a real, existing, and widespread problem, and recent reports have confirmed its growing scale.
Their testimony also showed how claims of deepfakery and video manipulation were increasingly being used for what law professors Danielle Citron and Bobby Chesney call the “liar’s dividend”: the ability of the powerful to claim plausible deniability over incriminating footage. Statements like “This is a fake” or “It has been manipulated” have often been used to dismiss a leaked video of a compromising situation, or to attack one of the few sources of civilian power in authoritarian regimes: the credibility of smartphone footage of state violence. This builds on histories of state-sponsored deception. In Myanmar, the military and authorities themselves have repeatedly shared false images and challenged the veracity and integrity of genuine evidence of human rights violations.
During our discussions, journalists and human rights activists, including those from Myanmar, described dreading the burden of relentlessly having to prove what is real and what is fake. They feared that their work would become not just debunking rumors, but also having to prove that something is genuine. Skeptical publics and partisan factions question evidence in order to reinforce and protect their worldview, and to justify their actions and partisan reasoning. In the United States, for example, conspiracy theorists and right-wing supporters dismissed former President Donald Trump’s awkward concession speech after the attack on the US Capitol by saying “it’s a fake.”
There are no easy solutions. We need to support the building of audiovisual forensic and verification skills among the community and professional leaders around the world who can help their audiences and community members. We can promote the broad accessibility of platform tools that make it easier to view and challenge the perennial mis-contextualized or crudely edited “shallowfake” videos, which simply miscaption a clip or apply a basic edit, as well as more sophisticated deepfakes. Responsible “authenticity infrastructure” that makes it easier to know whether and how an image has been manipulated, and by whom, for those who want to “show their work,” can help if it is developed from the start with an awareness of how it could also be misused.
We also have to frankly recognize that promoting verification tools and skills may in fact perpetuate a conspiratorial “disbelief by default” approach to media that is itself at the heart of the problem with so many videos that actually show reality. Any approach to providing better skills and infrastructure must recognize that conspiratorial reasoning is only one step removed from constructive doubt. Media literacy approaches and forensic tools that send people down the rabbit hole rather than promoting common-sense judgment can become part of the problem. We don’t all need to be instant open source investigators. First we need to apply simple frameworks like the SIFT methodology: Stop, Investigate the source, Find trusted coverage, and Trace the original context.