In the second season of the BBC mystery thriller The Capture, deepfakes threaten the future of democracy and Britain’s national security. In a dystopian version of present-day London, hackers use AI to insert highly realistic fake images and videos of people into live news broadcasts to destroy the careers of politicians.
But my team’s research has shown how difficult it is to create compelling deepfakes in the real world. In fact, tech and creative professionals have started collaborating on solutions to help people spot fake videos of politicians and celebrities. We have a good chance of staying ahead of the scammers.
In my research project, Virtual Maggie, I attempted to use deepfakes to digitally resurrect former British Prime Minister Margaret Thatcher for a new drama. After months of work, we were unable to create a virtual Maggie acceptable for broadcast.
Producing compelling high-definition deepfakes requires state-of-the-art hardware, a great deal of computing time, and human intervention to correct glitches in the output. That didn’t stop me from enjoying The Capture, even though I knew Ben Chanan’s storyline was unlikely to play out in the near future. Like any good dystopia, it carried the seeds of something that might one day be possible.
The use of deepfakes since they first appeared in 2017 has been shocking. The majority of deepfakes on the internet are attacks on women: their facial images are grabbed without consent and inserted into pornographic content. Deepfake expert Henry Ajder found that 96% of deepfakes online were pornographic, and that all of those targeted women.
The premise of The Capture is based on fact. Deepfakes do threaten democracy. During the 2019 UK general election, artist Bill Posters posted a provocative deepfake video of Boris Johnson appearing to say people should vote for Jeremy Corbyn.
Posters’ deepfake was far more convincing than the glitchy Russian deepfake showing Ukrainian President Volodymyr Zelenskyy asking his troops to surrender. Yet unlike the Kremlin, the British artist made it obvious that his AI Boris was unreal by having “Boris” direct viewers to a deepfake website. The video was intended to highlight our vulnerability to false political propaganda.
Deepfakes may not yet be convincing enough to fool people. But creative work usually rests on an unwritten agreement between creator and audience: viewers agree to suspend their disbelief.
A fork in the road
The threat of deepfakes has led to an intensive search for technological solutions. A coalition of businesses has formed the Content Authenticity Initiative (CAI) to provide “a way to assess the truth in the media presented to us”.
It is a promising approach. CAI collaborators and tech companies Truepic and Qualcomm have created a system that embeds an image’s history into its metadata so it can be verified. American photographer Sara Naomi Lewkowicz has done an experimental project with CAI that incorporates source information into her photos.
But creative and tech professionals don’t necessarily want to hamper emerging deepfake technology. Researchers at the Massachusetts Institute of Technology Media Lab have brainstormed ways to put deepfakes to good use, some of them in health care and therapy.
Research engineers Kate Glazko and Yiwei Zheng use deepfakes to help people with aphantasia, the inability to form mental images. Their breakup simulator, currently in development, aims to use deepfakes to “relieve the anxiety of difficult conversation through repetition”.
The most profound positive uses of deepfakes include campaigns for political change. The parents of Joaquin Oliver, who was killed in a Florida high school shooting in 2018, used the technology to bring him back in a hard-hitting video calling for gun control.
Showing creativity
There are also cultural applications of deepfakes. At the Dali Museum in Florida, a deepfake Salvador Dali welcomes visitors and tells them about himself and his art. Researcher Mihaela Mihailova says this gives visitors “a sense of immediacy, closeness and personalization”. Deepfake Dali even offers to take a selfie with you.
Deepfakes and AI-generated characters can be educational. In Shanghai, during the lockdown, Associate Professor Jiang Fei noticed that his students’ attention was dropping during online lessons. To help them focus better, he used an animated version of himself to present his teaching. Jiang Fei said, “The students’ enthusiasm in the classroom and the improvement in the quality of homework have made obvious progress.”
Channel 4 used its 2020 Alternative Christmas Message to entertain viewers with a deepfake queen, while making the serious point that we cannot trust everything we see on video.
A growing network of film producers, researchers and AI technologists in the UK, hosted by the University of Reading and funded by the Alan Turing Institute, is seeking to harness the positive potential of deepfakes in creative screen production. Filmmaker Benjamin Field told the group at a workshop how he used deepfakes to “resurrect” the creator of Thunderbirds for Gerry Anderson: A Life Uncharted, a documentary about the children’s television pioneer’s troubled life.
Field and his co-producer, Anderson’s youngest son Jamie, discovered old audio tapes and used deepfakes to construct a “filmed” interview with the famous puppeteer. Field is one of a small group of creatives determined to find positive ways to use deepfakes in broadcasting.
Deepfakes and AI characters are part of our future and the examples above show how this could be at least partly positive. But we also need laws to protect people whose footage is stolen or abused, and ethical guidelines for how filmmakers use deepfakes. Responsible producers have already formed an AI partnership and drafted a code of conduct that could help avoid the doomsday vision of the future we saw in The Capture.
This article is republished from The Conversation under a Creative Commons license. Read the original article.
Dominic Lees does not work for, consult, own stock or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond his academic appointment.