The Dangerous New Technology That Will Make Us Question Our Basic Idea of Reality


We are about to enter a new era of computer-generated audio and video recordings.

US president Donald Trump—fake-news provocateur—now denies that the infamous “Access Hollywood” tape is real.

Of course we know that the recording, in which he makes lewd remarks about sexually assaulting women, is authentic. Everyone has seen and heard it, Billy Bush was fired because of it, and Trump himself confirmed that it was him in a video statement last October. Aside from die-hard Trump supporters, few are likely to believe the president’s attack on the recording’s authenticity. But Trump’s assertions foreshadow a fight for reality that might become too real, too soon.

We are about to enter a new era of computer-generated audio and video recordings that will make us question our basic idea of reality.

Listen to Trump reading his tweets—except he has never read these sentences out loud. Computer software based on artificial intelligence generated his voice. Or listen to Barack Obama giving this speech. His voice is an audio mash-up of different interviews he has given in the past, and the video was computer generated by scientists at the University of Washington, who used the audio track to generate matching facial expressions.

Bad actors could use this new technology to spread fake news and try to influence politics and elections. And people in positions of power could use it to discredit authentic recordings that show them in an unflattering light.

The technology still has some glitches, but forensic specialists predict that computers will be able to generate convincing, fabricated audio and video recordings at a rapid pace in the next few years. “We should absolutely worry about it,” says Hany Farid, a computer science professor at Dartmouth College. “This will take fake news to a whole new level.”

Farid imagines that nefarious actors could program computers to generate videos of, for example, Trump announcing the launch of nuclear missiles against North Korea. These fabricated videos could then spread across social media at lightning speed, before anyone can debunk them, possibly leading to armed conflict between two nuclear nations. “Agents of disinformation will use it in places that we haven’t even imagined,” says Claire Wardle, research director of First Draft, who has extensively studied the phenomenon of (mis)information online. Once misinformation has spread—as we have seen with questions about Obama’s birth certificate and claims that he is Muslim—it is almost impossible to erase. “It’s very difficult to kill an idea,” says Amy Webb, a quantitative futurist and founder of the Future Today Institute.

In the past, we have been duped by manipulated images that have circulated online: the shark on a freeway in post-Harvey Houston; this video allegedly showing Hurricane Irma; John Kerry with Jane Fonda speaking at an anti-war rally; an eagle snatching a baby. Part of Farid’s work in digital forensics consists of verifying photos and videos for news organizations, such as the New York Times, the Associated Press, and Reuters. Farid might check whether the shadows in a photo are consistent with a single light source, or whether an object inserted into the image casts a shadow that doesn’t match the rest of the scene.
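To make that geometric check concrete, here is a minimal sketch in Python. It is a toy 2D approximation, not Farid’s actual forensic method: it assumes an analyst has hand-annotated pairs of image points linking each object to its cast shadow, and it ignores perspective effects, under which shadow lines converge toward a vanishing point rather than running strictly parallel.

```python
import numpy as np

def shadow_directions_consistent(pairs, max_spread_deg=10.0):
    """Toy single-light-source check.

    pairs: list of ((ox, oy), (sx, sy)) image coordinates linking a
    point on an object to the matching point on its cast shadow.
    Under a single distant light, these object-to-shadow vectors
    should all point in roughly the same image direction.
    """
    angles = np.array([np.arctan2(sy - oy, sx - ox)
                       for (ox, oy), (sx, sy) in pairs])
    # Circular mean of the directions, then the worst deviation from it.
    mean_dir = np.arctan2(np.sin(angles).mean(), np.cos(angles).mean())
    deviations = np.abs(np.angle(np.exp(1j * (angles - mean_dir))))
    return bool(np.degrees(deviations).max() <= max_spread_deg)

# Hypothetical annotated points: two consistent shadows plus one
# pasted-in object whose shadow falls the wrong way.
pairs = [((120, 300), (180, 360)),
         ((400, 280), (455, 342)),
         ((250, 310), (205, 250))]
print(shadow_directions_consistent(pairs))  # False
```

Real forensic tools express these relationships as formal geometric constraints and solve for a light position consistent with all of them; a single irreconcilable shadow is what gives a composite away.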

But digital forensics is unprepared for a potential onslaught of fabricated audio and video recordings. Specialists will need different tools to debunk computer-generated videos: instead of looking for traces of manipulation, they will have to scan for perfection, for imagery that is cleaner than any camera would produce. And even with the aid of special computer software, verification today is still a time-consuming task: It often takes Farid hours to determine whether a photo is fake or authentic.
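What might “scanning for perfection” look like in practice? One hedged illustration, again in Python: camera sensors stamp every real frame with faint high-frequency noise, so frames that are conspicuously smoother than the rest of a clip deserve a closer look. This is a toy heuristic, not a production detector, and it only flags outlier frames relative to the clip itself; judging a fully synthetic video would require a baseline drawn from known camera footage.

```python
import numpy as np

def high_freq_energy(gray):
    """Mean absolute response of a 4-neighbour Laplacian.

    gray: 2-D float array holding one grayscale frame. Camera sensors
    leave high-frequency noise everywhere; rendered imagery is often
    much smoother.
    """
    lap = (gray[1:-1, :-2] + gray[1:-1, 2:] +
           gray[:-2, 1:-1] + gray[2:, 1:-1] - 4 * gray[1:-1, 1:-1])
    return float(np.abs(lap).mean())

def suspiciously_clean_frames(frames, z_thresh=-2.5):
    """Return indices of frames whose noise energy falls far below the
    clip's median -- candidates for having been synthesized."""
    energy = np.array([high_freq_energy(f) for f in frames])
    z = (energy - np.median(energy)) / (energy.std() + 1e-9)
    return np.where(z < z_thresh)[0]
```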

If computers start flooding social-media platforms with fake video files, we will be playing catch-up, trying to discern real from fake. And that’s a losing battle. Even if researchers can find ways to quickly debunk computer-generated video files, such as by using biometric data to analyze the blood flow in a person’s face, Farid is afraid that media manipulators will learn from the new analysis methods and use these techniques in their next algorithms. “It’s like spy vs. spy in the movies,” says Benjamin Decker, who traces disinformation online for Storyful, a digital newsroom that verifies breaking news and delivers news videos to media organizations.
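The blood-flow idea Farid mentions is known in the research literature as remote photoplethysmography: a living face gets minutely brighter and darker with each heartbeat, mostly in the green channel, and a synthesized face typically does not. Here is a rough sketch, assuming a face tracker has already produced the mean green value of the skin region in every frame (the green_trace input is hypothetical):

```python
import numpy as np

def pulse_snr(green_trace, fps):
    """Crude photoplethysmography check for a face video.

    green_trace: 1-D array, the mean green-channel value of the facial
    skin region in each frame (assumed to come from an earlier face-
    tracking step). Real skin brightens faintly at the heart rate.
    """
    x = np.asarray(green_trace, dtype=float)
    x = x - x.mean()
    # Windowed power spectrum of the brightness signal.
    spectrum = np.abs(np.fft.rfft(x * np.hanning(len(x)))) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)  # plausible heart rates
    if not band.any() or spectrum[band].sum() == 0:
        return 0.0
    # A clear pulse gives one pronounced in-band peak; noise gives a
    # flat band, so this ratio stays close to 1.
    return float(spectrum[band].max() / spectrum[band].mean())
```

A strong spectral peak in the 0.7–4 Hz band (42–240 beats per minute) suggests a real pulse; a flat band is one more reason for suspicion, though by itself it proves nothing.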

The experts are not optimistic about their chances against this new threat. “In my view of the endgame, the adversaries will win,” Farid says. While he is busy authenticating photo and video material, the news, real or not, has already spread across social media channels around the world. He is already being outpaced by fake news, and that’s before these new technologies have even matured.

If visual material were verified before being published, we could prevent fabricated videos from flooding the internet, but this is not how the news cycle and social media work. Anyone can publish anything without permission, and media organizations face enormous pressure to be first. “The news cycle is fast and constant,” says Webb. “We are becoming habituated to constant dopamine hits.”

Even journalists at established media organizations sometimes share misinformation by accident. “In a breaking news situation, something comes to your desk and it’s full of disinformation, there is a decent chance that you might take the bait because you are so busy,” Decker says. Most people today don’t verify whether video clips are authentic before posting, and that’s unlikely to change when the news cycle gets flooded with computer-generated videos, predicts Webb.

And it’s not just the fake videos we should worry about: It’s the doubt they cast on the real ones, too. Once viewers learn that videos can be computer generated, this technology could undermine the credibility of all audio and video recordings. This may lead viewers to conclude that anything they read, hear, or see online could be fake. Ultimately, this lack of trust in facts could further erode democracy. “It’s hard not to see a dystopian future,” says Webb. “We all need to slow down for five minutes and think this through.”

Farid believes that technology is both our curse and our savior. But to solve this problem, we will need far more cooperation and innovation.