Misinformation woes could multiply with 'deepfake' videos

A journalist viewing a "deepfake" video. Such videos are becoming more realistic due to advances in artificial intelligence. PHOTO: AGENCE FRANCE-PRESSE

WASHINGTON • If you see a video of a politician speaking words he never would utter, or a Hollywood star improbably appearing in a cheap adult movie, do not adjust your television set - you may just be witnessing the future of fake news.

"Deepfake" videos that manipulate reality are becoming more sophisticated due to advances in artificial intelligence, creating the potential for new kinds of misinformation with devastating consequences.

As the technology advances, worries are growing about how deepfakes can be used for nefarious purposes by hackers or state actors.

Law professor Robert Chesney of the University of Texas, who has researched the topic, argues that deepfakes could add to the current turmoil over disinformation and influence operations.

"A well-timed and thoughtfully scripted deepfake or series of deepfakes could tip an election, spark violence in a city primed for civil unrest, bolster insurgent narratives about an enemy's supposed atrocities, or exacerbate political divisions in a society," he and University of Maryland professor Danielle Citron said in a blog post for the Council on Foreign Relations.

Mr Paul Scharre, a senior fellow at the Center for a New American Security, a think-tank specialising in artificial intelligence (AI) and security issues, said it was almost inevitable that deepfakes would be used in upcoming elections.

A fake video could be deployed to smear a candidate, he said, or to enable people to deny actual events captured on authentic video.

Video manipulation has been around for decades and can be innocuous or even entertaining, as in the digitally aided appearance of Peter Cushing in 2016's Rogue One: A Star Wars Story, 22 years after his death.

The popularisation of apps which make realistic fake videos threatens to undermine the notion of truth in news media, criminal trials and many other areas, researchers say.

"If we can put any words in anyone's mouth, that is quite scary," said Dr Siwei Lyu, a professor of computer science at the State University of New York, in Albany, who is researching deepfake detection. "If we cannot really trust information to be authentic, it's no better than to have no information at all."

Representative Adam Schiff and two other US lawmakers recently sent a letter to Director of National Intelligence Dan Coats asking for information about what the government is doing to combat deepfakes. "Forged videos, images or audio could be used to target individuals for blackmail or for other nefarious purposes," they wrote. "Of greater concern for national security, they could also be used by foreign or domestic actors to spread misinformation."

Researchers have been working on better detection methods for some time, with support from private firms such as Google and government entities like the Pentagon's Defense Advanced Research Projects Agency, which began a media forensics initiative in 2015.

While deepfakes have been evolving for several years, the topic came into focus with the creation of a video last April that appeared to show former US president Barack Obama using a curse word to describe his successor Donald Trump - a stunt coordinated by filmmaker Jordan Peele and BuzzFeed.

Mr Scharre said an important way to deal with deepfakes is to increase public awareness and make people more sceptical of what used to be considered incontrovertible proof. "After a video has gone viral, it may be too late for the social harm it has caused," he said.

AGENCE FRANCE-PRESSE

A version of this article appeared in the print edition of The Straits Times on January 29, 2019, with the headline Misinformation woes could multiply with 'deepfake' videos.