Security

Doctored Jim Acosta video shows why fakes don't need to be deep to be dangerous

After much hullabaloo earlier this year about "deep fakes," the machine-learning-based fake videos that Senator Marco Rubio called the modern equivalent of nuclear weapons, it turns out that low-tech doctored videos can be just as effective a form of disinformation. A fake video promoted by the White House this week demonstrates the point, and the same kind of attack could just as easily be deployed against you or your enterprise.

After a tense confrontation between President Trump and CNN reporter Jim Acosta at a press conference, during which a female White House intern attempted to take the microphone from Acosta, White House press secretary Sarah Sanders shared a doctored video of the incident, modified to make it look like Acosta attacked the intern.

Watching both videos side by side makes it clear that the original was edited. The doctored video, first published on Twitter by Infowars editor Paul Joseph Watson, was modified to make it look like Acosta "karate-chopped" the intern. It also strips out clear audio of the reporter saying "Pardon me, ma'am."

It remains unclear whether Watson did more than remove the audio and zoom in on Acosta's arm. BuzzFeed News reports that the video-to-GIF conversion process may be responsible for the jerkiness of the clip, since GIFs typically use far fewer frames per second than regular MP4 video. "Digitally it's gonna look a tiny bit different after processing and zooming in, but I did not in any way deliberately 'speed up' or 'distort' the video," Watson told BuzzFeed. "That's just horse shit."
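To see how frame-rate conversion alone can change the feel of a clip, consider a minimal sketch of frame downsampling (the frame rates below are illustrative; real GIF encoders use their own resampling logic):

```python
def retained_frames(n_src_frames: int, src_fps: float, dst_fps: float) -> list[int]:
    """Indices of source frames kept when resampling src_fps video down to dst_fps."""
    n_out = int(n_src_frames * dst_fps / src_fps)
    kept = []
    for i in range(n_out):
        # Each output frame shows the source frame nearest to it in time.
        idx = min(round(i * src_fps / dst_fps), n_src_frames - 1)
        kept.append(idx)
    return kept

# Converting 30 fps video to a 15 fps GIF drops every other frame, so motion
# that spanned two frames now happens in one and can look faster.
print(retained_frames(12, 30.0, 15.0))  # [0, 2, 4, 6, 8, 10]

# With a non-integer ratio (24 fps -> 15 fps) the kept frames are unevenly
# spaced, which the eye reads as jerkiness.
print(retained_frames(12, 24.0, 15.0))  # [0, 2, 3, 5, 6, 8, 10]
```

The point is not that any particular conversion was used here, only that resampling to a lower, mismatched frame rate can by itself make movement look sped up or jerky without anyone deliberately altering the footage.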

The incident underscores fears that video can be easily manipulated to discredit a target of the attacker's choosing: a reporter, a politician, a business, a brand. Unlike so-called "deep fakes," where machine learning puts words in people's mouths, low-tech doctored video hews close enough to reality that it blurs the line between true and false.

FUD (fear, uncertainty, and doubt) is familiar to anyone working in the security trenches, and deploying FUD as a weapon at scale can severely damage a business as well as an individual. Once doubt has been sown that Acosta manhandled a female White House intern, a non-trivial portion of viewers will never forget that detail and will always suspect it might be true.

Reputational damage can send an enterprise's stock price plummeting and carry long-term consequences for customers and shareholders, who may no longer trust the truth about your business when they hear it.

This is the real danger of FUD: when no one can be sure any longer what is true and what is false, attackers who wish to manipulate citizens, consumers, or shareholders at scale can do so with ease. Defending against such attacks is extremely difficult, because even an individual or enterprise that has done nothing wrong, such as CNN's Acosta, is left with a tarnished reputation.

IDG Connect
