Deepfakes and deep fraud: The new security challenge of misinformation and impersonation

With the significant improvements in deepfake technology, how prepared are governments and organisations to mitigate the security threats?

Deepfakes, until recently, were just an amusing part of the internet. Videos emerged of various celebrities spliced into the wrong movie or interview; some were quite poorly made, but others were almost indistinguishable from the real thing. They were entertaining and funny, not given much serious thought, and left to a corner of the internet. It was not long, however, before politicians became the next target, with videos emerging of significant figures such as Barack Obama, Nancy Pelosi and Donald Trump.

It was at this point that some serious concerns started to develop over the security implications of this technology.

So, what are deepfakes and how do they work? A deepfake is a video or audio clip in which someone's face or voice has been replaced with another person's likeness using artificial intelligence. The name combines "deep learning" and "fake", as machine learning methods are used to create these videos. Most commonly, they are produced by a deep learning process involving generative adversarial networks (GANs), in which two neural networks are trained against each other: a generator that produces forgeries and a discriminator that tries to tell them apart from real examples.
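The adversarial idea behind GANs can be illustrated at toy scale. The sketch below (an illustrative assumption, not a deepfake system: real deepfake generators work on image frames with deep convolutional networks) trains a one-parameter-pair generator to imitate a 1-D Gaussian "real data" distribution, with a logistic-regression discriminator. The generator improves only because the discriminator keeps telling real from fake:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# "Real data": 1-D samples the generator must learn to imitate.
REAL_MEAN, REAL_STD = 4.0, 0.5

# Generator g(z) = a*z + b; discriminator D(x) = sigmoid(w*x + c).
a, b = 1.0, 0.0            # generator parameters
w, c = 0.0, 0.0            # discriminator parameters
lr, batch, steps = 0.02, 64, 2000

for _ in range(steps):
    real = rng.normal(REAL_MEAN, REAL_STD, batch)
    z = rng.normal(0.0, 1.0, batch)
    fake = a * z + b

    # Discriminator step: ascend log D(real) + log(1 - D(fake)),
    # i.e. get better at spotting forgeries.
    s_r = sigmoid(w * real + c)
    s_f = sigmoid(w * fake + c)
    w += lr * (np.mean((1 - s_r) * real) - np.mean(s_f * fake))
    c += lr * (np.mean(1 - s_r) - np.mean(s_f))

    # Generator step: ascend log D(fake) (non-saturating loss),
    # i.e. get better at fooling the discriminator.
    s_f = sigmoid(w * fake + c)
    a += lr * np.mean((1 - s_f) * w * z)
    b += lr * np.mean((1 - s_f) * w)

fake = a * rng.normal(0.0, 1.0, 1000) + b
print(f"generated mean ~ {fake.mean():.2f} (real mean {REAL_MEAN})")
```

After training, the generated samples drift from their starting mean of 0 toward the real mean of 4; the same push-and-pull, scaled up to millions of parameters and pixel data, is what makes deepfake face swaps progressively harder to detect.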

The reason deepfakes have mostly been reserved for public figures is that these AI models require large volumes of images and video frames to convincingly overlay the target face. But the technology is developing rapidly, and the results are becoming seamless. With these improvements, it is clear that seeing is no longer believing.

The age of fabrication

In an age of misinformation and fake news, it is more important than ever to separate fact from fiction. The internet has become a breeding ground for alternative facts and conspiracy theories, and companies like Facebook, YouTube and Twitter are under fire for failing to control the spread of misinformation. Deepfakes pose a new kind of threat: videos that can manufacture false statements by political leaders and other key figures.
