Deepfakes are just the beginning for cybersecurity's Faceswap nightmare

First came the harmless “face-swap” apps – fun, interactive applications of machine learning technology embraced in droves by younger mobile users. Then came the less harmless applications of machine learning, like the recent and controversial rise and fall of “DeepFakes” videos. From here, security vendors warn, face-swapping could morph into a monster with chilling implications for the cybersecurity world.

Summarizing this trend, Nick FitzGerald, senior research fellow at security company ESET, says that there are several recent developments in machine learning that are “likely to have major implications for computer security, and even digital media in general.”

“The first is ‘adversarial machine learning’, whereby adversarial examples can be generated to fool a machine-learning-based classifier, such that the sample is (to humans) apparently one thing, but it is reliably misclassified as something else by the machine learning model used by the software. Adversarial examples of images and audio files – and potentially software examples such as PDF and executable files – have already been demonstrated.”

“Such adversarial examples are generated by the attacker being able to observe the operation of the classifier, extracting features that the learning model apparently depends on and then ‘perturbing’ natural examples with enough ‘noise’ that the software classifier is fooled, but the resulting audio or image is still clearly recognisable as something it is not, according to the machine classification.”

“The obvious implication for endpoint security products is that those that mainly or only depend on machine-learning-based approaches are just as susceptible to such adversarial example attacks, whereby an attacker with access to the product can reverse engineer enough features used by the model such that they could then produce malware files that would not be classified as such by that security product.”
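The perturbation process FitzGerald describes can be sketched with a toy example. The sketch below uses a hypothetical linear classifier (the weights, sample, and the "benign"/"malicious" labels are all invented for illustration, not drawn from any real product): because the model's gradient with respect to the input is easy to observe, an attacker can nudge each feature by a small step in the direction that flips the score – the same intuition behind fast-gradient-sign-style attacks on deep models.

```python
# Minimal sketch of a gradient-sign-style adversarial perturbation
# against a toy linear classifier. Weights, bias, and the sample are
# hypothetical values chosen purely for illustration.

def sign(v):
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

# Toy binary classifier: score = w . x + b; positive score => "benign".
w = [1.0, -2.0, 3.0, -0.5]
b = 0.1

def score(x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def classify(x):
    return "benign" if score(x) > 0 else "malicious"

# A natural sample the classifier labels "benign".
x = [2.0, 0.5, 1.0, 1.0]

# For a linear model, the gradient of the score with respect to x is
# simply w, so stepping each feature by -epsilon * sign(w_i) lowers
# the score while keeping the sample close to the original.
epsilon = 1.0
x_adv = [xi - epsilon * sign(wi) for xi, wi in zip(x, w)]

print(classify(x), "->", classify(x_adv))  # benign -> malicious
```

Real classifiers are nonlinear, so attacks estimate the gradient (or probe the model as a black box) rather than read it off directly, but the principle – small, targeted noise that flips the machine's label while humans see no meaningful change – is the same.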
