Deepfakes are just the beginning for cybersecurity's Faceswap nightmare

First came the harmless “face-swap” apps: fun, interactive applications of machine learning embraced in droves by younger mobile users. Then came the less-harmless applications, like the recent and controversial rise and fall of “DeepFakes” videos. From here, security vendors warn, face-swapping could morph into a monster with chilling implications for the cybersecurity world.

Summarizing this trend, Nick FitzGerald, senior research fellow at security company ESET, says that several recent developments in machine learning are “likely to have major implications for computer security, and even digital media in general.”

“The first is ‘adversarial machine learning’, whereby adversarial examples can be generated to fool a machine-learning-based classifier, such that the sample is (to humans) apparently one thing, but it is reliably misclassified as something else by the machine learning model used by the software. Adversarial examples of images and audio files – and potentially software examples such as PDF and executable files – have already been demonstrated.”

“Such adversarial examples are generated by the attacker being able to observe the operation of the classifier, extracting features that the learning model apparently depends on and then “perturbing” natural examples with enough “noise” that the software classifier is fooled, but the resulting audio or image is still clearly recognisable as something it is not, according to the machine classification.”

“The obvious implication for endpoint security products is that those that mainly or only depend on machine-learning-based approaches are just as susceptible to such adversarial example attacks, whereby an attacker with access to the product can reverse engineer enough features used by the model such that they could then produce malware files that would not be classified as such by that security product.”
