Can 'good' machine learning take on global cybercrime?

“Dave Palmer, Director of Technology at Darktrace and formerly of MI5 and GCHQ, would like to offer his opinion on why the machine is ready to fight back on its own,” reads the invite I’m sent by Darktrace’s PR team. It is pitched as a defence of machine learning from the slightly off-kilter perspective of security.

There has been an awful lot written about machine learning and automation recently. Much of this takes the angle that the more work machines can do, the fewer jobs will be available for human beings. The emphasis is scaremongering, an angle which often flies in the face of the big possibilities afforded by machine learning, which still, after all, requires vast swathes of human assistance.

These possibilities include everything from smart parking services, offered by the likes of Xerox, to the health benefits presented by genome sequencing and our future ability to profile disease. And when we recently asked a panel of experts what will make the biggest difference to our everyday lives in 10 years’ time, Rob McFarlane, head of labs at Head London, succinctly voiced the views of many when he said:

“Sadly it won't be time travel or flying cars. It will be AI and machine learning.” 

Darktrace is itself an interesting company because – aside from large amounts of funding, a staff of ex-government security professionals and a rising stack of awards – it approaches security with the assumption that traditional perimeter defences are not enough on their own. Instead, it uses machine learning to build up a picture of normal business activity from the inside, and uses this intelligence to alert human security staff when something looks awry.
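Darktrace does not publish its algorithms, but the general idea – learn a baseline of normal behaviour, then flag deviations from it – can be sketched with a toy statistical test. The function and the traffic figures below are entirely hypothetical, a minimal illustration of anomaly detection rather than anything resembling the company's actual system:

```python
import statistics

def flag_anomalies(history, current, threshold=3.0):
    """Flag readings more than `threshold` standard deviations
    from the historical baseline (a simple z-score test)."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history)
    if stdev == 0:
        # No variation in the baseline: anything different is anomalous.
        return [x != mean for x in current]
    return [abs(x - mean) / stdev > threshold for x in current]

# Hypothetical example: outbound megabytes per hour for one workstation.
baseline = [12, 15, 11, 14, 13, 12, 16, 14, 13, 15]
observed = [14, 13, 480]  # the last reading suggests data exfiltration
print(flag_anomalies(baseline, observed))  # → [False, False, True]
```

Real systems model many signals at once and update the baseline continuously, but the principle is the same: no signature of a known attack is needed, only a departure from what is normal for that particular network.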
