Rise of the Robot Hackers (and Cyber Defenders)
New Dog, New Tricks
Anyone with even a passing interest in current trends in technology cannot help but be aware of the astounding progress made in the last few years in the field of Artificial Intelligence, and in particular in a subset of it known as Machine Learning.
We now employ Machine Learning algorithms in an ever-expanding array of roles that affect every one of us in our daily lives, from managing our transport infrastructure to controlling stock levels at the coffee shop on the corner to understanding how cancer cells react to different treatments.
Machine Learning excels wherever predictable patterns can be found, and predictable patterns can be found almost everywhere, at least when you have the computational capacity to find them.
Your Cyber Guardian
It’s no secret that antivirus and malware detection has been heavily reliant on this ability for quite some time – there are now thought to be more than 500 million worms, Trojans and other viruses in circulation, with millions more appearing every day. There is simply no way that the old system of cataloguing every single one of them and flagging only content that appeared in the catalogue could have remained effective. So your AV software contains algorithms that do what machines do best: they look for patterns. Patterns that resemble those previously observed in other malicious software will trigger an alert. Indeed, some software will “sandbox” the suspect file in a safe virtual environment and automatically – you might say intelligently – probe it, attempting to make it “go off” so that the true malicious intent of the code can be understood. Pretty impressive, right?
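The difference between the two approaches can be sketched in a few lines of code. This is a deliberately simplified toy, not how any real AV engine works: the byte signature and the suspicious strings below are invented for illustration, and real products use far richer features than substring matching.

```python
# Toy contrast between catalogue-based and pattern-based detection.
# The signature and patterns below are hypothetical examples.

KNOWN_SIGNATURES = {
    b"\xde\xad\xbe\xef",  # invented byte signature of a catalogued virus
}

SUSPICIOUS_PATTERNS = [
    b"CreateRemoteThread",  # Windows API name often abused for code injection
    b"keylog",              # string suggestive of keystroke capture
]

def scan(payload: bytes) -> str:
    # Old model: flag only exact matches against the catalogue.
    if any(sig in payload for sig in KNOWN_SIGNATURES):
        return "known-malware"
    # Newer model: flag content that merely *resembles* previously
    # observed malicious software, even if it is not catalogued.
    if any(pattern in payload for pattern in SUSPICIOUS_PATTERNS):
        return "suspicious"
    return "clean"
```

The catalogue approach fails on every brand-new sample, while the pattern check catches anything that reuses familiar malicious tricks – which is precisely why it generalises better as millions of new variants appear each day.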
A Dark Art
But on the opposite side of the cyber fence, hacking – and hence ethical hacking / penetration testing – has always been considered something of a “Dark Art”. A skill developed only with experience, and with an inquisitive, human mind. A skill that can be aided by cleverly coded software, but never replaced by it.
Currently, vulnerabilities are found “by hand” – humans spend time hacking the software / service / device, and when they find an exploitable vulnerability, the vendor gets to work patching the hole. Rinse, repeat. But with ever more connected devices, running ever more pieces of software, on ever more networks, there is no doubt that there is opportunity for malicious hackers to find vulnerabilities that ethical hackers and vendors haven’t yet found and/or patched.
Throwing Down the Cyber Gauntlet
However, Darpa, the Defense Advanced Research Projects Agency, have set a challenge. Or, more specifically, a “Cyber Grand Challenge”.
They know it’s only a matter of time before we see cybercriminals using Machine Learning based tools to automatically infiltrate their victims’ networks: pieces of malicious software designed to sit hoovering up information silently, undetectably, until they have gathered enough intel to strike with the kind of precision only a computer can achieve. In fact, Justin Fier, director of cyber-intelligence at security company Darktrace, believes his company has already caught a piece of malware that may well have been attempting the first part of that mission.
“We caught malware that was just watching users and logging their habits,” he said.
“We have to assume that it was trying to determine the most suitable way to exfiltrate data without triggering alarms.
“Where the malware starts to use machine learning is when it’s going to get really interesting.”
So Darpa want to see the good guys get there first.
The “Cyber Grand Challenge”, then, is to develop software “smart” enough to “spot and seal vulnerabilities in other programs before malicious hackers even know they exist.”
So seven teams have each produced their own software with the ability to hack, and they will compete at the Def Con hacker convention this week. And the line between science fiction and reality becomes ever less distinct.
 – “Can machines keep us safe from cyber-attack?”, http://www.bbc.co.uk/news/technology-36923794, BBC News, 02 August 2016