Waging war on hackers a daunting arms race
The following commentary by Dr Ian Welch from Victoria University's School of Engineering and Computer Science was originally published in the New Zealand Herald on 20 June.
20 June 2016
The Government has released its new defence policy with a budget of $20 billion. The plan covers the next 15 years and in addition to replacing hardware over that time, as you'd expect, it includes the creation of a cyber security system. This is good news—we need a cyber security system, but more importantly, we also need a lot more people learning about cyber security.
Recent US figures point to more than 169 million personal records being exposed in 2015, across the financial, business, education, government and healthcare sectors.
The now infamous Panama Papers represent the world's largest ever data breach, with 11.5 million documents that were stored in the computer systems used by Mossack Fonseca—the law firm that was a primary conduit for world leaders and corporations seeking off-shore tax havens—being leaked.
We might see this particular data breach as a good thing because of the pressure it has placed on governments around the world to address tax havens. However, we might not have been so happy to see ordinary individuals' phone records or tax records being stolen. Unfortunately, the problems that allowed the Panama Papers to be stolen are not unique; they are found in many commercial and government systems.
So, with companies and government organisations responsible for storing so much data, why do such problems exist?
First, it is the nature of software to be insecure. Humans write software, so making mistakes and introducing bugs that can be exploited is inevitable.
Second, we don't design the software to fit how humans weigh up risk when making decisions. Most of our rules for decision making are the result of the experiences of our ancestors when they lived in relatively small tribal groups. They have not yet caught up to the environment created by the online world where we have hundreds of friends on Facebook whom we have never met.
One way that these flaws are exploited by attackers is in so-called "drive-by" attacks, where hackers are able to bypass organisational defences, such as firewalls, and directly infect a victim's computer. This might be done by infecting a website known to be visited by the target users. The goal is to exploit both the trust of the users and bugs in their web browsers to install a virus allowing the hacker access to the organisation's network.
At Victoria University, my research group has been looking at this particular problem for the last 10 years, trying to understand how attackers choose to target vulnerable users. Our goal has been to develop software that can detect infected websites. However, this problem is so large that it cannot be solved by any one research group, and perhaps cannot be solved at all. As we develop new defences, attackers develop new attacks, which means we have an ongoing arms race.
The size of this problem means that businesses and government largely lack the expertise and resources to protect themselves. To help address this, the Government recently announced $22.2 million of funding for the establishment of a new national Computer Emergency Response Team (CERT) to support organisations in dealing with cyber-attacks and cyber-crime. This is a long-overdue step and will see us join a community of over 40 other national CERTs across Europe, Asia and the Americas.
This good start isn't enough. We also need to address major shortfalls in the number of software and network engineers with an understanding of security. Internationally, experts are forecasting a shortage of up to a million trained cyber security professionals in the coming years.
These are not necessarily people whose primary job is to be a security professional, but rather people who have studied computer security and can apply it in their day-to-day jobs. We will never remove all the bugs, but we can make life harder for hackers by introducing fewer of them in the first place.