We’re at a point now where most businesses are aware that cyberthreats exist. Every day, headlines are full of stories relating to data breaches, hacks, malware, phishing attacks or similar, and there’s an understanding that anyone could be the next victim. Cybersecurity is no longer a novelty.
The industry is, therefore, thriving. Spending is consistently increasing and is predicted to reach £115 billion by 2023, according to IDC, and the number of cybersecurity companies, services and tools is rapidly on the rise too. Consequently, organisations face an overwhelming choice over how to spend their budgets to safeguard corporate networks and data.
This plethora of options sometimes leads to a misconception that bigger and more expensive is better: the more bells and whistles a tool has, the more effective it will be at preventing nefarious actors from accessing networks. That's not always the case. Moreover, recent innovations utilising artificial intelligence (AI) and machine learning (ML), and their ability to perform tasks unaided, are leading some to believe that cybersecurity will soon become a human-less function.
What part will the ‘robots’ play?
There is a great deal of concern that intelligent technology will soon render swathes of the workforce out of a job, and it's a fear that, to some degree, isn't completely ungrounded. Recent history is littered with examples of technology being adopted and taking a job from a human. For instance, where once virtually all cars were handmade, now it's only high-end vehicles such as Rolls-Royces that receive such treatment; most others roll off an automated assembly line. Machines don't need to take a lunch break or do the school run, so it's no surprise that businesses adopt technology where they can.
Similar thinking is taking place within information management too. As tools become more intelligent and can monitor and flag activity across ever-sprawling digital infrastructures, the thinking goes, they will eventually take on more of the security team's responsibilities before becoming completely autonomous, rendering the security team obsolete. Simply put, that's a scenario that won't happen, certainly not in the short- to medium-term future.
Humans still very much have a role to play, and companies that are placing too much reliance on their cybersecurity tools without supporting them with the relevant human expertise are finding out the hard way that technology has its faults. While it can undoubtedly aid productivity, it can also make mistakes, be configured incorrectly or be gamed by increasingly intelligent and wily cybercriminals – who themselves are using more technology to increase the effectiveness of their efforts.
For example, a wrongly configured system can display thousands of false positives which a security manager must sift through in order to identify what truly is a cause for concern. They can then use that information to correct the configuration and resolve any vulnerabilities. If that expert intervention isn't present, the flags build to an insurmountable level and attacks may slip under the radar. Adopting cybersecurity tools without the supporting expertise is akin to strapping an engine onto a rowboat without having someone to steer it: you will move forward, but you're eventually going to crash.
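The sifting described above is, at its core, a filtering problem. As a minimal sketch of what an analyst's triage step might look like, assuming hypothetical alert fields (`source_ip`, `rule`, `severity`) and a whitelist maintained by a human expert:

```python
# Hypothetical sketch of alert triage: suppress known-benign sources and
# low-severity noise so only alerts worth human attention remain.
# Field names, thresholds and data are illustrative, not from any real tool.

alerts = [
    {"source_ip": "10.0.0.5",   "rule": "port_scan",   "severity": 2},
    {"source_ip": "10.0.0.5",   "rule": "port_scan",   "severity": 2},
    {"source_ip": "203.0.113.9", "rule": "brute_force", "severity": 8},
    {"source_ip": "10.0.0.7",   "rule": "port_scan",   "severity": 2},
]

# Internal scanners an analyst has already investigated and whitelisted.
whitelist = {"10.0.0.5", "10.0.0.7"}

def triage(alerts, whitelist, min_severity=5):
    """Return only alerts from non-whitelisted sources at or above
    the severity threshold -- the genuine causes for concern."""
    return [a for a in alerts
            if a["source_ip"] not in whitelist
            and a["severity"] >= min_severity]

actionable = triage(alerts, whitelist)
# Only the brute-force attempt from the external address survives triage.
```

The key point the sketch illustrates is that the whitelist and the severity threshold are human judgements: without an expert to set and correct them, the tool either drowns the team in noise or silently discards real attacks.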
The demand for more expertise
Alongside an acceptance that cybersecurity requires human intervention, it must also be understood that those responsible must be experts and not simply average Joes. With tools collecting and producing so much data, analysing it requires a trained eye that can spot the trends that might reveal an ongoing cyberattack, for example.
Yet, Europe is currently suffering from a cybersecurity skills shortage. A report by the International Information System Security Certification Consortium, or (ISC)², revealed that Europe has a skills gap of 219,000 professionals (the difference between the number working in the field and the number needed). Globally, the gap increases to more than four million.
The demand for expertise alongside a shallow talent pool has meant the average cybersecurity salary has increased quickly in the past few years. For instance, network security analysts have seen their average salary increase by 34% since 2017, according to a report by recruitment agency Reed. For companies with tighter budgets, this presents a challenge, as they are unable to match the salaries offered elsewhere.
The growing issue has been recognised by the UK Government, with initiatives including the National Cyber Security Skills Strategy, introduced to help close the gap. However, such schemes can take years to come to fruition so, in the meantime, organisations must step up to the plate to help both themselves and the wider economy by nurturing the next wave of cybersecurity experts.
Nurturing home-grown experts
Cybersecurity training and study are both incredibly time-consuming processes. The field is so vast and there are so many nuances that it takes a substantial amount of time for an individual to be considered an expert. There is no short cut. However, organisations can get in early with these future experts by offering graduate schemes, apprenticeships and work placements which enable the individuals to grow with the company and learn its processes.
To help fill the gap between now and when those experts are ready, businesses should look at where skills and responsibilities can overlap. For example, network and security administrators share very similar goals, so expanding their skill sets into a broader shared role is a relatively natural fit. While not the ideal situation, it ensures that all bases are covered and no one area of cybersecurity is massively under-resourced or completely side-lined.
Additionally, companies should adopt intuitive cybersecurity technology that will help individuals get up to speed as they are learning the ropes. With tools collecting data from so many network points, all of that rich information needs to be displayed in one place and in an accessible format so that it can be analysed and turned into actionable intelligence.
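Consolidating data from many network points into one accessible view, as described above, is essentially an aggregation step. A minimal sketch, with entirely hypothetical sensor names and event types, of rolling raw events up into a summary an analyst in training can scan at a glance:

```python
# Illustrative sketch: consolidating events from multiple network points
# into a single summary. Sensor names and event types are hypothetical.
from collections import Counter

events = [
    {"sensor": "fw-01",    "type": "blocked_conn"},
    {"sensor": "ids-02",   "type": "signature_match"},
    {"sensor": "fw-01",    "type": "blocked_conn"},
    {"sensor": "proxy-03", "type": "blocked_conn"},
]

# Count events per (sensor, type) pair so patterns stand out.
summary = Counter((e["sensor"], e["type"]) for e in events)

for (sensor, etype), count in summary.most_common():
    print(f"{sensor:10} {etype:16} {count}")
```

A view like this, however it is actually rendered by a given tool, is what turns raw collection into the "actionable intelligence" the article refers to: the newcomer sees at once which sensor and event type dominate, rather than paging through individual records.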
Addressing the cyber skills gap fully is reliant on internal investment and, therefore, buy-in from the board is incredibly important. Without the promise of cybersecurity investment in all its forms, such as policy, staff, solutions and ongoing education, those responsible for security face a real uphill struggle. Attempting to drive change without buy-in is incredibly difficult because there isn't always a tangible return on investment to point to.
Externally, a changing regulatory landscape does force cybersecurity evolution to take place – just as the introduction of the EU GDPR did and is continuing to do – but this reactive approach is unlikely to ever become a source of competitive advantage.
Boards must be convinced that security and growth can go hand-in-hand. This is particularly true when thinking about data and the value it holds – it needs to be protected. When information is secure, it can be used, analysed and turned into actionable intelligence that generates company growth. Growth and security need each other to flourish.
Finding the balance
Ultimately, cybersecurity tools becoming more advanced doesn't mean the end of the human security team; it reinforces their importance. Tools are allowing them to better manage expanding digital estates, with data being collected and analysed faster than ever. However, while investment in solutions is needed, so is investment in the experts needed to oversee them. When there is equilibrium, companies are far better prepared for the cyberthreat age we live in.