Artificial intelligence, machine learning, and other techniques can help keep charities safe from hackers with minimal human intervention
Cyber security teams in charities and other organisations are often overwhelmed by alerts from their security systems: they ignore almost three quarters of the warnings generated by these systems because they don’t have the time or manpower to respond to them all, according to research by ESG.
What kind of alerts? These could be anything from warnings that a user has logged in from a new location, to an unknown application accessing the organisation’s data, to a failed login attempt due to an incorrect password, to an application running on a server which has a known vulnerability and which needs updating.
The problem is that many of these alerts may not be important. The recent rise in staff working from home and blended working means that people are bound to be logging in from new locations, and it is not uncommon for staff to enter the wrong password. But these may also be the first signs that a cyber criminal is trying to hack into the charity’s computer systems from abroad.
Security automation has the potential to be hugely important for charities’ cyber security. That’s because by automating certain security processes, charities can analyse alerts and prioritise the ones which need the most urgent attention without any human intervention. The most important alerts can then be passed on to cyber security staff so they can take appropriate action. In some cases, security automation goes further by taking the appropriate action automatically so that the entire security incident is dealt with quickly without taking up any staff time.
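As an illustration of this kind of triage, the sketch below scores incoming alerts with simple rules and passes only the highest-priority ones to staff. The alert fields, scoring weights and threshold are assumptions invented for the example, not taken from any particular product.

```python
# Minimal sketch of automated alert triage: score each alert with
# simple rules, then surface only the most urgent ones to staff.
# All fields and weights here are illustrative assumptions.

ALERTS = [
    {"type": "failed_login", "count": 3, "new_location": False},
    {"type": "failed_login", "count": 40, "new_location": True},
    {"type": "vulnerable_app", "count": 1, "new_location": False},
    {"type": "new_location_login", "count": 1, "new_location": True},
]

def score(alert):
    """Assign a priority score; higher means more urgent."""
    s = 0
    if alert["type"] == "vulnerable_app":
        s += 50  # a known vulnerability needs patching
    if alert["type"] == "failed_login" and alert["count"] >= 10:
        s += 40  # repeated failures may indicate a brute-force attempt
    if alert["new_location"]:
        s += 10  # an unusual location adds suspicion
    return s

def triage(alerts, threshold=40):
    """Return only alerts urgent enough to pass to security staff."""
    return sorted(
        (a for a in alerts if score(a) >= threshold),
        key=score, reverse=True,
    )

for alert in triage(ALERTS):
    print(alert["type"], score(alert))
```

In this toy run, a single mistyped password and a routine new-location login fall below the threshold, while the repeated failed logins from a new location and the vulnerable application are escalated.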
Most charities already use a form of security automation in the endpoint security software running on their computers. When this type of software detects malware in a file or on a malicious web page it automatically blocks the malware from running without any intervention from the user.
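The simplest form of this automation is signature matching: hash a file and refuse to run it if the hash appears on a known-malware blocklist. The sketch below shows the idea; the blocklist entry is the hash of an empty file, used purely as a stand-in, and real endpoint products combine many more signals than a single hash lookup.

```python
# Sketch of signature-based blocking, the simplest form of the
# automation endpoint security software performs: hash a file and
# block it if the hash is on a known-malware blocklist.
import hashlib

KNOWN_MALWARE_SHA256 = {
    # Stand-in entry for the example: this is the SHA-256 of an
    # empty file, not of real malware.
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked(file_bytes: bytes) -> bool:
    """Return True if the file's SHA-256 hash is on the blocklist."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_MALWARE_SHA256

print(is_blocked(b""))          # on the demo blocklist -> blocked
print(is_blocked(b"harmless"))  # not on the blocklist -> allowed
```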
There are a number of benefits to security automation, including:

As well as helping security staff keep on top of alerts, security automation can be used to interpret data to detect security problems. Two of the key technologies which underpin this type of automation are artificial intelligence (AI) and machine learning (ML).
These technologies can be useful because some cyber security systems (such as security information and event management, or SIEM, systems) generate vast amounts of security data. Often the sheer volume of data is overwhelming, so cyber security staff miss vital clues about next-generation threats and ongoing security incidents because they are hidden in the midst of less relevant data.
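One common approach these technologies take is anomaly detection: learn what "normal" looks like from historical data, then flag values that deviate sharply from it. The sketch below applies a simple standard-deviation test to invented daily failed-login counts; real SIEM analytics use far richer models, but the principle of separating unusual signals from the background is the same.

```python
# Sketch of statistical anomaly detection over security data:
# flag days whose failed-login count deviates sharply from the
# historical norm. The counts are invented for illustration.
from statistics import mean, stdev

history = [12, 9, 15, 11, 14, 10, 13, 12, 11, 14]  # typical daily counts
mu, sigma = mean(history), stdev(history)

def is_anomalous(count, threshold=3.0):
    """Flag a count more than `threshold` standard deviations from the mean."""
    return abs(count - mu) / sigma > threshold

print(is_anomalous(13))  # a typical day: not flagged
print(is_anomalous(90))  # a sudden spike worth investigating: flagged
```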
The ways that AI and ML can be used for security automation include:
For most organisations, including charities, the idea of implementing security automation can seem daunting. But it is important to remember that you don’t have to automate everything at once. A far better plan is to start small, and then build from there.
Before deciding where to start, it is important to take some time to examine security pinch points and to identify which of the problems that automation could solve are the most pressing. For example, do security staff have trouble handling the number of security alerts they receive? Do they struggle to make sense of SIEM data? Or perhaps the right staff are not always on hand to make the correct security response decisions?
Depending on the answers to these questions, as well as your charity’s security budget and in-house cyber security skills, the best way to proceed may be to:
Security orchestration, automation and response (SOAR) products include:
Splunk Phantom (free and paid editions)