Charity Digital
Artificial intelligence and cyber security

Cyber criminals are taking advantage of artificial intelligence. Here’s what you need to know to keep your charity safe


Artificial Intelligence (AI) can provide a huge boost to your charity’s security, helping to keep you safe from cyber attacks. But malicious hackers can also make use of AI technology to attack your charity. That means that when it comes to cyber security, AI is something of a double-edged sword.

 

Keeping your charity protected as AI continues to develop is, therefore, essential. Charities can find a wide range of cyber security products available at a discount on the Charity Digital Exchange.

 

Check out Avast on the Charity Digital Exchange


AI-powered security software

 

AI systems can be used to create very powerful security tools, because they can monitor vast amounts of data very quickly. They can also be trained to spot unusual patterns of behaviour – such as a large number of attempts to enter a password, or people logging in to computer systems in the middle of the night. When they do spot something anomalous these security tools can raise a security alert or even take defensive actions such as shutting down a particular system.
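The pattern-spotting described above can be illustrated with a minimal, rule-based sketch. Real AI security tools learn their thresholds from historical data; the fixed limits and function name below are purely hypothetical, chosen to show the idea of flagging repeated failed logins and out-of-hours activity.

```python
from datetime import datetime

# Hypothetical thresholds -- real tools learn these from historical behaviour.
MAX_FAILED_ATTEMPTS = 5
WORK_HOURS = range(7, 20)  # 07:00-19:59 counts as normal activity

def flag_login_event(timestamp: datetime, failed_attempts: int) -> list[str]:
    """Return the reasons, if any, that a login event looks anomalous."""
    reasons = []
    if failed_attempts > MAX_FAILED_ATTEMPTS:
        reasons.append("too many failed password attempts")
    if timestamp.hour not in WORK_HOURS:
        reasons.append("login outside normal working hours")
    return reasons

# A 3 a.m. login preceded by 12 failed attempts triggers both rules.
alerts = flag_login_event(datetime(2024, 5, 1, 3, 0), failed_attempts=12)
```

A genuine tool would go further, correlating events across many systems and either raising an alert or locking the affected account automatically.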

 

AI security tools can be particularly beneficial for smaller charities that may not have a large dedicated cyber security team. Sometimes AI is embedded into security software that the charities run themselves, but more frequently the AI security tools are run by “managed security service providers” who monitor charities’ computers for cyber security threats on their behalf.


AI-powered criminal activity

 

That’s the good news when it comes to AI and security. The bad news is that cyber criminals are also beginning to take advantage of AI technology to help them in their criminal endeavours.

 

For example, many charities suffer cyber security breaches when employees fall victim to phishing attacks after reacting to malicious emails. These malicious emails can sometimes be spotted before they cause harm because they have spelling and grammar mistakes.

 

But AI systems like ChatGPT – known as Large Language Models – can make it much easier for criminals with a limited command of English to generate more convincing content for their emails. In the future it’s likely that criminals will also use AI to generate “deepfake” content such as fake voice messages from charity bosses asking employees to make bank transfers.

 

Large Language Models also present a security risk because charity staff may use them to generate content such as reports and proposals. That can be a problem if staff submit confidential information to a publicly accessible AI system, where it may be stolen by hackers.


AI systems are opaque

 

Another problem with AI systems is that they can be very hard to understand. While an AI might make recommendations – for example it may analyse data held by your charity to help decide which projects should be carried out or where money should be allocated – it is not always obvious why the AI made the recommendations that it did.

 

This makes them vulnerable to attack by criminals who wish to manipulate the recommendations. They may attempt to do this by “data poisoning”: breaking into your charity’s databases and injecting false data or altering existing data. If the AI processes this data, it could be tricked into making recommendations that benefit the criminals rather than your charity.
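A toy example makes the data-poisoning risk concrete. Suppose a simple system recommends whichever project has the highest average recorded donation; everything below (the function, project names, and figures) is hypothetical, but it shows how a handful of injected records can flip the outcome.

```python
def recommend_project(donations: dict[str, list[float]]) -> str:
    """Recommend the project with the highest average recorded donation."""
    return max(donations, key=lambda p: sum(donations[p]) / len(donations[p]))

# Genuine records: the food bank averages 55, the shelter about 42.
clean = {"food bank": [50.0, 60.0, 55.0], "shelter": [40.0, 45.0, 42.0]}

# An attacker injects fake high-value donations for the project they favour.
poisoned = dict(clean)
poisoned["shelter"] = clean["shelter"] + [5000.0, 5000.0]
```

With the clean data the food bank is recommended; after poisoning, the shelter wins, even though no real donor behaviour changed. This is why access controls and integrity checks on the data feeding an AI matter as much as the AI itself.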


Confidential data breaches

 

The problem is compounded by the fact AI systems often rely on large amounts of data which they ingest and process. When it comes to charities, this data could include private and personal information about constituents. If cyber criminals are able to break into your AI databases then they may be able to make off with a vast trove of confidential information that they can look to exploit.


Growing cyber security problem

 

At the moment the risk to charities is relatively low – both because the adoption of AI systems is in its infancy, and because cyber criminals are only beginning to exploit AI for their own advantage. But it’s a problem that has been recognised by the UK’s cyber security chief, National Cyber Security Centre CEO Lindy Cameron. “AI developers must predict possible attacks and identify ways to mitigate them. Failure to do so will risk designing vulnerabilities into future AI systems,” she warns.

 

But as time goes on these risks will inevitably increase. Here are some things your charity should be doing to protect itself from the cyber security threats posed by AI systems.

  • Offer anti-phishing training: Phishing is involved in the majority of cyber security attacks. The best way to prevent phishing attacks is to provide regular training that helps staff recognise phishing emails and avoid risky responses, such as clicking links or opening attachments from unexpected senders
  • Don’t give confidential information to external AIs: It’s tempting to use systems like ChatGPT to generate documents for your charity. But anything you give the system could potentially be stolen by cyber criminals
  • Keep software up to date: All of your charity’s software should be updated promptly to ensure that any known security vulnerabilities are fixed. An effective way to ensure that updates are not overlooked is to use patch management software
  • Keep your endpoints secure: To keep malicious software out of your charity’s desktop and laptop computers, make sure that all of them are running endpoint security software which includes anti-virus and anti-ransomware programs
  • Encrypt your data: Keeping your data encrypted makes it far harder for criminals to access your data even if they break into your systems. Most commercial databases have their own encryption systems, while data on PCs can be protected with Microsoft’s BitLocker encryption
  • Use strong passwords and 2FA: Make sure that all of your charity staff use long, complex passwords which are difficult to guess. Most systems, including cloud services, also offer two-factor authentication (2FA) as an additional layer of protection to prevent cyber criminals accessing confidential data
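On the last point, long random passwords are best generated rather than invented. As a sketch, Python’s standard-library secrets module (designed for cryptographic use, unlike the random module) can produce one; the 16-character default below is an illustrative choice, not a recommendation from any particular standard.

```python
import secrets
import string

def generate_password(length: int = 16) -> str:
    """Generate a random password drawn from letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

pw = generate_password()
```

In practice, a reputable password manager does this for you and stores the result securely, which is usually the better option for charity staff.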
