Artificial intelligence: data protection and privacy

Artificial intelligence offers charities huge benefits. But here is what you need to know before you let it access constituents’ personal or confidential information


The use of artificial intelligence (AI) by organisations of all sizes is skyrocketing as the technology becomes more powerful and more pervasive.

 

Some large corporations have their own bespoke AI systems to assist them, some use AI-powered chatbot software to improve customer service, and some use AI tools to improve the user experience on their websites or to improve the recruitment and retention of their staff.

 

And thanks to Microsoft’s drive to incorporate AI into Office products including Word and Outlook, almost every organisation – including charities – will be using the technology in the near future.

 

This has important implications for your charity, and here’s why. All charities rely to a great extent on trust, and if service users and donors lose trust in your charity, its ability to raise funds and carry out its aims will be severely compromised.

 

And here’s the rub: 60% of people are worried about how organisations are using AI with private information, and 65% say that their trust in some organisations has diminished because of their use of AI. That’s according to a recent survey carried out by technology giant Cisco.

 

These are extraordinarily high numbers. They suggest that, despite the potentially large benefits your charity could reap from AI, you should implement it with care and due consideration, so that you don’t end up losing the trust of the majority of your constituents.

 

 

What are the issues?

 

AI needs data in order to operate, and for charities it is highly likely that at least some of that data will be personal or private. There are two key risks here: one is that confidential data could leak – perhaps via a cyber-attack – and the other is that the AI may be able to use the data to infer things that are essentially private and none of the charity’s business.

 

Unfortunately, that’s not the end of it. There are also regulatory issues, particularly when it comes to privacy laws such as the UK and EU General Data Protection Regulation (GDPR). For example, individuals have the right to know what personal data an organisation holds about them, and they have the right to request that that data is deleted. But once personal data has been incorporated into an AI system – for example, used to train a model – it may not be possible to delete it completely, and even after deletion it may continue to influence the results the AI produces.

 

GDPR also gives individuals the right, in some circumstances, to have a human review a decision made by an AI. Although this sounds straightforward, it is not always so in practice. That’s because AI works in a different way to human decision-making, so it simply may not be possible to review the decision-making process to look for errors or false assumptions. When an AI makes a decision, the underlying process may be completely opaque.

 

 

What should your charity be doing?

 

In order to protect your constituents’ privacy and to avoid falling foul of privacy laws such as GDPR, your charity needs a strategy. Here are four key points to help you formulate that strategy.

 

 

Have a plan for AI 

 

Don’t let AI into your charity and then watch its usage grow into areas that involve private information. Instead, make clear decisions about how AI should be used in your charity. When you have decided this, determine exactly what personal data would need to be used, whether it is really necessary to use it, and whether it can be deleted once it has been used.
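
To make this concrete, the sketch below shows one way a charity might keep a simple register of its planned AI use cases, recording what personal data each one needs, whether that data is genuinely necessary, and when it will be deleted. The structure, field names and example entries are purely illustrative assumptions, not a prescribed format.

```python
# A rough sketch of an AI data-use register; all names and entries are invented examples.
from dataclasses import dataclass


@dataclass
class AIUseCase:
    """One planned use of AI, recorded before any personal data is shared with it."""
    name: str
    personal_data: list   # categories of personal data the tool will see
    necessary: bool       # is personal data genuinely needed for this task?
    retention: str        # when the data will be deleted after use


register = [
    AIUseCase(
        name="Chatbot answering service-user FAQs",
        personal_data=[],  # redesigned so the chatbot needs no personal data at all
        necessary=False,
        retention="n/a",
    ),
    AIUseCase(
        name="Drafting personalised donor thank-you emails",
        personal_data=["name", "donation history"],
        necessary=True,
        retention="delete inputs and drafts within 30 days",
    ),
]

# Print a one-line summary of each use case and the personal data it relies on.
for use_case in register:
    data_used = ", ".join(use_case.personal_data) or "no personal data"
    print(f"{use_case.name}: {data_used}")
```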

 

 

Foster trust

 

If your constituents are worried about your use of AI, trust in your charity will likely be damaged. So make sure you can explain how AI will be using their personal information, and how this may affect them. Where possible, you should also be in a position to explain specific decisions or predictions made by your AI software. This transparency and “explainability” should go a long way towards maintaining your constituents’ trust.
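
As a simple illustration of what “explainability” can look like in practice, the sketch below uses an interpretable model (scikit-learn’s logistic regression) so that an individual prediction can be broken down into per-feature contributions. The scenario, feature names and data are all invented for the example; a more complex model would make this kind of breakdown much harder to produce, which is exactly why opacity is a concern.

```python
# A minimal explainability sketch using an interpretable model.
# The scenario, feature names and data are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

feature_names = ["previous_donations", "months_since_contact", "newsletter_opens"]

# Tiny synthetic history: did a supporter respond to a fundraising appeal? (1 = yes)
X = np.array([[5, 2, 10], [0, 18, 1], [3, 6, 4], [1, 24, 0], [8, 1, 12], [0, 30, 2]])
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Explain one prediction by showing each feature's contribution (coefficient * value).
person = np.array([[2, 12, 3]])
contributions = model.coef_[0] * person[0]
for name, value, contribution in zip(feature_names, person[0], contributions):
    print(f"{name} = {value}: contribution {contribution:+.2f}")

print("Predicted probability of responding:", round(model.predict_proba(person)[0, 1], 2))
```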

 

 

Anonymise where possible

 

Does your AI require data that is linked to specific people, or can the data be anonymised? If the latter, there are a number of techniques – including aggregation and removing personally identifiable components – that let you benefit from the data without risking individuals’ privacy.
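
As a rough illustration of those two techniques, the sketch below removes directly identifying columns and then aggregates donations by postcode area, so that no remaining row describes an individual donor. The column names and data are invented, and pandas is assumed simply as a convenient tool.

```python
# A rough anonymisation sketch; the column names and data are invented examples.
import pandas as pd

donations = pd.DataFrame({
    "name": ["A. Ahmed", "B. Brown", "C. Clark", "D. Davies"],
    "email": ["a@example.org", "b@example.org", "c@example.org", "d@example.org"],
    "postcode": ["SW1A 1AA", "M1 1AE", "SW1A 2BB", "M2 3CD"],
    "amount": [10.0, 25.0, 5.0, 40.0],
})

# Remove directly identifying components before the data goes anywhere near an AI tool.
stripped = donations.drop(columns=["name", "email"])

# Aggregate: coarsen full postcodes to postcode areas and report totals per area,
# so that no row in the result describes an individual donor.
stripped["postcode_area"] = stripped["postcode"].str.split().str[0]
summary = stripped.groupby("postcode_area")["amount"].agg(["count", "sum"])
print(summary)
```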

 

If you do need to use personal data linked to specific people, then collect as little as necessary, ensure you have permission to collect it, allow as few people or AI systems as possible to use it, and delete it as soon as you no longer need it.

 

 

Test for bias

 

AI systems can be subject to unintended bias because of the composition of the data they are given. For example, LinkedIn discovered that its AI system was biased towards recommending men rather than women for open roles, and this was traced back in part to a factor that has nothing to do with suitability: men are more likely than women to respond to recruiters. So it is important to look closely at the decisions or recommendations that an AI provides to your charity to help ensure that they are fair, explainable, and free of unintended bias. If something looks odd, investigate.
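
One simple place to start is to compare an AI tool’s recommendation rates across groups and flag large gaps for human review. The sketch below does this with invented data; the group labels, column names and the 0.2 threshold are assumptions for illustration, not a complete fairness audit.

```python
# A minimal sketch of checking an AI tool's recommendations for unintended bias.
# The recommendation data, group labels and threshold are invented for illustration.
import pandas as pd

results = pd.DataFrame({
    "gender": ["f", "m", "f", "m", "m", "f", "m", "f", "m", "m"],
    "recommended": [0, 1, 1, 1, 0, 0, 1, 0, 1, 1],
})

# Compare recommendation rates across groups (a simple demographic-parity check).
rates = results.groupby("gender")["recommended"].mean()
print(rates)

# Flag a large gap for human review; the 0.2 threshold is an arbitrary example value.
if rates.max() - rates.min() > 0.2:
    print("Recommendation rates differ noticeably between groups - investigate.")
```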

