We look at how the rise of AI is driven by convenience and how, with greater awareness, charities can make more considered decisions about their use of AI
It’s hard to resist the lure of convenience. How often do we order something online to be delivered the next day, when we could easily wait a few days to buy it on the high street? Convenience can often give us instant gratification, enabling us to get what we want with the least effort.
Artificial intelligence (AI) feeds off this psychology of convenience, promising quick results for minimal effort. In our daily lives, that can mean anything from asking a chatbot how to return an item to organising a trip or deciding what to cook for dinner.
That ease continues into our working lives, with AI allowing us to save time and effort on everyday tasks, such as analysing marketing data or creating notes from a meeting. Interim findings from the ‘Charity Digital Skills Report 2026’ show that 88% of charities are using AI daily. That’s up from 76% last year.
AI brings a whole host of use cases for charities: from creating content and supporting fundraising, through to running administrative and project management tasks. The underlying reason for this rapid uptake is convenience, which in turn saves charities precious money and time. This article explores the costs of AI convenience and how to take action.
Convenience doesn’t always give the best results and can come at a cost. When it comes to AI, a huge cost is the environmental impact of running the data centres needed to power AI. This is an issue for an increasing number of charities, with those concerned about energy use and environmental impact rising to 39%, from 26% last year. There are also significant ethical risks around data security, data bias, discrimination, copyright, and misinformation.
Alongside this, research suggests that dependency on AI may have a detrimental effect on our ability to think critically and work things out for ourselves. One study scanned students’ brains while they wrote essays: those who used ChatGPT showed the lowest brain engagement, compared with students who wrote unaided or used only Google search. Over successive sessions, the students who used ChatGPT also put less and less effort into their writing.
So with the drawbacks of using AI ranging from environmental impact to dulling our human critical thinking, it’s important for charities to be discerning about using AI. To use AI ethically and responsibly, charities need to make conscious and informed decisions about when and how they use it.
Before you launch into a task using AI, take a pause and question whether using it, in this specific situation, is the most responsible and appropriate option. There might be some situations where AI is the most efficient, making the most of your carefully fundraised money.
For example, if you’ve interviewed service users to create case studies, it could be a better use of your time to anonymise the interview as appropriate and use AI to transcribe it. But before you make that call, consider your options.
Search anything using Google and you’ll automatically be presented with an AI-generated overview, right at the top of your search results. It can take a conscious effort to scroll down and ignore them.
These summaries are designed to give you quick answers along with direct links to sources. While they make answers remarkably easy to get, they come with drawbacks. Sometimes they’re incomplete, out of date, or simply incorrect. And because they hand users the information directly, fewer people choose to visit the original websites – which, for charities, can mean less engagement.
If you’re researching information, use reputable sites and resources rather than automatically adopting the AI Overview. You’ll only have to double-check its findings anyway.
If you need user research – for example, to develop a new service or put together a grant application – you could get answers from AI, instantaneously and with minimal effort. But you can’t rely on the accuracy of those results, which could be biased, skewed, or completely untrue.
Beyond that, simply asking AI for input misses out the direct experience and opinions of your users. Nothing can replace the quality and depth of speaking human to human, and AI cannot understand the nuances of empathy. This is especially important if you’re asking for input on personal or sensitive topics. If that’s the case, then in-depth user interviews between humans are the best option.