We explore how charities can make the most of learning about AI to further their cause
In 2025, artificial intelligence (AI) use in charities is largely driven from the grassroots by staff and volunteers. While 61% of charities are using AI, only 11% are taking an organisation-wide approach to adopting it, according to the Charity Digital Skills Report.
This suggests that the sector is excited about AI: charity workers at all levels may have found uses for it that could bring charity missions closer to reality.
On the other hand, this individualised approach leaves charities exposed to risk. Without a joined-up strategy that manages risk and skills development, staff and volunteers could make substantial mistakes, such as breaching data privacy by entering information into AI chatbots that may store, and learn from, whatever is entered.
So how can charities make the most of their team’s AI engagement, and build on their existing knowledge, while mitigating the risks? This article explores how to create a learning culture for AI.
A learning culture involves the ability to learn from mistakes, but with AI it is best to prevent the biggest mistakes by putting clear guardrails in place first. With those boundaries set, the team can explore opportunities with more confidence and less risk.
As a starting point, charity leaders and trustees can check out Zoe Amar Digital’s AI checklist for charity trustees and leaders, The Wildlife Trusts’ AI Risk Assessment (please note, at the time of publishing, this is a prototype/working draft), and AI Fringe Safety Summit resources.
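One concrete example of a guardrail, for charities with a little technical capacity, is a simple check that strips obvious personal data out of text before anyone pastes it into an external chatbot. The sketch below is a minimal, hypothetical Python illustration; the patterns shown cover only email addresses and UK-style phone numbers and are assumptions for the example, not a substitute for a proper data protection policy.

import re

# Minimal illustration: hypothetical patterns for two common kinds of personal data.
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
UK_PHONE_PATTERN = re.compile(r"\b0\d{4}\s?\d{6}\b")

def redact(text: str) -> str:
    """Replace obvious personal data with placeholders before text is shared
    with an external AI chatbot."""
    text = EMAIL_PATTERN.sub("[email removed]", text)
    text = UK_PHONE_PATTERN.sub("[phone removed]", text)
    return text

# Example: a case note a member of staff wants help summarising
note = "Spoke to J. about benefits advice. Contact: j.example@example.org, 07700 900123."
print(redact(note))

Even a basic check like this turns the principle into practice: personal data stays inside the organisation unless a tool has been formally approved for it.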
In the world of AI, many value innovation for its own sake. But operating with squeezed resources and driving determinedly towards a clear vision of change, charities know the importance of staying focused.
That means learning how to approach AI in a way that is appropriate to your charity’s service users. That might include exploring how AI has changed what service users need from your services. For example, young people’s charities may find users now need support to navigate AI in education.
It might also mean understanding service users’ attitudes towards AI tools like chatbots to determine to what extent they might be helpful, and how to design them to be accessible.
A good gauge of whether to use AI for a particular task is whether it will tangibly help service users and bring the charity closer to its mission. Gathering data and working together with users can illuminate this question.
Charity staff and volunteers should use AI out in the open, rather than in secret, to minimise risks and make the most of learnings. Offering AI training could be a good jumping-off point for honest discussion.
Staff and volunteers should feel empowered to experiment safely with AI, where relevant to the charity’s work, and to share what they learn. Holding a regular AI ‘sync’ meeting, running informal ‘brunch and learn’ sessions, or forming an AI working group could all help build skills over time and keep communication transparent.
Use specific learning questions to cultivate focus when experimenting with AI and sharing learnings, for example: what did the tool actually help with, where did it fall short, and what would the team do differently next time?
To maintain transparency and trust, make sure the appropriate teams are involved in decision making around new strategies and projects involving AI. Discussions should be inclusive, nuanced, respectful of varying hopes and fears, and make good use of data and factual evidence.
Make sure to close the loop when it comes to decision making on AI. This includes explaining what decisions were made, why they were made, and how risks will be managed. Keep an open channel of communication for assessing new AI projects.
Charities’ operating environments are constantly changing, and so is the technology landscape, which makes staying up to date essential. But how can charities balance the demands of service delivery with the need for constant learning?
“You will need to give people time to engage and participate in evaluation and learning activities,” says NPC. “Knowledge and learning will inevitably take a backseat to delivery, so this needs to be carefully considered and tackled head-on.”
Though learning involves an investment of time, learning and service delivery should ultimately complement each other, leading to more effective results for service users and saving time in the long term.
To streamline the learning process, integrate data collection into day-to-day work, making it easy and accessible for service users, frontline staff, and others to share feedback and learnings. You could use questionnaires, a ‘comments’ book, a social media hashtag, or a purpose-made app.
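For charities that want a lightweight, purpose-built option, even a short script or shared spreadsheet can capture feedback in a consistent shape. The sketch below is a minimal, hypothetical Python example that appends structured feedback entries to a CSV file; the file name, column names, and example entry are all assumptions for illustration.

import csv
from datetime import date
from pathlib import Path

# Hypothetical shared log file; a spreadsheet or form tool would work just as well.
FEEDBACK_FILE = Path("ai_feedback_log.csv")
FIELDS = ["date", "role", "tool", "task", "what_worked", "what_did_not"]

def log_feedback(role: str, tool: str, task: str,
                 what_worked: str, what_did_not: str) -> None:
    """Append one structured feedback entry to the shared CSV log."""
    is_new_file = not FEEDBACK_FILE.exists()
    with FEEDBACK_FILE.open("a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if is_new_file:
            writer.writeheader()  # add column headings the first time the file is created
        writer.writerow({
            "date": date.today().isoformat(),
            "role": role,
            "tool": tool,
            "task": task,
            "what_worked": what_worked,
            "what_did_not": what_did_not,
        })

# Example entry from a volunteer trialling an AI drafting assistant
log_feedback(
    role="volunteer",
    tool="drafting assistant",
    task="first draft of a thank-you letter",
    what_worked="saved around 20 minutes",
    what_did_not="tone needed editing to sound like our charity",
)

Keeping every entry in the same structure makes the review step described next much easier, because feedback can be filtered by tool or task rather than sifted out of free-text notes.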
From there, review the data and make the changes needed. Building review and decision-making processes into daily operations helps the team learn continually from past successes and mistakes and avoid repeating the same stumbling blocks.
Our courses aim, in just three hours, to enhance soft skills and hard skills, boost your knowledge of finance and artificial intelligence, and supercharge your digital capabilities. Check out some of the incredible options on offer.