We explore how trustees can build their AI skills and govern safe and responsible AI use with OnBoard
Artificial intelligence (AI) is already influencing how charities operate: 76% of charities are using AI, according to the ‘Charity Digital Skills Report 2025’, with most learning on the job about using AI for administration, project management, fundraising, and communications.
Yet many trustee boards feel under-prepared to engage with AI, limiting their ability to govern its use effectively and confidently. In 2025, only 3% of charities said their boards are excellent at AI skills, while 44% said they were poor in this area.
But it doesn’t have to stay that way. This article will explore how trustees can build practical AI literacy – not to become technical experts, but to strengthen governance, oversight, and decision-making in a way that is secure and directly relevant to the board’s work.
Being a trustee is a voluntary role, so trustees often face time pressure when fulfilling their governance duties. They also come from many different backgrounds and may not have prior experience of governing the use of digital technologies.
Charities could once thrive with one “digital trustee” – an expert who oversaw all things digital – but today digital encompasses so much of charity work that every trustee needs to grow their digital understanding in order to help their charities avoid risks and make the most of opportunities.
And it’s no different when it comes to AI. The technology’s applications range from finance to service delivery, from administrative tasks to content creation to fundraising.
The AI boom has left many trustees uncertain about how to proceed, feeling intimidated by the technical aspects of AI, worried about its potential risks, and lacking a shared language to talk about AI with their fellow trustees.
So how can trustees get up to speed with AI to help their charities stay protected, build resilience, and excel in building towards their mission?
Boards need to know where to put their focus. They don’t need to become AI developers – instead, they need the skills to ask the right questions to understand and respond to AI, covering both its risks and its use cases, at board level and across the whole organisation.
Download the OnBoard AI Brochure
Governing AI use in charities involves understanding how to use AI responsibly in practice: hands-on experience helps trustees think constructively about how their charity teams are using it.
Boards can practise using secure AI with OnBoard AI’s suite of tools, which create a controlled environment that respects privacy, governance requirements, and ethical oversight.
OnBoard says: “The platform operates in a closed-loop system: no prompts or data ever leave your board’s unique instance, and nothing is used to train external models.”
You can use the suite to generate structured agendas, summarise and surface context in board materials, and transcribe and draft meeting minutes.
By both practising high ethical standards themselves and embedding them into their charities’ AI use through policy, boards can mitigate potential risks and create clear boundaries for safety.
Strengthening AI oversight is all about keeping a human in the loop at the key moments of AI processes, whether that’s in board meetings or on the ground in fundraising and service delivery. Trustees need to ensure the right processes are in place for charity teams to intervene in AI processes and keep them aligned with the charity’s mission and values.
In board meetings, tools such as OnBoard’s Minutes AI support transparency, giving you the ability to choose how much automation you want – and edit, revise, or delete content as needed to match your board’s tone and recordkeeping preferences.
Trustees can improve AI oversight across their charities by fostering a culture of transparency and embedding thoughtful processes that keep humans in the loop at every stage.
AI use in charities has been largely driven from the grassroots by staff and volunteers, without a strategic approach. Charity trustees and leaders need to build a constructive conversation about how teams are using AI, particularly when it comes to making decisions that affect service users and other stakeholders.
To support strong, human-centred AI decision-making across charities, boards can build time for teams to continuously learn about how their work with AI can better meet service user needs.
In the boardroom, it’s for trustees to make the important decisions. But they can use AI to help them become more effective in that process. OnBoard’s Insights AI anticipates and surfaces the topics that need attention before they emerge, while Book AI highlights language and content that may carry risk, so boards can address it before it becomes an issue.