Amid the rise of AI, charities have a critical role in shaping a human-led future
It seems that artificial intelligence (AI) isn’t going away. Beyond the widespread adoption of generative AI built on large language models, which has seen tools like Claude, Copilot and ChatGPT occupy a permanent tab on our desktops, AI usage is seeping into the processes and practices that shape our daily lives.
An algorithmic decision might influence anything from whether we’re shown an advert on social media to whether we’re granted a visa to visit another country. And it’s impacting the daily lives of the communities that charities exist to serve.
To an extent, it’s been possible for charities whose mission relates to health, social care, the environment, or any area of expertise not directly connected with technology, to keep a watchful eye on the development of AI from a distance.
But AI usage is seeping into many of these areas now. Continuing to observe quietly from the sidelines risks charities being left behind and could lead to further exclusion of marginalised groups from a world heavily reliant on AI.
Without data, AI tools are nothing more than ornaments. Forgetting to think responsibly and critically about data in the context of AI is like looking at a car in a showroom and ignoring its need for petrol or electricity.
Charities are data controllers for valuable information on underrepresented groups and have the opportunity to bring those groups and data about them into the process for developing ethical, inclusive AI models and systems. As the World Economic Forum says, “At this phase of the AI life cycle there’s maximum opportunity to incorporate an inclusive approach, as the data serves as the foundation of the model.”
There’s also a role for civil society in data governance: ensuring that the highest quality, most representative data is collected, and that it is protected, safeguarded, and governed with proper guardrails.
User acceptance testing is a vital part of the AI development process and another area where charities could play a critical role. The voluntary, community, and social enterprise sectors have extensive experience in inclusive service design and co-production, which could positively shape AI development and testing. There are already models in the tech sector, such as Microsoft’s inclusive design principles, which the social sector could complement with its own expertise.
Charity involvement with testing or shaping the testing process could also lead to better model governance. Charities can help ensure that AI models are used ethically and are optimised to serve all users, including marginalised groups.
The legal basis for AI regulation in the UK is still a work in progress. Now is the time for charities to make the voices of the groups they represent heard, as white papers, conferences, and consultations become bills and governance bodies.
Research from the Ada Lovelace Institute shows there is “a clear misalignment between public expectations and government ambitions for AI.” One of the report’s recommendations is to create formal channels that allow civil society, particularly organisations representing vulnerable groups, to feed into regulatory processes in a meaningful way.
Organisations in the charity sector that do have a remit around technology are opening the conversation to sector colleagues. For example, DataKind run discussions and offer a range of resources on AI adoption, policies, and ethics.
Thanks, in part, to the Public Law Project speaking out about the use of AI by government departments, the government’s Responsible Technology Adoption Unit has developed an algorithmic transparency recording standard and repository where models that interact directly with the public, or play a significant role in decision-making, must be published.
CAST has established the Charity AI Task Force to help the social sector adopt AI at the same pace as the commercial sector, but also to provide a point of contact for government to “benefit from the invaluable wealth of skills, connections, and community insights that the sector can share.” In 2025, the government promised to reset its relationship with civil society through the launch of the civil society covenant — another point of connection where social sector expertise can influence policy and practice on AI.
And charities are collaborating directly with developers on AI for good. The Children’s Society collaborated with Microsoft to develop an AI tool that supports young migrants and refugees, for instance.
When it comes to including charities and marginalised groups in the future of AI, progress is being made. But there is more to do, and the cost of exclusion could be significant, compounding over time. There are opportunities for the sector and for AI developers to create AI that is inclusive, representative and ethical, governed responsibly, and used to support charitable missions.
Follow-up questions for CAI
What is AI governance?
How can charities conduct user research to find out how their communities are impacted by AI?
How can charities collaborate with each other to make greater change?
How can charities build effective campaigns for government and companies to take action?
How can charities change systems in society?