We explore everything you need to know about Microsoft’s artificial intelligence assistant, Copilot, including how it can help charities with tasks ranging from fundraising to finance
Artificial Intelligence (AI) is set to transform the way we work. In the charity sector, AI is expected to revolutionise operations, shape how charities reach out to donors, identify the best times to send out emails, discover where services are most needed, and much more.
It is important to note that AI is a broad field, encompassing lots of different technologies and tools, each with different capabilities. There are even AI aggregators that pull together different AI tools and explain what they are used for – ChatGPT can generate text and images, for example, while Vizly can analyse data to make predictions.
AI assistants are one of the most commonly talked about iterations of AI, as well as the most commonly used. Examples of AI assistants include Siri, Amazon’s Alexa, and, more recently, Microsoft Copilot. Each assistant allows the user to streamline processes, providing quick solutions to tedious and complex tasks alike. With AI assistants, charities can quickly find answers to their most pressing questions, arrange meetings, summarise emails, create drafts, and much more.
In 2024, it feels like we’re on the cusp of mainstream AI adoption. With the promise of enhancing productivity and reducing the time spent on administrative tasks, AI is an opportunity charities can’t afford to ignore when balancing limited time and resources with rising demand for services.
The 2023 Charity Digital Skills report showed that more than half (53%) of charities are either using or planning to use AI in the future, while 78% of charities believe AI is relevant to their charity and has the potential to transform it.
Similarly, a report from IT services provider Wanstor found that nearly 90% of respondents to its AI survey believe that the use of digital assistants such as Copilot will positively impact employee productivity and efficiency. Less than 2% expect the technology to have a negative impact.
According to Microsoft, seven in ten people using Copilot said they were more productive, while people using it to catch up on missed meetings were able to do so four times faster. Charities are unlikely to want to miss out on such benefits.
Wanstor’s research found that more than nine in ten charities were considering investing in Copilot specifically, with almost 90% agreeing it will have a positive impact on employee productivity and efficiency. Seven in ten non-profits say they are likely to implement Copilot across management, team leadership, and administrative functions.
Microsoft Copilot is particularly helpful to charities in that it is built into a lot of the more familiar Microsoft products we already use. Charities using Microsoft 365 can use Copilot in conjunction with other Microsoft 365 products, such as Excel, Word, and OneNote. Likewise, charities using Microsoft’s Bing search engine or its Edge browser can use Copilot to assist them, using it to helpfully summarise search results.
In this article, we explore more about how each version of Copilot works, the AI technology that underpins the service, and how charities can use it to their advantage in future.
Microsoft Copilot launched in 2023 and is designed to assist users across a variety of Microsoft products, including Microsoft 365, Bing, and Edge, with the aim of simplifying tasks and enhancing productivity.
Microsoft Copilot, in essence, works by combining the natural language processing technology behind ChatGPT with the data from your own Microsoft usage – including your calendar, emails, Teams chats, meetings, and more. By connecting to existing Microsoft products, it can offer more bespoke benefits, with more insight into your work and what you need.
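For readers who want a more concrete picture, below is a minimal, purely illustrative Python sketch of this “grounding” idea. The helper functions and sample data are hypothetical – this is not how Copilot is built internally – but it shows how a user’s question can be combined with their own workplace context before being sent to a language model.

```python
# Purely illustrative sketch of "grounded" prompting.
# The data sources and helper functions below are hypothetical stand-ins,
# not real Microsoft APIs.

def fetch_calendar_events() -> list[str]:
    # In a real assistant this would come from the user's calendar.
    return ["10:00 Trustee meeting", "14:00 Fundraising review"]

def fetch_recent_emails() -> list[str]:
    # In a real assistant this would come from the user's mailbox.
    return ["Grant application deadline moved to Friday"]

def build_grounded_prompt(user_question: str) -> str:
    """Combine the user's question with their own workplace context."""
    context = fetch_calendar_events() + fetch_recent_emails()
    context_block = "\n".join(f"- {item}" for item in context)
    return (
        "Answer using the context below where relevant.\n"
        f"Context:\n{context_block}\n\n"
        f"Question: {user_question}"
    )

if __name__ == "__main__":
    # The assembled prompt is what would be sent to a large language model.
    print(build_grounded_prompt("What should I prepare for today?"))
```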
Since 2023, various iterations of Copilot have been launched to achieve different goals. For example, when used with Microsoft Teams, Copilot can recap meetings, summarise action points, and even translate languages. With Copilot for 365, the assistant works inside Word, PowerPoint, and Outlook, generating content for emails or presentations.
The most widely recognised version of Copilot is probably Copilot for Bing, which used to be known as Bing Chat and works alongside Microsoft’s search engine Bing. However, all of Microsoft’s AI assistant tools are now called Copilot, with the name of the app they work with usually included in the title – e.g. Copilot for Word or Copilot for Windows.
Copilot for Bing and Edge is free to users with a free Microsoft account, as is Copilot for Windows, which is available in preview, meaning it is still being tested. Pricing for other applications varies – Copilot for 365 costs around $30 per user per month, and you need certain licenses to be able to add it on (which we’ll explain more about later).
There’s also Copilot Pro, which works like Copilot for Bing and Edge but, for $20 a month, also unlocks the Copilot assistant inside Word, PowerPoint, Excel, OneNote, and Outlook without a Copilot for 365 license, and lets you create more images too.
The main difference between Copilot Pro and Copilot for 365 is that Copilot Pro is aimed more at individual users of Microsoft, whereas Copilot for 365 is aimed at enhancing the productivity of organisations as a whole. Copilot for 365 is more grounded in an organisation’s work, rather than the web.
To use Copilot for Microsoft 365, you need one of the following licenses:
Microsoft 365 Business Standard
Microsoft 365 Business Premium
Microsoft 365 E3
Microsoft 365 E5
Office 365 E3
Office 365 E5
Charities can access Microsoft 365 licenses at a discount through the Charity Digital Exchange, though there isn’t currently a non-profit discount available for Copilot itself.
Below, we explore more about the different versions of Copilot and how charities can use them.
Microsoft Copilot for 365, as we’ve mentioned before, works with your existing Microsoft apps such as Microsoft Teams, Outlook, Word, Excel, PowerPoint, and more to give users assistance that is grounded in their work and relevant to their needs.
Here’s a brief outline of what it can do with specific apps in 365:
Word: Copilot in Word uses large language models (LLMs), which can process and generate text, to assist with tasks such as drafting new writing, summarising existing writing, and turning text into tables.
Excel: Microsoft Copilot in Excel can generate formula column suggestions, create charts, and generally point out insights you might not otherwise have spotted. Microsoft Support has a handy list of prompts that can help with data analysis within Excel, and the sketch after this list gives a flavour of the kind of analysis you can ask for.
PowerPoint: With Microsoft Copilot in PowerPoint you can create a presentation from an existing Word document, generate summaries, redesign slides with Microsoft Designer, and create speaker notes.
Teams and Outlook: The AI assistant generates email thread summaries to help users catch up on missed messages or find old emails. It can compose and refine emails, identify tasks and assignments in Teams, and suggest people to follow up on specific action items.
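To give a flavour of the kind of analysis Copilot in Excel can surface from a plain-English prompt, here is a short, hypothetical Python (pandas) sketch using made-up donation data. Copilot produces this sort of summary for you inside Excel; the code simply illustrates the type of question you can ask.

```python
# Illustrative only: the sort of summary Copilot in Excel can produce
# from a plain-English prompt, shown here as a pandas equivalent.
import pandas as pd

# Made-up donation data standing in for a charity's spreadsheet.
donations = pd.DataFrame({
    "month": ["Jan", "Jan", "Feb", "Feb", "Mar"],
    "campaign": ["Winter appeal", "Regular giving", "Winter appeal",
                 "Regular giving", "Spring raffle"],
    "amount": [250.0, 40.0, 180.0, 45.0, 320.0],
})

# "Summarise total donations by month" – a typical Copilot-style request.
monthly_totals = donations.groupby("month", sort=False)["amount"].sum()
print(monthly_totals)

# "Which campaign raised the most?"
top_campaign = donations.groupby("campaign")["amount"].sum().idxmax()
print(f"Top campaign: {top_campaign}")
```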
Users also get access to Copilot Studio, through which they can create custom GPTs (generative pre-trained transformers, such as ChatGPT), automate processes, or change the sources Copilot draws on for different results. Copilot Studio is included in Microsoft E3 and E5 licenses, and is available as a separate subscription alongside Copilot for 365 for those on other plans.
Copilot for Bing and Edge is a separate feature and is free to anyone with a Microsoft account. It assists users during web searches and browsing by providing context-aware suggestions and responses.
For example, if you search "What is Charity Digital?", Microsoft’s AI technology will provide you with a short summary, complete with links to its sources.
Indeed, Copilot for Bing and Edge is very helpful in that it can connect to the internet, unlike ChatGPT, giving it the ability to access more up-to-date information than models trained on past data only. It is also very easy to find – look for the Copilot logo.
For example, when highlighting text in certain tabs on Edge – such as when you’re adding text to presentations in Canva – you get the option to “Rewrite with Copilot”. Users can also summon this by pressing Alt + I.
Copilot in Windows is an AI assistant that can help you with a variety of tasks, relating both to your PC’s settings and to generative assistance. It can help with everything from answering questions, as in Bing and Edge, to launching apps, writing text, organising your windows, and creating images.
Replacing Cortana as the default assistant in Windows 11, Copilot appears in the taskbar in preview form, which means it is still being tested. You can also summon it by pressing the Windows key + C.
It’s also worth noting that Copilot in Windows has an option to switch between conversation styles. There are three:
Creative – longer, more detailed responses, which can include jokes, poems, or images
Balanced – informative yet friendly responses; this is the standard conversation style
Precise – concise, straightforward answers that are direct and prioritise clarity
Also, a note on response limits – you might see a counter such as “one of thirty responses” when you interact with Copilot, but this applies to each conversation rather than overall. Each conversation allows up to 30 responses; to start again, simply open a new conversation or refresh.
As with any emerging technology, there are challenges involved with AI too. The speed with which AI technology is developing is a particular challenge for the sector, giving us limited time to develop the digital skills and vision necessary to make the most of AI and, crucially, mitigate its risks.
Therefore, with Copilot as with any generative AI tool, it is worth proceeding with caution and having clear usage policies in place before organisation-wide adoption. It’s important to protect the charity’s data and finances, as well as its reputation.
Only 13% of respondents told Charities Aid Foundation they wouldn’t pay much or any attention to what a charity they supported said about how it uses AI. Around a quarter said they would pay a great deal of attention.
So let’s have a quick look at some of the main risks around using generative AI like Copilot and how charities can mitigate them.
AI hallucinations are, essentially, false or inaccurate outputs given in response to a prompt. The BBC points to a good example of an AI hallucination: a US law firm used generative AI for legal research, leading to fictitious cases being cited in court.
Hallucinations happen because, although AI is able to work out how words and letters work together, it never understands the meaning of what it is saying. So while AI understands grammar and word associations, it does not understand concepts.
As Charity Digital’s Head of Content Ioan Marc Jones points out: “[AI] is pattern matching, in essence, and if incorrect information matches the pattern, the AI may well produce that information. And, importantly, since the AI is often verbose and confident, it will relay that information in a way that is easy to believe.”
Using AI to search for answers or to generate content can make it easier for organisations to spread misinformation without even knowing. Charities have a uniquely trusted position in society and are experts in their cause; reproducing or sharing incorrect information without fact-checking can lead to a decline in trust between supporters, beneficiaries, and the organisation itself.
Similarly, as well as posing a reputational threat, incorrect information from AI can lead to incorrect decisions being made.
What to do about it:
Always involve humans. While AI can save us valuable time, having a human who is familiar with your charity and its cause look over information generated by AI can help spot any issues within it, reducing the potential to cause harm. Fact-checking is vital and there are plenty of tools to help with that, including Google Fact Check Explorer, ClaimBuster, and Snopes.
Always check AI’s sources. Copilot footnotes much of the information it generates, but it is worth checking that those links are reputable and making sure you credit those sources too, both to back up your information and to prevent accusations of plagiarism.
For charities to be sure they are delivering the utmost impact for their communities, they must first be confident that their data is correct. This means eliminating data bias – flawed data that perhaps favours certain demographics or groups, leading to an incomplete picture of the challenges charities are trying to address. Data bias, like hallucinations, can lead to false results or poor decisions being made without access to more accurate information.
AI is a data machine. Its outputs are only as good as the data that goes into it, and we’ve already seen high-profile cases of biased data leading to biased AI, including Bloomberg’s study of AI-generated images, which found clear racial and gender disparities – high-paying jobs were dominated by images of men with lighter skin tones, for example.
In short, if data being fed into it is biased, AI can produce biased results. It can extend that bias through machine learning, creating outputs that are flawed and drive little impact for those excluded from the data.
What to do about it:
Make sure your data is ready before you use it with AI. For example, when using Copilot to make predictive suggestions, ensure data is complete and up to date, and identify where you might need more, or less, data to inform the next steps. The sketch after this list gives a simple illustration of what those checks can look like.
Similarly, as we mentioned before, creating an AI policy that defines the general rules of AI usage can be really helpful to ensure data is used responsibly, with privacy and security in mind. Knowing where AI is being applied can help with identifying any potential problems or errors. Remember AI is a tool and it does not understand your charity like you do.
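As a simple illustration of what checking your data is ready can mean in practice, here is a short Python (pandas) sketch using made-up supporter records. The column names are hypothetical; the point is simply to spot gaps, duplicates, and stale records before feeding data into any AI tool.

```python
# Illustrative data-readiness checks on made-up supporter records.
import pandas as pd

supporters = pd.DataFrame({
    "name": ["A. Khan", "B. Jones", "B. Jones", None],
    "postcode": ["M1 1AA", None, None, "LS1 4AB"],
    "last_contacted": pd.to_datetime(
        ["2024-03-01", "2021-06-15", "2021-06-15", "2024-01-20"]
    ),
})

# How complete is each column?
print(supporters.isna().mean().rename("share_missing"))

# Are there duplicate records that could skew any analysis?
print(f"Duplicate rows: {supporters.duplicated().sum()}")

# How stale is the data? Flag records not touched in over a year.
stale = supporters["last_contacted"] < pd.Timestamp.now() - pd.DateOffset(years=1)
print(f"Records older than a year: {stale.sum()}")
```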
Charities should always be cautious when feeding data into any system they use, including AI tools, making sure they are secure against potential cyber breaches.
The risk of a data breach was highlighted as the second biggest concern for the public in terms of charities using AI, according to research from Charities Aid Foundation, meaning data security remains a priority for the sector when adopting new technology.
The key word here is “remains” – Charities Aid Foundation notes that people “do not see the use of AI as making a data breach any more likely than it is already”, and charities have a legal and ethical responsibility to ensure their data is robustly protected against cyber threats whether they are using AI yet or not. The important thing is to proceed with caution and always look at the security of the systems you’re using.
This is part of what makes Copilot for 365 a good choice for charities – it has the weight of Microsoft’s data protection features behind it already and is covered by Microsoft’s existing GDPR compliance.
In Copilot for 365, your Microsoft Graph data isn’t used to train the AI model, for example. The information contained within your prompts, the data they retrieve, and the generated responses remain within the Microsoft 365 service boundary, in keeping with Microsoft’s current privacy, security, and compliance commitments. You can find out more about how your data is processed in Microsoft’s documentation for Copilot for 365.
It’s also worth noting that Copilot now has a dedicated, recently launched security tool, which works with other Microsoft security products such as Defender, Sentinel, and Purview. Charities need an existing Microsoft Azure subscription to add it on.
What to do about it:
Empower your staff and volunteers by training them on the risks of AI, as well as its capabilities. Defining how AI is used and what the risks are means people know what using it responsibly looks like and can keep data privacy and security at the top of their priorities.