How charities can look after the mental health and wellbeing of the staff and volunteers who moderate their content and social media.
The impact of social media content moderation on mental health is a growing concern. In recent years, social media platforms have come under scrutiny for their moderation practices.
Documentaries like The Cleaners have examined the day-to-day work of content moderators for major platforms like Facebook. Moderators are tasked with reviewing content that could be violent or pornographic against the platform’s community guidelines, choosing whether to keep or delete it. Repeated exposure to this kind of content takes a toll on their mental health.
The directors of The Cleaners, Hans Block and Moritz Riesewieck, said: “The symptoms displayed by many of the content moderators are similar to those experienced by soldiers returning from combat.”
Content moderators in the charity sector have a far less extreme task. The day-to-day experience of managing a charity’s community might include thanking fundraisers or answering FAQs about an event. However, charities exist to solve difficult societal problems, so charity content moderators can find themselves dealing with distressing content.
In the CharityComms Wellbeing Guide for Comms Professionals, Kirsty Marrins writes: “most organisations don’t recognise that those who manage their social media are on the frontline and are often the first point of contact for someone, whether they’re simply asking for help or support or whether they’re trolling the organisation.”
In addition to dealing with trolls, charity social media managers interact daily with people who have lived experience of the issue the charity works on. This can mean hearing regularly about discrimination, physical and sexual violence, death and bereavement, or experiences of poverty and homelessness, for example.
Social media teams can also find themselves at the epicentre of media crises. In the last couple of years, charities like the RNLI and the National Trust have been caught up in the so-called ‘culture wars’. Their actions – rescuing migrants in distress at sea and examining historical links to colonialism and slavery – have been heavily criticised, resulting in lots of negative social media comments.
Negative or distressing comments are directed at the charity brand, but it can be difficult for social media staff and volunteers not to feel personally targeted. In an interview for the Content Marketing Institute, Ella Dawson, Social Media Manager for global non-profit TED, says: “It’s very difficult to take your own personal identity out of it. There are moments where it does feel like you’re the one being insulted or attacked.”
Charities have a responsibility to provide a psychologically safe environment for staff and volunteers who work on their social media platforms. Here are six ways your charity can help safeguard the mental health of your content moderators.
1. Run through your planned social content and alert your moderators to any upcoming content that might be distressing or likely to generate negative comments. This will help them decide when they might need to step away from moderating content that is personally triggering.

2. Make it clear that moderators can step away from moderating content they find distressing, especially during times of crisis when there may be a large volume of negative comments to deal with. That means training more than one person to monitor your platforms, so moderators can take mental health breaks.

3. Encourage content moderators to use any support lines or counselling services included in your staff benefits package. A regular reminder that these services are available will help staff feel comfortable using them.

4. Encourage content moderators to turn off notifications outside working hours, particularly staff or volunteers who have the charity’s social media accounts on their mobile phones. Some social media scheduling tools, such as Sprout Social or Agorapulse, offer an inbox function that allows moderators to respond to comments and messages from all platforms in one place, without needing the accounts on their phones.

5. Consider setting time aside each week for everyone at your charity who has content moderation in their responsibilities to get together and debrief.

6. Support and make space for efforts to bring together cross-sector content moderator groups. Charity content moderators have used Facebook groups like Third Sector PR and Comms to find others having similar experiences, sometimes in collaboration with organisations working on similar issues and sometimes with organisations caught up in the same crisis.