We dive into what shadowbanning is, who is affected, and what you can do if you think your charity has been shadowbanned.
Shadowbanning is when a user is blocked from a social media site or online forum without their knowledge. This is typically done by making their posts and comments invisible to others.
The experience is like being on “invisible mode”: only the banned user can see their own posts, meaning they may not realise they have been shadowbanned.
Most social media companies deny using shadowbanning, but the phenomenon has been reported by users across platforms such as Facebook, Instagram, TikTok, and Twitter.
Shadowbanning can impact charities and non-profits in a few different ways. If a charity’s account is shadowbanned, its posts will reach far fewer people, limiting its ability to raise awareness, educate, and connect with beneficiaries and donors through social media.
Paid adverts can also be shadowbanned, according to Paul D’Alessandro, founder and chairman of High Impact Nonprofit Advisors. If your charity suddenly notices it is raising less money through social media ads, this may be why.
If you have been shadowbanned, there are a few ways to find out.
Instagram has recently launched a tool that tells you whether your posts are being made less visible to others. To use it, go to Settings, then Account, then select ‘Account Status’.
The feature lets you view the ‘Recommendations Guidelines’ and shows which of your posts have been found to be in violation. You can also select the ‘Disagree with this decision’ button to appeal.
On TikTok, Neil Patel, co-founder of NP Digital, recommends that you look at your pageviews and ‘For you’ page statistics to get an idea of whether your reach has changed. “You can also use a hashtag in a post, then search for that hashtag. If your post shows up under that hashtag, then you aren’t shadowbanned.”
For Facebook, you can similarly use its inbuilt tools to check for a change in engagement in your recent posts for an indication of whether their visibility has been limited.
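If you export per-post metrics from a platform’s analytics tools, the kind of engagement check described above can be done with a short script. This is purely an illustrative sketch: the data layout and the ‘engagement’ field are assumptions, not any platform’s real export format or API.

```python
# Illustrative sketch: compare recent average post engagement against an
# older baseline, using metrics exported from a platform's analytics.
# The row structure and 'engagement' field are assumptions for this example.
from statistics import mean

def engagement_drop(rows, recent=10):
    """Return the ratio of recent average engagement to the older baseline.

    `rows` is a list of dicts with an 'engagement' value (e.g. likes +
    comments + shares per post), ordered oldest to newest. Returns None
    if there is not enough history to compare.
    """
    values = [float(r["engagement"]) for r in rows]
    if len(values) <= recent:
        return None
    baseline = mean(values[:-recent])
    latest = mean(values[-recent:])
    return latest / baseline if baseline else None

# Made-up numbers: engagement roughly halves in the five most recent posts.
history = [{"engagement": e} for e in [200, 220, 190, 210, 205,
                                       100, 95, 110, 90, 105]]
ratio = engagement_drop(history, recent=5)
```

A ratio well below 1 that persists across many posts may be worth investigating further, though a dip on its own proves nothing: seasonality, posting frequency, and content changes can all move engagement without any shadowban.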
Instagram and Facebook each have guidelines for what type of content is likely to be restricted from being recommended, which you can check out.
Content that breaks a platform’s community guidelines or recommendation guidelines may result in a ban of some kind. But it is unclear whether social media companies also impose shadowbans on accounts for other reasons, as they do not publicly disclose all of the processes they use to maintain and moderate their platforms.
Adam Mosseri, Head of Instagram, told the BBC, “Sometimes your account can end up in a state where it’s not eligible for your photos and videos to show up in [our] recommendations.”

“If you have posted things that violate our ‘recommendability’ guidelines or recommendation guidelines...you can end up in a state where your content won’t be recommended.”
Patel suggests that on Instagram, less “family-friendly” content could put you at risk of a shadowban.
On TikTok, he suggests that illegal material, violence, hate speech, and spam are all reasons why your account might be shadowbanned.
On Facebook, sharing misleading information and clickbait is a big reason for a shadowban to be imposed, whereas on LinkedIn, spamming, disrespect for others’ privacy and intellectual property, and harassment are among the reasons for shadowbanning.
Forbes also explains that profitability shapes how these companies manage people’s accounts: “Users who do not generate engagement or profit are not valued and become less visible.”
As we will explore below, marginalised communities also report being disproportionately subject to shadowbanning, which is thought to be a result of bias in social media companies and their algorithms.
Charities and not-for-profits are unlikely to fall foul of many of these observed rules. But D’Alessandro suggests that organisations that focus on “controversial” issues should plan for the possibility of being shadowbanned.
Instagram distinguishes between those accounts perpetuating certain online harms and those which are trying to address them (i.e. “accounts focused on providing support, raising awareness, and recovery”). Charities that use Instagram to tackle certain harmful subjects in this way will therefore likely still be recommended to other users as usual.
But across social media, there is still a possibility that your organisation could fall victim to a shadowban. A community report about the impacts of shadowbanning on sex workers and activists explains: “Content moderation practices reflect the biases of the creators and the platforms and algorithms, as well as the biases of the content moderators themselves.”
Forbes expands: “[Artificial intelligence] learns and changes from the algorithms its creators develop. The unconscious biases of the developers are embedded in the systems they create.”
So charitable organisations may be subject to shadowbanning online depending on the biases involved in the systems and processes of different social media companies.
Shadowbanning has positive uses, such as suppressing disinformation, clickbait, and spam. In addition, the Instagram and Facebook recommendation guidelines outline how the platforms work to reduce other significant online harms by recommending certain content and accounts less.
This is necessary for a safe and healthy online world.
But shadowbanning has also had negative impacts on marginalised communities, whose accounts are often targeted regardless of obeying a platform’s rules and guidelines.
Those commonly affected are LGBT+ people, BAME communities, sex workers, and plus-size people.
The community report ‘Posting into the Void’ finds that shadowbanning has been used to both repress the voices of sex workers and the voices of the Black Lives Matter movement.
The report notes that content moderation processes can interfere with the ability for people to make a living. Forbes also notes that it can impact growth and income for business owners and artists.
Importantly, the report describes how the processes have disrupted social movements and have isolated communities, among other impacts.
The #IWantToSeeNyome movement stood up against the suppression of Black plus-size model and influencer Nyome Nicholas-Williams’ Instagram posts. Talking about her shadowban, Nicholas-Williams said, “I feel like I’m being silenced.”
Mental health specialist Dr Neeta Bhushan says that the stress of trying to navigate a platform’s “guidelines” by suppressing self-expression is dangerous, stating, “Being told over and over again that you are ‘inappropriate’ or unwanted, simply for being yourself is both exhausting and traumatising for users.”
Dr Carolina Are, a content moderation researcher at the Centre for Digital Citizens at Northumbria University, told the BBC that the problem is exacerbated by a lack of real people involved in the moderation process.
Are, who is also a content creator and often posts pole dancing videos on Instagram as an instructor, says she has experienced shadowbans herself.
Talking about the new transparency of the Instagram recommendation settings, she says, “I still think it’s a bit of a cosmetic and performative change…As good as this is, without investing in human moderation, we are just going nowhere.”
If you are shadowbanned, you may be able to appeal the decision or simply wait for the ban to end. If you feel the nature of your charity’s online content is unfairly resulting in shadowbans, you can consider moving your online activity onto a different platform or you can campaign to help tackle biases in online algorithms.
Our courses aim, in just three hours, to enhance soft skills and hard skills, boost your knowledge of finance and artificial intelligence, and supercharge your digital capabilities. Check out some of the incredible options by clicking here.