“Algorithmic bias is a phenomenon that occurs when an algorithm produces results that are systemically prejudiced due to erroneous assumptions in the machine learning process” (Mary K. Pratt)
In the early days of Facebook, Instagram, and Twitter, most of the content you saw came from friends or other people you followed. These days, you see content an algorithm thinks you’ll like, and less from the people you actually follow. You also see more advertisements from brands, in your Instagram Stories and in your feed. In 2016, Instagram changed its algorithm to no longer show posts chronologically. Since then, we see what they want us to see. Reels are pushed in our faces (even as Meta has now announced the end of Reels bonus plays, which was one way creators made money). Ads are pushed in our faces. We barely see the people we want to see.
The current algorithms across all social media platforms are biased. How can an algorithm be biased? In her book, Algorithms of Oppression: How Search Engines Reinforce Racism, Dr. Safiya Noble writes:
Part of the challenge of understanding algorithmic oppression is to understand that mathematical formulations to drive automated decisions are made by human beings… The people who make these decisions hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy, which is well documented in studies of Silicon Valley and other tech corridors… we are supposed to believe that these same employees are developing ‘neutral’ or ‘objective’ decision-making tools.
Though Dr. Noble’s research is specific to search engines, I believe this framework can also be applied to social media platforms and their algorithms.
When we hear talk of “the algorithm,” it’s often in a way that is distanced, sterile, and ghost-like. The way we discuss it only benefits social media platforms. They want us to think it’s “computer error” if anything should go awry. They want us to think it’s not on them. The problem though, like Dr. Noble said, is that real humans are behind these algorithms; real humans with bigoted views get to decide what is seen by a majority of users.
In 2020, Vox reported on a study that found that “by teaching an artificial intelligence to crawl through the internet—and just reading what humans have already written—the system would produce prejudices against black people and women.”
We saw this during the height of #BlackLivesMatter. Many posts and tweets were pushed down. Some weren’t even shown in people’s feeds. TikTok actually apologized to Black users for what they called a “glitch” during this time.
Dr. Noble writes:
Many people say to me, ‘But tech companies don’t mean to be racist; that’s not their intent.’ Intent is not particularly important. Outcomes and results are important.
We’ve also seen how the biased and unfair algorithm has hurt sex workers. In 2020, Instagram announced updated censorship rules. According to Dazed, these included bans on explicit sexual solicitation, sexually suggestive emojis, and sexually explicit language. The article quotes London-based sex worker Rebecca Crow: “[Instagram’s censorship] leaves already precarious sex workers without any platform for online content promotion, which is the safest way to work during the global pandemic.” All social media platforms have only ever been hateful to sex workers, so this continued censorship is not surprising.
There is also the issue of shadowbanning, in which a user’s posts are consistently not shown in their followers’ feeds, resulting in little to no engagement. Instagram’s CEO has consistently said that shadowbanning “is not a thing.” Perhaps it’s the terminology that is “not a thing,” and what we should actually call it is “algorithmic bias.” That is, indeed, a thing.
During my time on these platforms and doing my own “content creation” (barf, that term gives me the icks, but you know what I mean), I know firsthand how frustrating it is to spend your time, energy, and love on a post just for it to have zero engagement. I often wonder if my current following of 247K is even correct, because I will get maybe a hundred likes on some posts. Even as someone who struggles with dyscalculia, I know that the math ain’t math-ing. Do I have bots following me? What’s even happening?
I don’t often post anything that I consider “inappropriate,” but part of the problem with algorithms is that they will flag what they deem “inappropriate.” This is how certain hashtags get hidden as well. For example, Instagram has been known to hide hashtags like “#sexualhealth” but not racial slurs. The humans who created the algorithm clearly need to do better.
There is a continual lack of reciprocity I have felt in having my large Instagram account, especially as I see little to no engagement on each post. This is why I have mostly stopped creating information image carousels for the ‘gram. We can all joke about the Canva-ication of social justice posts, but that shit is labor. Do you know how long it takes (if you’re doing it the “librarian” way) to research, vet the information, double-check it, write it in a way that is easily digestible (accessible language), and also make it visually appealing? Until you’ve done it, stop making fun of it. I see the work that so many accounts put in just for their posts to barely get seen. It’s a slap in the face to those of us who have been on these platforms for over a decade. I don’t blame my following (although I do wonder sometimes why people don’t double-tap that damn heart button), I primarily blame Instagram.
The algorithm (and the people behind it) are continuing to harm our social justice movements. Information is either scarce (and not being shown) or it’s incorrect. Transphobia, whorephobia, racism, ableism, and sexism run rampant and appear louder than anything else. In the current dangerous political climate, social media algorithms are only going to make things worse for real people. These platforms have to change and we have to stop relying on them as our primary online spaces.
Stay tuned for the last installment of this series where I’ll discuss how social media platforms freeze activism movements.
🎉 Things of the Week
Am I too hopeful in thinking this may stop parents from exploiting their children for views on social media?
Ada Limón’s interview on On Being
Conservatives Are Trying to Ban Books in Your Town. Librarians Are Fighting Back. - Melissa Gira Grant
This beautiful piece on volcanoes & Sant'Agata
✨Reminders✨
The Guerrilla Feminist is reader-supported! Thank you to the folks who pay monthly to support my work. If you want to be added to a paid plan, go here.
Want consulting on Digital Violence Prevention Programming? Go here.
Wanna add to The Guerrilla Feminist Spotify Playlist? Go here.
Forward this email to a friend you think would enjoy it or take a screenshot and share it on social media!