Rage & Softness is reader-supported! Thank you to the folks who pay monthly to support my work. If you want to upgrade to paid, go here.
Add your favorite song to the newsletter Playlist!
Please “heart” or share my writing with your community and comrades! Forward this email to a friend, put it in your own newsletter, screenshot and share it on social media.
If you would like to pay for a subscription, but don’t want to do so through Substack, you can pay via my Venmo or PayPal. Also, if you’d like a paid subscription, but can’t afford it, please email me and I’ll add you—no questions asked.
*This is an excerpt from my chapter, “Get Off The Internet,” from my upcoming book, The Guerrilla Feminist: A Search For Belonging Online & Offline, coming out in March, with a few newer things added*
Social media is on life support. It’s far from resuscitation. Each platform has changed for the worse. Each algorithm is less forgiving than the next and cares little about what its users actually want. Meta has been dying a slow death for at least a decade, but with their recent bending of the knee to the Right, it’s eroding even faster.
Facebook is where Boomers, Gen X, and elder Millennials are put out to pasture. Most Millennials who have a profile don’t actually use it anymore; our profiles exist as ghosts of what once was. Twitter began a rapid descent the second Elon Musk bought it. Instagram, owned by Meta, began its own downfall because, let’s be honest, it’s no TikTok. Instagram has tried to compete with the video creation giant with its “Reels,” but Gen Z is not having it (good for them!). The primary downside of Instagram for Gen Z (and everyone) is its lack of reach and the difficulty of getting a post or Reel to go viral. TikTok’s algorithm identifies a user’s preferences and shows videos within those preferences, whereas Instagram pushes content at users that they might not like or be interested in. Also, many Gen Z users seem to feel that Instagram is more buttoned-up and “serious” than TikTok. There is more pressure on Instagram to show a pristine and vibrant life. On TikTok, anything goes and everything has the potential to go viral.
We have already seen an influx of mis- and disinformation on social media apps, but this is about to get much worse. We may think we’re digitally literate because we can navigate our way around the internet and “Google” things, but are we really?
The terms “disinformation” and “misinformation” are frequently used interchangeably online, but they are, in fact, different. Dr. Nicole A. Cooke, an assistant professor at the School of Information Sciences at the University of Illinois Urbana-Champaign, writes:
“Misinformation is simply information that is incomplete, but it can also be defined as information that is uncertain, vague, or ambiguous.” Misinformation is often misleading, whereas disinformation is “false information which is intended to mislead, especially propaganda issued by a government organization to a rival power or the media.”
Unlike misinformation, disinformation is always incorrect. Both are troubling, problematic, and rampant online. We have seen this play out in various instances, most recently (and most egregiously) throughout the Covid-19 pandemic. In an interview, Tara Kirk Sell, PhD, a senior scholar at the Johns Hopkins Center for Health Security, says there are four types of false information that come about during major health crises. These include: “1) mischaracterization of the disease or protective measures that are needed; 2) false treatments or medical interventions; 3) scapegoating of groups of people; and 4) conspiracy theories.”
Back in February of 2020, when the pandemic was just starting here in the United States, Surgeon General Dr. Jerome Adams tweeted:
Seriously people—STOP BUYING MASKS! They are NOT effective in preventing general public from catching #Coronavirus, but if healthcare providers can’t get them to care for sick patients, it puts them and our communities at risk!
This tweet has since been deleted. In April of 2020, federal government health officials changed this guidance. Later, in July of 2020, the Centers for Disease Control and Prevention (CDC) published a study showing how mask-wearing may have prevented two Covid-positive employees at a hair salon from spreading the infection to their 139 clients. Had the CDC recommended wearing masks from the beginning, perhaps we wouldn’t continue to see such vitriol and confusion among citizens. The misinformation around masks from our own government is terrifying.
In running a large social media account, I have felt pressure to post any and everything—and be the first to do so. It’s almost as if you are a newspaper racing to break the story first. I see this with other social justice accounts as well. The sense of urgency is palpable and spreads like wildfire. This then leads to posting mis- or disinformation. People are too quick to post false or skewed information, which then gets shared hundreds or thousands of times. There have been many times, especially early on with Guerrilla Feminism, that I shared stories without fully vetting them. My now-librarian self cringes at this. However, according to a recent research study, social media platforms are more to blame for the spread of misinformation than individual people. This particular study found that misinformation is a “function of the structure of the social media sites themselves.” Gizem Ceylan, who led the study, says:
The habits of social media users are a bigger driver of misinformation spread than individual attributes. We know from prior research that some people don’t process information critically, and others form opinions based on political biases, which also affects their ability to recognize false stories online… However, we show that the reward structure of social media platforms plays a bigger role when it comes to misinformation spread.
Part of how this happens, on an individual level, is pressure from followers to post any and everything about an issue. I knew that I couldn’t post about every injustice, but if I didn’t post about everything, I would get inundated with messages from people. My DMs would be full of questions like: “Why haven’t you posted about X yet? You clearly don’t care.” I didn’t set out to be a feminist news space, but that’s what I became—and I didn’t like it. I have mostly stopped posting anything other than funny Tweets and my own writing on “the grid” of Instagram these days. Part of this is for my own sanity, but part of it is also because I’m tired of putting so much labor into educational posts (researching, vetting, writing) only for them to not be seen. The algorithm is fully responsible for this, since most of the time Instagram doesn’t even show my posts to my followers (unless my posts go viral).
As a librarian, I want to educate people about mis/disinformation, but I also don’t have the time to do unpaid labor. Librarianship is already demeaned by many people, and I don’t want to add to this by creating toolkits, educational posts, or anything else if it’s not given the attention it deserves. If we don’t have people, like librarians or other educators, creating resources, how will we ever combat fake news? With the rise of AI and deepfakes, mis- and disinformation are only going to become more common and more insidious.
I don’t think there is a way to continue with social justice work on social media platforms, mostly because of the rampant mis- and disinformation, especially since Meta, Twitter, and TikTok are not doing anything to prevent it. I realize that this will continue to negatively impact disabled folks, specifically homebound folks who rely on social media spaces to engage with activism. On-the-ground activism is still, unfortunately, inaccessible for many people. I’ve read far too much about organizers not enforcing mask-wearing during a global pandemic and thus making their events and protests harmful to disabled and immunocompromised folks. We simply have to do a better job of making outdoor activism accessible as well as creating more authentic and purposeful spaces for online activism—away from social media platforms.
I don’t think we can utilize public social media spaces to educate and inform. Perhaps we’ll start seeing a resurgence of private message boards or private social media accounts that vet folks who join. Either way, things will need to shift and change. Continuing to rely on social media platforms for our collective organizing and education is going to lead to scary, untenable situations.
In the early days of Facebook, Instagram, and Twitter, the majority of the content a person saw was from friends or other people they followed. These days, users see content that an algorithm thinks they’ll like and less from those they follow. There are also more advertisements from brands. We see these ads in our Instagram stories and in our feeds. In 2016, Instagram changed its algorithm to no longer show posts chronologically. Since then, we see what they want us to see. Reels are pushed in our faces (even as Meta has now announced the end of its Reels Play bonus program, which was one way creators made money). Ads are pushed in our faces. We barely see the people we want to see.
The current algorithms across all social media platforms are biased. How can an algorithm be biased? In her book, Algorithms of Oppression: How Search Engines Reinforce Racism, Dr. Safiya Noble writes:
Part of the challenge of understanding algorithmic oppression is to understand that mathematical formulations to drive automated decisions are made by human beings… The people who make these decisions hold all types of values, many of which openly promote racism, sexism, and false notions of meritocracy, which is well documented in studies of Silicon Valley and other tech corridors… we are supposed to believe that these same employees are developing ‘neutral’ or ‘objective’ decision-making tools.
Though Dr. Noble’s research is specific to search engines, this framework can be applied to social media platforms and their algorithms.
When we hear talk of “the algorithm,” it’s often in a way that is distanced, sterile, and ghost-like. The way we discuss it only benefits social media platforms. They want us to think it’s “computer error” if anything should go awry. They want us to think it’s not on them. The problem, though, as Dr. Noble said, is that real humans are behind these algorithms; real humans with bigoted views get to decide what is seen by a majority of users.
In 2020, Vox reported on a study that found that “by teaching an artificial intelligence to crawl through the internet—and just reading what humans have already written—the system would produce prejudices against black people and women.” We saw this during the height of #BlackLivesMatter. Many posts and tweets were pushed down. Some weren’t even shown in people’s feeds. TikTok actually apologized to Black users for what they called a “glitch” during this time. Dr. Noble writes: “Many people say to me, ‘But tech companies don’t mean to be racist; that’s not their intent.’ Intent is not particularly important. Outcomes and results are important.”
We’ve also seen how the biased and unfair algorithm has hurt sex workers. In 2020, Instagram announced updated censorship rules. According to Dazed, these included bans on explicit sexual solicitation, sexually suggestive emojis, and sexually explicit language. The article quotes London-based sex worker Rebecca Crow: “(Instagram’s censorship) leaves already precarious sex workers without any platform for online content promotion, which is the safest way to work during the global pandemic.” All social media platforms have only ever been hateful to sex workers, so this continued censorship is not surprising. There is also the issue of “shadowbanning,” in which a user’s posts are consistently not shown in their followers’ feeds, resulting in little to no engagement. The head of Instagram has consistently said that shadowbanning “is not a thing.” Perhaps it’s the terminology that is “not a thing,” and what we should actually call it is “algorithmic bias.” That is, indeed, a thing.
During my time on these platforms and doing my own “content creation,” I know firsthand how frustrating it is to spend your time, energy, and love on a post just for it to have zero engagement. I often wonder if my current following count is even correct, because I will get maybe a hundred likes on some posts. Even as someone who struggles with dyscalculia, I know that the math ain’t math-ing. Do I have bots following me? What’s even happening? I don’t often post anything that I consider “inappropriate,” but part of the problem with algorithms is that they will flag what they deem “inappropriate.” This is how certain hashtags get hidden as well. For example, Instagram has been known to hide hashtags like “#sexualhealth” but not racial slurs. The humans who created the algorithm clearly need to do better.
There is a continuous lack of reciprocity I have felt in running my large Instagram account, especially as I see little to no engagement on each post. This is why I have mostly stopped creating informational image carousels for the ’gram. We can all joke about the Canva-fication of social justice posts, but it is labor. It takes time and energy to research, vet information, double-check it, write it in a way that is digestible (accessible language), and also make it visually appealing. I see the work that so many accounts put in just for their posts to barely get seen. It’s diminishing to those of us who have been on these platforms for over a decade. I don’t blame my following (although I do wonder sometimes why people don’t double-tap that damn heart button); I primarily blame Instagram.
The algorithm (and the people behind it) continues to harm our social justice movements. Information is either scarce (and not being shown) or it’s incorrect. Transphobia, whorephobia, racism, ableism, and sexism run rampant and appear louder than anything else. In the current dangerous political climate, social media algorithms are only going to make things worse for real people. These platforms have to change, and we have to stop relying on them as our primary online spaces. Posting about current events on social media is a fast and easy way to gain visibility for those events. However, because of the quickness with which we all post, fact-checking becomes non-existent. This has been an ever-increasing issue in social media spaces. As a librarian, I am committed to ending dis/misinformation (even if I also sometimes get swept up in it). As I have been consistently reading and reposting things to my Instagram stories about the genocide of Palestinians, I have seen so much misinformation, and it leads me to wonder how helpful social media really is to our collective movements.
The idea of “Digital Sandcastles” is that our online landscapes, specifically on social media, are precarious. Many of us have spent time, effort, energy, blood, sweat, and tears crafting our online worlds. This is obviously near and dear to me since I created Guerrilla Feminism on Facebook in 2011 and then the Instagram page in 2013. The issue with digital sandcastles is that they were never meant to last forever. They were never meant to be used as “archives.” [Side note: people really love using that term when they don’t know what it means!] They have no permanent infrastructure. Besides the impermanence and precarity, these spaces repeatedly hide anything that the company doesn’t want to show its users. Currently, we are seeing this with how Meta is purposely suppressing any sort of pro-Palestine post.
Posting through a genocide feels incredibly bleak and dystopian, and yet, what else can we do in the current moment? Organizing and attending in-person protests is great, but what about those who are homebound or have other disabilities that prevent them from attending these protests? Clearly, a digital component is needed and necessary. I have personally learned so much from Palestinians who have been able to record video and take pictures of what is happening on the ground in Gaza. All I feel I can do (aside from giving money and attending local protests) is to repost all of the injustices I see from the people living through them. Digital sandcastles, like regular sandcastles, will eventually get washed away and destroyed.
I am tired of fighting with social media platforms that are tits-deep in corporate greed. I am tired of asking for crumbs. I am tired of seeing celebrity nonsense get the most visibility. Everything is an ad, and I don’t want to see our movements become further fractured by commodification. What is the point in having a following if no one even sees my posts? What is the point in feeling beholden to a platform that has not done anything for its users? What is the point of creating an informational carousel on Instagram when it just seems to go into a void? I have more questions than answers related to the future of posting through horrific times. Questions I’m thinking about and asking myself:
How can we ensure that voices from marginalized communities, or those posting about issues that affect marginalized communities, will be listened to and seen?
Where should we be archiving digital resistance efforts?
How should we be communicating these things with others committed to the same causes as us?
Are there other digital spaces or structures that we could collectively inhabit that have less precarity?
Are there ways to circumvent suppressive digital tactics? If so, how can we share information about this?
Sometimes the internet is good; sometimes it’s terrible. Either way, we can’t depend on it for our mobilizing, organizing, and activism. We have to meet elsewhere, online and off. If we truly want to get free, if we truly want liberation for all people, we have to be willing to build community offline. We have to talk to each other in person. We have to divest from corporations. We can’t rely on Meta, Twitter, and TikTok to amplify our needs. We have to create options for how people can get involved. Relegating our activism to one sphere minimizes our collective power.
I am still toying with the idea of deleting my Instagram. I am still trying to figure out what to do or where to go next. I don’t want to be an influencer or a social media creator. Still, I long for community and belonging in any way I can get it, and there is a lot of it in online spaces.
This chaotic internet playground is vastly different from the one I first played on. The rules and people are different. I don’t want us to be lost to each other when social media apps eventually disappear. My desire for belonging has shifted over time, and I’m sure it will shift again. The belonging I’ve spent my entire life searching for has always brought me back to myself. Maybe the internet is bringing us back to ourselves. Perhaps it’s our innate humanness that continues hunting and prowling for each other.
I was interviewed about witchy things for Hedge Witch Botanicals!
Meta surrenders to the right on speech - Casey Newton
Why mending in community matters -
Wading Into 2025: How to Begin - Kelly Hayes
This is What it's Like to Be A Medical Student at Al-Aqsa Martyrs Hospital in Gaza - Zaina Alqudwa
Revisioning Standards for Library Services for the Incarcerated or Detained - ALA Standards
The trapped animal bellow inside: Ruminations on PMDD -
This song: