Long has the banhammer existed as an online threat. In early instances it served as a sort of situational justice enacted by internet municipalities the likes of Massively-Multiplayer Online games (MMOs), imageboards, forums, and private websites. Bans were rare, but also arbitrary, being wielded mostly by moderators and those hosting their own sites. As the internet became less of a collection of villages and instead a concentration of feudal plots of server space, the serfish plight of the average internaut was only further exacerbated. Now, as internet users become less and less self-reliant, depending mostly on platforms to host user-generated content, the nature of the hammer has changed.
No longer is the threat of content moderation ruled by a petty elite of power-hungry moderators. Rather, online automation has rendered content moderation into less of an act and more of a process, transforming it from a conscious and active occurrence to a passive procedure that can be unconsciously performed by machines. Algorithms permit companies to (theoretically) sift through, identify, and moderate vast swathes of content at near-instantaneous speeds. However, it seems we learned nothing from Terminator 2: Judgment Day, as automated moderation technology (or “automod”) remains far from perfect despite its pretensions. There are myriad examples of such failures floating around in the collective consciousness of the World Wide Web. After Tumblr banned the posting of all “adult content” in December of 2018, users immediately found massive holes in its moderation algorithm. In many cases the pornography that the algorithm was designed to target remained unscathed, while Tumblr took down images of everything from sand dunes to ceiling lamps for bearing the most superficial of resemblances to the naked female body (prompting users to coin the term “ceiling nipple”). It’s important to acknowledge that decisions of this scale are not made in a vacuum. Tumblr’s porn ban was motivated largely by the fact that sex workers were using the site as a safer way to find work than Craigslist and other classifieds, and the measures taken by the website undoubtedly made online sex work far more precarious and unsafe.
Algorithms are used less these days for the kind of mass purge that followed Tumblr’s “cleaning up” of its image; instead, they now serve as a sort of “soft measure” to restrict access across large platforms. So common is this practice now that it has garnered its own colloquial name: the “shadowban”. Shadowbanning is a term that most people osmose, provided they spend enough time online, but it is much harder to explain to the average grass-toucher. Generally, shadowbanned accounts and content are understood to be “hidden” by the platform which hosts them, without the platform having actively deleted the material in question. D-list internet t-girl that I am, my following is not large enough to feel these impacts, so getting myself shadowbanned for content was out of the question. Instead, I decided to enlist some expert advice.
@Autogyniphiles_Anonymous is a philosophy meme page “run by a bunch of old tired trans wlw,” (‘women who love women’). Back in July I talked to them about a targeted harassment campaign, being threatened with deletion during Pride month, and how platform censorship hurts queer creators. Although their account managed to avoid deletion, in the months since, they’ve continued to receive community guidelines strikes, and the reach on their main account of more than 37.3k has periodically dipped below that of their five-thousand-follower backup. Despite a regular content stream, their engagement has stagnated, making it difficult for them to circulate the resources and fundraisers which they usually promote. With arguably few people more qualified to speak on the mechanics of shadowbanning, I figuratively sat down once more in the metaverse with Auto_Anon’s Arendt Admin to discuss how the experience of existing online has changed in the time since we last spoke.
“There’s been some big changes on the internet, definitely since we started the account, but also I’d say since we last talked,” she notes. “One of the big changes there has been is not TikTok itself, but the emulation of TikTok. It’s no secret that Facebook has desperately run to whatever seems most popular, and this is what made the actual site Facebook essentially unusable, because it went from a fairly understandable platform to a bit of a labyrinth.” Since August 2020, Instagram and its parent company Facebook (now Meta) have been gradually phasing in “Instagram Reels,” short, TikTok-like videos, while restructuring the Instagram user-interface around them. Accounts that have long seen high engagement find that suddenly Reels are being prioritized over conventional posts, with their engagement floundering as a result. “I don’t want to get all Marshall McLuhan, ‘the medium is the message’ here,” Arendt Admin laughs. “But if somehow they ‘TikTok-ify’, you know, mimic TikTok for as long as it’s popular, and then mimic whatever’s next, you get an internet that is trend chasing not only in its content but also in the format of its own media.”
“It’s homogenizing,” I suggest.
“Exactly,” she replies.
This homogenization is reinforced by the algorithms such platforms employ. Per Arendt Admin, this algorithm manifests in two ways: first, as a proverbial “shotgun to the back of the head” of online creatives, pressuring them to hop on the bandwagon of each successive media trend (e.g., Instagram Reels), and second, as a means of actively suppressing certain subjects. “A common claim that I see increasingly is this: ‘Well the algorithm is not actually going against Marxism or discussions about sex, what’s happening is people don’t like this content and aren’t engaging with it as much, therefore it’s not being pushed as much’.” This claim seems superficially plausible; however, it strikes her as unlikely that thousands of people simply aren’t seeing Auto_Anon’s posts. “The insinuation here is that suddenly people stopped caring about being trans, which seems unlikely. It points to a more nefarious algorithm that isn’t simply doing this weird supply and demand thing, but is actively suppressing subject matter.” That active suppression is a quintessential example of a shadowban.
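The distinction Arendt Admin draws between the two mechanisms can be sketched in a few lines of code. To be clear, this is a hypothetical illustration: no platform publishes its ranking code, and the names, topics, and weights below are invented for the example. The point is only that “low engagement means less reach” and “this topic gets quietly down-weighted” are separable steps in a ranking pipeline, and the second is what people mean by a shadowban.

```python
# Hypothetical feed-ranking sketch illustrating the difference between
# (1) engagement-driven ranking ("supply and demand") and
# (2) active topic suppression layered on top of it.
# All names, topics, and weights here are invented for illustration.

from dataclasses import dataclass


@dataclass
class Post:
    topic: str
    likes: int
    views: int


# Topics a platform operator has chosen to quietly down-rank (hypothetical).
SUPPRESSED_TOPICS = {"sex work", "trans issues"}


def rank_score(post: Post, suppress: bool = True) -> float:
    """Score a post for feed placement."""
    # Mechanism (1): posts rise or fall with their engagement rate.
    engagement = post.likes / max(post.views, 1)
    score = engagement
    # Mechanism (2): flagged topics are multiplied down regardless of
    # how well they actually perform -- the "shadowban".
    if suppress and post.topic in SUPPRESSED_TOPICS:
        score *= 0.1
    return score


popular = Post(topic="memes", likes=900, views=10_000)
flagged = Post(topic="trans issues", likes=900, views=10_000)

# Identical engagement, very different reach once suppression is applied.
print(rank_score(popular))
print(rank_score(flagged))
```

The two posts have identical engagement rates, so under a pure “supply and demand” model they would be shown equally; only the topic multiplier separates them. That separation is invisible to the user, which is precisely why claims like “people just aren’t engaging with it” are so hard to falsify from the outside.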
The ubiquity of the algorithm has subjected online creators to an environment which is particularly hostile to certain content, especially when it concerns the position of marginalized groups. On April 15, Auto_Anon posted a passage from the Red Seder in celebration of Passover. Despite their large following, the post was shown to less than 20% of their followers. Of those, a mere 165 liked it in the first four hours. “If it is just because of Instagram — and I think it is partially just Instagram — that means Instagram is antisemitic, big whoop,” Arendt Admin sighs. “Mark Zuckerberg has had well-documented allowances for Neo-Nazi propaganda on his websites since day one. This is not a man who seems particularly concerned with his own Jewish identity. This is a man who seems like he wants as much discourse going on in his platforms as possible.” However, she doesn’t deny that the unpopularity of Jewish content may equally be an audience response. “Out of the history of Marxist ideology studies and ideology critique lies this conception that perspectives from within ideology may differ, and that differing perspectives within ideology might give you insights into the underlying reality.” This notion of “Standpoint Epistemology,” echoed in Critical Race Theory’s conception of intersectionality, has, in the words of Arendt Admin, “been somewhat accepted in academic circles, but on the internet, has been watered down and made a vulgarized version of itself, which has become the milieu we live in.” As a Jewish woman herself, Arendt Admin can point to a long history of Auto_Anon’s Jewish-focused posts underperforming, and angry backlash (even from avowed Leftists) to posting about anything even tertiarily related to antisemitism. “They demand, ‘listen to sex workers,’ or ‘listen to drug addicts,’ or ‘listen to women of colour’, ‘listen to transsexuals,’ and so on. This is the implication of Standpoint Epistemology, and it’s readily accepted...
but this hasn’t been the case with talking about anything Jewish.”
We expound on this point a bit, Arendt Admin pulling various books off the floor-to-ceiling shelf behind her and holding them up to show me through the camera. “This entire shelf is Arendt,” she says, pointing behind her. Though we talk about it at length, neither of us can conclude whether rampant cultural antisemitism or merely the design of Instagram’s algorithm is to blame for Auto_Anon’s experiences. There seems to be an ideological ouroboros at play: antisemitism has informed the design of the algorithm, and the algorithm reinforces antisemitism. It is impossible to discern where one symptom ends and the other begins, because they mutually reinforce one another. Per Marxist critique, ideology exists before thought, unconsciously shaping our beliefs. The scary thing about algorithms is that they possess the power to artificially select which ideas to reinforce, naturalizing them until they are so structurally embedded that we don’t even notice them reshaping the way we think.
The economic structure on which the internet was built equally contributes to its hostility to online creators. Many YouTubers, podcasters, independent writers, and content creators of all varieties find precarious employment with online publications, or else depend on donations, sponsorships, and subscription services like Patreon to supplement their income. With a recent promotion for the Ottawa Art Gallery, and a series of posts sponsored by inclusive contraception company Jems, Autogyniphiles_Anonymous have forayed into this nebulous world of sponsored content over the past year. “It’s something we entered into with some trepidation,” admits Arendt Admin. “For instance, we did multiple days of polling like, ‘Hey, would you folks mind if we sold out just a little bit? We’d really like some money,’ and the response from our followers was overwhelmingly ‘Yeah, go for it, get that bread,’ it sort of shocked me I guess.” As a relatively high-profile account on the trans side of Instagram, Auto_Anon frequently shares mutual aid requests and fundraisers. Even so, as the admins are trans women themselves, the money they generate through sponsorships, donations, and their merch store helps meaningfully supplement the cost of their and their close friends’ transitions. “If we were older and had been around longer, people would’ve called us sellouts... Gen Z and younger millennials just don’t really see it that way. There’s this acknowledgement that we are all mostly struggling for money and need to survive.”
Even so, the admins have maintained hard and fast boundaries about editorial freedom, being wary of the influence advertisers might try to exert. “We have been very careful about the sponsors that we accept. Mostly we’re looking at stuff that isn’t going to ask us to clean up our image,” she laughs. “My mother’s cousin is just like, a ‘normal’ influencer and she’s like ‘Look, a sponsorship from the Hudson’s Bay Company, why don’t you do that?’ And I always say, ‘Hudson’s Bay would never want to be associated with the scum I peddle,’ that’s not going to happen.” As she admits, “we’re pretty much limited to sex companies and weird artists and galleries. I think we’d lose credibility in people’s eyes if we started doing ads saying, ‘Hey Marx is great, but did you know Winners has a sale for 50% off?’.”
The trend Arendt Admin highlights in the ideological function and structural design of the internet is one deeply tied to the concept of media convergence, a theory in Media & Communications Studies which suggests that mass media join into one consolidated medium over time as communications technology progresses. The internet, to many, symbolises the reification of this effect: a mass medium which encompasses all prior visual and auditory media in one nebulous, connected space. The concentration of this much power in the hands of a select group of platform corporations should terrify anyone concerned with online independence, especially in journalism. By the design of Web 2.0, small publishers such as Arthur (in which you’re reading this article right now!) are forced to adhere to standards administered and enforced by corporate giants such as Twitter, Instagram, and its parent company, Meta, with meaningful alternatives few and far between. Despite being an independent outlet, Arthur is reliant on Amazon Web Services to host our website, advertising to supplement our Levy funds and donations, Twitter, Instagram, and TikTok for promoting articles, Slack and Zoom for productivity, Twitch for podcasting and our annual livestreamed Telethon, and countless other sites and productivity tools in the research process for each and every article. That we are so reliant on corporate entities despite our stated independence is not a coincidence; it is an intentionally hostile design function of Web 2.0. The fact that such vast amounts of computing access and capital are concentrated in the hands of Amazon, Twitter, Meta, and ByteDance (TikTok’s parent company) is a feature of contemporary late-stage capitalism. Nor did it emerge out of nowhere; as a matter of fact, many theorists saw it coming.
In “A Propaganda Model,” Edward S. Herman and Noam Chomsky argue that hegemonic media control has the effect of producing a phenomenon they call “the manufacture of consent,” pointing out that the structures and individuals which control the media also possess incredible power to influence media consumers. Chomsky and Herman decry Western mass-media institutions as structures for disseminating propaganda. Essentially, the mass media carry out an ideological function, normalizing the presence of the state and of the structures of capitalism, making it seem “normal” to be governed and impressing upon each citizen that the free market is good and natural until that belief is thoroughly internalized. Per Herman and Chomsky, the contemporary media landscape is characterized by five “filters,” as follows: the size, concentrated ownership, and profit orientation of the dominant mass-media firms; advertising as the media’s primary source of income; the media’s reliance on information provided by government, business, and approved “experts”; “flak” as a means of disciplining the media; and “anticommunism” as a national religion and control mechanism.
In many ways this lines up chillingly with Arendt Admin’s observations about trends in the internet’s structure. Wealth, both through platform ownership and corporate advertising, contributes immensely to which narratives are given the light of day. As another line from Marshall McLuhan, whom Arendt Admin earlier invoked, goes: “all media work us over completely.” To dive deeper into the implications of Chomsky and Herman’s work in the internet age, and to consider the effects of this emergent medium of journalism, I decided to pester outgoing Arthur editors (and my lovely bosses) Nick Taylor and Brazil Gaffney-Knox. Despite the ongoing pandemic, Arthur has flourished under the pair’s watchful eyes.
“Arthur has gone from being a levy group that was one of the most opted out of during the Student Choice Initiative, to having successfully campaigned to increase our levy fee in just a few years. I think part of that is that we're doing so much more to meet people where they're at,” says Taylor. “We're trying to appeal to zoomers. Where are zoomers? They're on TikTok and they're on Instagram.”
“So we will TikTok,” Gaffney-Knox interjects.
“And we must Instagram.”
However, the transition to a digital publication has brought with it challenges in upholding Arthur’s ethical standards, something which has been an important consideration for the editors during their tenure.
“Brazil and I are both silly little philosophy majors. I did a specialization in ethics. We take very seriously our obligations to our community, to each other, to journalistic integrity, to the mandate that Arthur has as an anti-oppressive outlet,” says Taylor. When it came to revamping TrentArthur.ca, “[they] had many conversations with the previous editors about like, ‘What would it mean for Arthur to have our domain tied up and like then be frankly reliant on Amazon Web Services?’ and ‘What does it mean to market an anti-oppressive newspaper on Instagram?’ There was a lot to grapple with in terms of how you even share an article that is serious and very much justice-oriented through a medium that is owned by fucking corporate oligarchs.”
“We can't be perfectionists about these things,” adds Gaffney-Knox. “The ethics in the real world are imperfect.”
Beyond the aforementioned platforms which they use mostly for promotion, Arthur also relies on advertising and donations collected through its site to supplement the funding provided by Arthur’s levy fee. “In terms of donations, I don't think it affects our editorial work,” mulls Gaffney-Knox. “We have to be really, really careful about that. Cause yeah, we do news, and you can't have donations influencing the news you do. I think all the donation grind does is it just takes energy away from our editorial work in some ways, because we do have to spend time thinking of fundraising, but I feel like our little Telethon/Award Show/yearly fundraiser is helpful in consolidating all of that energy into one ridiculous horrible month.”
“When we were launching the site a lot of what we talked about was the role that advertising would play on it,” adds Taylor. “It's funny, Issue Two, the entire back page is an ad for the very housing referendum that we lobbied against successfully.” In an ideal world, this would never have happened, but with Trent cutting funding for the Trent Work-Study Program (TWSP) and Trent International-subsidized Levy jobs this year, Arthur has been more dependent on our secondary revenue sources than ever. However, the financial precarity associated with independent journalism has not deterred the editors. “It didn’t stop us from leading this vote no campaign,” Taylor grins.
The campaign to which Taylor refers is an Op-ed/tweet-thread they wrote in early March, compelling students to vote “No” on the housing referendum. The thread has been one of Arthur’s most circulated stories this year. Despite this, Brazil and Nick insist they don’t let good analytics get to their heads. “Our philosophy has been: if we do good journalism, it will get shared,” says Gaffney-Knox. “That's actually what people care about most.”
In summary, control of the media (and by extension the content it produces) has two major ramifications: it determines which narratives reach audiences at all, and it sets the standards by which everything else is judged.
Shadowbans and content moderation enable this by dictating a new moral standard on the platforms they enforce. Deviance is met with active suppression, if not deletion. What we see in the late era of Web 2.0 is the propaganda model working overtime. It’s difficult to be a truly “free” thinker under these conditions. Even creating a practical history of the internet is itself a difficult task. Data is constantly subjected to deletion or extreme measures of suppression without rhyme or reason. Despite what high school guidance counselors may have told you, it seems hard to believe anything on the internet is truly there forever. In effect, content on the internet is being consistently curated to fit the standards of moral hegemony. Those of us old enough to remember, or even just envision, a time before Web 2.0 inherently know this. Ours is an internet populated by dead URLs and images long since corrupted by the annals of time. All that remains are the servers under the watchful eyes of our platform overlords, all our interactions subject to their “Terms of Service” and “Community Guidelines”.
“I think you can very much talk about gentrification on the Internet,” says Arendt Admin, as our discussion winds to a close. Our interview has continued intermittently as we get side-tracked talking about Cultural Studies, campus newspapers, and a lengthy discussion of our preferred fountain pens. As I begin to anxiously look at the time, contemplating how long this will take to transcribe, she offers one final piece of (in her own words) ‘rhetoric’. “The internet has spaces that are often cruelly lorded over by their landlords, and populations get pulled in and sucked out, and communities get demolished. If the medium is the message, when the medium gets changed, so does the community that’s living there. I think that these changes of spaces to be spaces they weren’t quite before — in extreme cases, these mass bulldozings — I don’t think they’re very different from what you see in the physical world. It is communities and people being pushed away.”
All told, the corporatized, platform internet is bad for truth. It’s bad for doing good journalism with the integrity necessary for the job. Most of all, it’s bad for its users. You wouldn’t be wrong to think it’s harder to parse out the truth these days, but this is not a quirk of the Trump Presidency, or the supposed anonymity the internet affords us. Rather, the powers that be are embroiled in an ongoing struggle to obscure the nature of the truth, and not even independent outlets such as ourselves can escape the structure. There is no antidote to propaganda. All you can do is be diligent, think critically, and cross-reference as many independent and critical news sources as you can find. Somewhere in the middle of all that you might find the truth, not in definitive form, but as a negative-space sketch of sorts. The truth is no longer concrete; it is danced around, gestured at, but never shown directly. This is the challenge we face, because so long as the system is structured the way it is, the truth will forever continue to elude us.