Following last week’s deadly attack on the Capitol, Twitter and many other social media platforms have booted President Trump, who incited the violence through his tweets and other digital messages to his ardent supporters.
At first I relished the cacophony of hysterical memes and tweets that emerged from the news. It felt like just deserts.
But while anyone who has ever read the First Amendment can tell you that Trump’s freedom of speech wasn’t infringed, since Twitter and its ilk are privately owned companies, this kind of ban can have far-reaching and troubling consequences, and not just for White House occupants.
Trump’s ban is a reminder that social media platforms need fair and equitable algorithmic access, stronger moderation teams, and a strong moral compass. Because they are privately owned, we are free to use them only within their terms of service, which seem malleable. That’s problematic because, as we’ve seen time and time again, their judgment calls are inconsistent at best and discriminatory at worst. Fat, queer, and BIPOC users have repeatedly reported being banned from social media for far less, and over far fewer incidents, than Trump, and those bans have had real-life consequences.
Influencers have claimed that book deals, revenue streams, and other opportunities social media can dictate have come to a grinding halt after a follower “reported” a post calling out white supremacy or showing a visible fat roll in a shirt. Take, for example, what happened to plus-size Black model Nyome Nicholas-Williams. Photographer Alexandra Cameron captured artistic photos in which Nicholas-Williams was fully covered but appeared to be topless. Hours after she posted one of the photos, which didn’t violate Instagram’s terms of service, the model received a notice that her photo had been deleted and her account could be shut down.
“Millions of pictures of very naked, skinny white women can be found on Instagram every day,” Nicholas-Williams told The Guardian. “But a fat Black woman celebrating her body is banned? It was shocking to me. I feel like I’m being silenced.”
Another angle to consider here is algorithmic equity: some marginalized groups claim they are being “shadow banned,” meaning the hashtags they use to gather followers around specific content are actively suppressed to limit its reach. TikTok reportedly admitted to doing this, though the company framed it as a benevolent effort to protect marginalized groups from cyberbullying.
To me, that sounds an awful lot like victim blaming, while bullies like Trump and his followers are given free rein. Moderators can pounce on the visibility of a plus-size Black body but cannot piece together the many warning signs of an outright insurrection incited by the President of the United States. These companies have the money to invest in moderators who can catch those red flags.
People rely on these platforms, in essence, as publishers of the content they consume. So when companies suppress that content through lazy moderation or algorithmic bias, they can shape the “reality” we know with impunity. Trump’s ban, though deserved, feels like the beginning of an era in which these companies, which essentially serve as public squares, can silence anyone they choose on a whim. They also need more transparency about their rules and regulations, and people who will enforce them equitably.
By holding these companies to standards that are fair and accessible to all users, with moderation guided by a moral compass, we can curtail the spread of misinformation, keep the playing field equitable for marginalized folks, and keep even influential men like Trump in check without posing any real “threat” to their First Amendment rights.