Is Facebook a trustworthy gatekeeper? | Stu Bykofsky

I don’t trust the tech giant to make the rules

Minister Louis Farrakhan, left, the leader of the Nation of Islam, and radio show host Alex Jones have been banned from Facebook as "dangerous" people. (Vahid Salemi / AP)

Alex Jones is a despicable person who invents conspiracy theories. Louis Farrakhan is a despicable person and a flagrant anti-Semite. Laura Loomer is a despicable person who traffics in conspiracy theories and Islamophobia.

They are among a group of “dangerous individuals” kicked off Facebook last week. As a publicly traded company, Facebook has the right to make its own rules.

But is this rule right?

Previously, Facebook pretty much “let the marketplace manage,” says Bryan Monroe, an associate professor of journalism at Temple University who specializes in media law. Recent hate attacks in Sri Lanka and New Zealand “set the stage for their increased scrutiny of incendiary or hate speech,” he says.

If this column isn’t published exactly as I wrote it, that’s not censorship; it’s editing. My work has to conform to the standards established by editors. Is that what Facebook is doing?

No, it is banning people, and the two businesses are not the same.

Newspapers make publishing decisions using tests such as fairness, accuracy, and objectivity (except for opinion pieces designed to be subjective).

Facebook is very different. Posting requires no particular quality or skill. That explains the glut of cat videos and pictures of your restaurant meal.

Facebook is less like a newspaper than a brick wall begging for graffiti. It is a global monopoly.

I couldn’t get anyone from Facebook on the phone, but in an online video, Facebook founder and CEO Mark Zuckerberg said his company will ban “groups where people are repeatedly sharing misinformation or harmful content.” But who decides what’s “harmful” — Facebook?

The same company facing up to $5 billion in fines for privacy violations?

Can you see why I am hinky about trusting them 100 percent?

By banning some voices, “Facebook said it will be the international arbiter of what is and what is not hate speech,” says Monroe, who’s not sure Facebook is the right entity to do that.

As a journalist, my default is always in favor of more speech, not less. Yes, even when that speech is unpleasant and even when I disagree with it. The red lines are incitement to violence and provable lies.

The best antiseptic for bad speech is good speech, says Monroe, but “Facebook needed a check on incendiary speech.” I understand the dilemma. Facebook wants to ban “hate speech,” but that’s hard to define. USA Today reported that some black people are complaining they were kicked off the platform when they wrote about racism. Someone saw that as “hate speech.”

When I read or see some of what Facebook has carried, I want to vomit.

Radio show host Jones posted that the massacre of innocents at Sandy Hook was a hoax. That was a deliberate lie, and some of the parents are suing him for defamation. That’s a better solution than banning him.

If Facebook should not have the final word about whose views get posted, who should be the decision-maker?

That would be you.

If you don’t like what you are reading, turn the page or protest online. If you read a lie, post the truth.

That doesn’t mean Facebook should be passive about outright lies. It has options. In the past, it took down defamatory, false, or obscene posts rather than closing the account.

Here’s an idea: Facebook can use a caution, like the surgeon general’s warning on tobacco products, saying, “This page is filled with misinformation that can harm your mental health.”

I’d rather keep the rats in a public cage than drive them underground.