
Social media has broken politics. Is science next? | Opinion

Our health is threatened when misinformation passes as science or medicine. For our own safety, we need to be able to tell the difference between science and science fiction.

Judy Marcucci tries her first e-cigarette at World's Finest Vape Shop in Bridgeport, Montgomery County. More vape shops are popping up, in both city and suburbs, and governments are taking notice. Vaping is just one practice vulnerable to widespread misinformation spread through social media and other sources, the authors say. RACHEL WISNIEWSKI / Staff Photographer

Jerry Seinfeld explained his love of bookstores for the way they’re organized:

“I like the way it breaks down into fiction and nonfiction. In other words, these people are lying, and these people are telling the truth. That's the way the world should be.”

We seem so far from that now. In the past, a political comment someone didn’t like could be ignored, discussed, or argued about. Now it can be dismissed as “fake news.” It’s like walking into a bookstore and making your own decision about which books are fiction and which are nonfiction.

There’s reason to worry that science and medicine may suffer the same fate, and we need to prevent that because, quite literally, our lives depend on it.

You’ve likely heard some of the more infamous examples of medical and scientific untruths spreading: claims that vaccines cause autism, miracle diets, and natural “cures” to incurable diseases. In fact, incorrect news about the Zika virus propagated far more effectively than correct news. To combat these untruths, we need to understand the three factors that enable them.

First is social media. Social media stands next to the printing press and moveable type among inventions that dramatically lowered the cost of communicating widely — now to the point that anyone can do it. Here, you find little to no fact-checking, and no editorial standards to govern broadcast information. For example, Twitter accounts, once believed to be managed by trusted connections but actually driven by bots, have falsely promoted the health benefits of e-cigarettes, even though they are just as addictive as cigarettes. Paradoxically, perceived credibility may have increased: communications through preexisting social networks are typically more trusted than information from impersonal sources.

Second is selective deafness. When Walter Cronkite was the “Most Trusted Man in America,” many received their news from that single source. Now, Americans can select news feeds from thinly parsed media channels. It’s only human to want to hear what you want to hear. But what is a good strategy for music is not a good strategy for news. The problem is less that those into homeopathy can subscribe to homeopathy-favorable channels — it’s that they can do so to the exclusion of everything else. Selective deafness creates the “echo chamber” people decry.

Third is that lies are chameleons. Truth comes in only one form, but lies can be shaped to match any taste. The suffering want hope, and those unencumbered by the truth have an easier time giving it to them.

So, what do we do?

Scientists must get active — not just by taking their messages to the public, but by doing so in ways that distinguish their contributions from the fake news that surrounds them.

Much of the work also falls on the audience. Since 2015, France has expanded its funding for courses that teach students to identify dubious online stories. It’s a practical move that schools in the United States and elsewhere should strongly consider. Like the students at J.K. Rowling’s Hogwarts School who take “Defense Against the Dark Arts,” they’re learning to protect themselves from dangers around them.

The rest of us need that education, too. We need to understand the importance of the source of information, distinguishing characteristics of legitimate authority, how conflicts of interest can alter what is studied and what is reported, and how our own biases can alter what we hear — all the skills of good and trusted journalists, not Twitter bots.

The world is threatened enough when political lies are accepted as truths. Our health is threatened when misinformation passes as science or medicine. For our own safety, we need to be able to tell the difference between science and science fiction.

David A. Asch is executive director of the Center for Health Care Innovation and John Morgan Professor of Medicine, and Raina Merchant is director of the Center for Digital Health and associate professor of Emergency Medicine, all at Penn Medicine.