
Facebook says it will alert people who have interacted with coronavirus misinformation

It's part of a series of new, aggressive steps to combat what health authorities have described as a global "infodemic."

Facebook is going to let users know if they liked, reacted or commented on posts with harmful misinformation about the coronavirus that moderators later removed. Amr Alfiky / AP

Facebook said Thursday it will begin alerting users if they have interacted with harmful misinformation about the coronavirus, part of a series of new, aggressive steps to combat what health authorities have described as a global "infodemic."

The messages — which will appear in users' News Feeds — will direct people to official, credible information from the World Health Organization in an attempt to ensure dangerous myths about the disease, its origins and how it is treated don't continue to proliferate, either on the social-networking site or in the real world.

"We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook," Guy Rosen, the company's vice president for integrity, said in a blog post. He said the warnings would appear for those who have "liked, reacted or commented" on such content.

Facebook also released new data Thursday illustrating the enormity of its task in removing dangerous posts, photos and videos about the pandemic from its services, which reach billions of people globally. In March alone, the company said it displayed fact-checking labels on 40 million posts, based on 4,000 articles that its third-party reviewers had rated as false.

Taken together, the announcements illustrate Facebook's heightened vigilance in the wake of a public-health crisis that has killed more than 29,000 people in the United States and affected millions more around the world. "People really need to feel connected in times like this," wrote the company's chief executive, Mark Zuckerberg, in a March letter to top deputies stressing the critical role the social network plays at a moment when people feel most isolated.

For years, regulators around the world have called on the tech giant to take faster, more decisive action against misinformation — including that spread by politicians — even as the company maintained it should not serve as an "arbiter of truth." But Facebook has sought to chart a new course with the coronavirus, even banning a wide array of content outright, including posts, photos and videos that peddle fake cures.

So far, Facebook said it has directed 2 billion people on its main social networking site and its photo-sharing app, Instagram, to a portal it created for credible COVID-19 information. More than 350 million people have clicked through to learn more, the company said.

Public-health officials tracking coronavirus have sought to debunk a series of pernicious myths in recent months, including those that wrongly suggest that 5G mobile networks, the next generation of wireless service powering smartphones and other devices, are responsible for the outbreak.