Facebook is going to try to curb the spread of false news about COVID-19

Users will soon be sent notifications informing them if they have "liked" false claims about COVID-19

Fake coronavirus news is rampant across social media, and now Facebook is taking more aggressive action to try to limit it. The tech giant will soon let users know if they interact with misinformation related to the pandemic on the site. 

Users will soon be sent notifications informing them if they have “liked,” reacted to or commented on dangerous or false claims about COVID-19 after the post has been removed by moderators. The alert will also direct users to a World Health Organization (WHO) site debunking myths about the virus. 

“We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook,” the company’s VP of Integrity, Guy Rosen, said in a blog post.

A number of tech companies have attempted to introduce policies and tools to limit the spread of coronavirus hoaxes, but misinformation is still pervasive online, especially as people in isolation spend more time than ever searching for news. So far, Facebook has hired fact-checking partners and introduced pop-up links to health resources such as WHO and the U.S. Centers for Disease Control and Prevention. 

“As this pandemic evolves, we’ll continue focusing on the most effective ways to keep misinformation and dangerous hoaxes about COVID-19 off our apps and ensure people have credible information from health experts to stay safe and informed,” Rosen’s blog post said. 

Facebook said its latest effort will roll out in the next few weeks. Still, social media platforms face a seemingly endless challenge from users promoting fake treatments and other potentially dangerous false information.

“As the world fights the deadly COVID-19 pandemic — the most challenging crisis we have faced since the Second World War — we are also seeing another epidemic — a dangerous epidemic of misinformation,” U.N. Secretary-General António Guterres said earlier this week. “Harmful health advice and snake-oil solutions are proliferating. Falsehoods are filling the airwaves. Wild conspiracy theories are infecting the internet.”

Guterres called on social media companies to do more to weed out misinformation on their platforms. “Together, let’s reject the lies and nonsense out there,” he said.

Facebook said Thursday that it has so far removed hundreds of thousands of posts that could lead to physical harm, including misinformation about the effectiveness of social distancing and bogus “cures” like drinking bleach. It also disclosed that it has put more than 40 million warning labels on videos, posts and articles on its platforms, sent over 350 million people to health information sites, and said that when people saw those warning labels, over 95% of the time they did not click through to the flagged content.

It also said it is adding a section to its COVID-19 Information Center called “Get the Facts,” to include fact-checked articles that debunk coronavirus misinformation. 

“Through this crisis, one of my top priorities is making sure that you see accurate and authoritative information across all of our apps,” said CEO Mark Zuckerberg. 

However, this week, nonprofit advocacy group Avaaz released a report that found over 40% of the coronavirus-related misinformation it sampled remained on the site, the majority of which had been debunked by Facebook’s own fact-checkers. The group said in total, those posts had been shared 1.7 million times on the platform in six languages.

Facebook said Avaaz’s research is not representative of its efforts. 

“We share Avaaz’s goal of reducing misinformation about COVID-19 and appreciate their partnership in developing the messages we’ll now be showing people who engaged with harmful misinformation about the virus we’ve since removed. However, their sample is not representative of the community on Facebook and their findings don’t reflect the work we’ve done,” a spokesperson told CBS News on Thursday.