Coronavirus: Facebook now flags fake news

Facebook now alerts users who have clicked on, commented on, or shared false news about Covid-19 in their news feed. The measure is reportedly 95% effective.

Facebook will now directly warn users who have viewed false information related to the new coronavirus, its CEO, Mark Zuckerberg, announced Thursday. During the month of March, Facebook deleted “hundreds of thousands” of pieces of Covid-19-related content that “could represent an imminent danger to health”, such as claims that bleach cures the virus.

Users who click on, comment on, or share this type of information will now receive a message in their news feed urging them to consult trusted sources such as the World Health Organization’s website, Mark Zuckerberg said in a blog post on Facebook. The Facebook CEO highlighted the social network’s efforts to “limit the spread of disinformation about Covid-19”.

While rumors abound, the social network says it has already redirected “nearly 2 billion users”, almost all of its members, to information from public health authorities through its “Covid-19 Information Center”, available in every news feed. The result: “more than 350 million users have clicked on our educational messages” to find out more, on Facebook and Instagram, Mark Zuckerberg notes.

Warning labels on 40 million Facebook posts

Contributions from Facebook’s 60 fact-checking partner organizations around the world, such as AFP, will also be featured in the Information Center. If information turns out to be false or imprecise but does not represent an “imminent danger”, the social network attaches a “warning label” to it. Some 40 million Facebook posts received this label during the month of March, and in 95% of cases the label deterred users from viewing the flagged content, according to Facebook.

Many observers, including the NGO Avaaz, have denounced Facebook’s inability to curb the spread of false news, with serious consequences. “Facebook is at the epicenter of this disinformation crisis,” said Fadi Quran, an Avaaz official, in a press release. “But the company is taking a step today to clean up this toxic ecosystem, by becoming the first social network that alerts users exposed to false information about the coronavirus and redirects them to information that can save lives.”

Measures to limit disinformation via WhatsApp

Facebook’s messaging service WhatsApp had already implemented new measures to combat disinformation in early April: WhatsApp users can now forward viral messages to only one contact at a time, in order to limit their spread.

Agence France-Presse participates, in more than 30 countries and 12 languages, in “Third-Party Fact-Checking”, a content-verification program developed by Facebook. Under this program, launched in December 2016, Facebook pays around sixty media outlets for the use of their fact-checks on its platform and on Instagram. If information is rated false or misleading by one of these outlets, users are less likely to see it appear in their news feed. And if they do see it or try to share it, the platform suggests that they read the verification article. The participating outlets remain entirely free in their choice and treatment of subjects.