Facebook is doing more to promote reliable information about coronavirus


The coronavirus pandemic has people around the world searching for information about what they should and shouldn't be doing, as well as news about the spread of the disease. But there is a lot of misinformation out there, and social media platforms are breeding grounds for such harmful content.

Facebook has been taking steps to counter misinformation about COVID-19, not only on the main Facebook platform, but also on Instagram, WhatsApp and Facebook Messenger. Now the company is expanding its efforts to connect people with trustworthy information about coronavirus.


Facebook is eager to ensure that people are getting coronavirus-related information from reliable sources, and this is why it includes the COVID-19 Information Center at the top of the news feed. This includes information from the World Health Organization (WHO) as well as real-time data about infections. Pop-ups are also appearing in Instagram when people search for COVID-19, and information is shown in Messenger as well.

A great deal of misinformation is shared, and quickly spread, via WhatsApp, and Facebook says:

People can sign up to receive the WHO Health Alert on WhatsApp, a daily report with the latest numbers of COVID-19 cases. It also includes tips on how to prevent the spread of the disease as well as answers to commonly asked questions that people can easily send to their friends and family. We’re also working directly with health ministries in the UK, India, Indonesia, Singapore, Israel, South Africa and other countries to provide similar health updates specific to those nations. In the last week, over 100 million messages have been sent by these organizations to WhatsApp users. In addition, we donated $1 million to the International Fact-Checking Network to expand the presence of fact-checking organizations on WhatsApp, so people can submit rumors they find directly to fact-checkers.

To help counter COVID-19 hoaxes, Facebook says that it removes "coronavirus-related misinformation that could contribute to imminent physical harm". On its main social platform, Facebook is guided by information from the WHO. The company says:

For claims that don't directly result in physical harm, like conspiracy theories about the origin of the virus, we continue to work with our network of over 55 fact-checking partners covering over 45 languages to debunk these claims. To support the global fact-checking community's work on COVID-19, we partnered with the International Fact-Checking Network to launch a $1 million grant program to increase their capacity during this time.

Once a post is rated false by a fact-checker, we reduce its distribution so fewer people see it, and we show strong warning labels and notifications to people who still come across it, try to share it or already have. This helps give more context when these hoaxes appear elsewhere online, over SMS or offline in conversations with friends and family. On Instagram, we remove COVID-19 accounts from recommendations and we're working to remove some COVID-19 related content from Explore, unless posted by a credible health organization.

Facebook has already introduced limits in WhatsApp that prevent people from forwarding messages too many times, in an attempt to stop false information spreading virally. The measure appears to be having some effect, as the company now plans to introduce similar limits in Facebook Messenger, restricting the number of chats to which someone can forward a message at one time.

Image credit: shellygraphy / Shutterstock

© 1998-2020 BetaNews, Inc. All Rights Reserved.