Facebook stops flagging fake news because it was making the problem worse
Fake news -- or disinformation, as it used to be called -- has become an increasingly serious problem as false information spreads rapidly online through social media.
Facebook tried to combat the problem by marking fake news with Disputed flags but, having discovered that this approach was actually making the problem worse, the social network is dropping it. Instead, Related Articles will be used to provide context and offer alternative takes on stories.
Citing academic research, Facebook says that highlighting stories as fake news with a warning flag or similar marker did nothing to stop people from believing the content to be true. Quite the reverse, in fact: it "may actually entrench deeply held beliefs -- the opposite effect to what we intended."
In a blog post, Facebook's Tessa Lyons says that the social network will be "redoubling" its efforts in 2018 to fight fake news. It will rely on external fact-checkers to determine the truth of stories, and demote those deemed to be false:
We are starting a new initiative to better understand how people decide whether information is accurate or not based on the news sources they depend upon. This will not directly impact News Feed in the near term. However, it may help us better measure our success in improving the quality of information on Facebook over time.
False news undermines the unique value that Facebook offers: the ability for you to connect with family and friends in meaningful ways. It's why we're investing in better technology and more people to help prevent the spread of misinformation. Overall, we're making progress. Demoting false news (as identified by fact-checkers) is one of our best weapons because demoted articles typically lose 80 percent of their traffic. This destroys the economic incentives spammers and troll farms have to generate these articles in the first place.