How will Facebook fight the fake news phenomenon? Poorly... and stupid, lazy users don't help


Facebook has many problems, but the most recent and prominent of them has been the issue of fake news. So serious is the problem that some have blamed fake news stories on Facebook for the fact that Donald Trump is now president-elect.

Mark Zuckerberg has made it fairly clear he doesn't subscribe to this particular idea, but he is certainly aware that fake news is a problem. Under pressure to do something about it -- bearing in mind that for a worrying percentage of people, Facebook is their only source of news -- Zuckerberg wants to make it clear not only that "we take misinformation seriously", but also that there are plans to tackle the problem. But they're not very good.

Having previously indicated that he was unconcerned with the issue of misinformation online, the Facebook CEO has since changed his tune somewhat. He is now willing to acknowledge that there is something of a problem and is keen to be seen to be doing something about it. But it is, of course, "complex, both technically and philosophically" (his words).

You'd be right to see this as something of a cop-out.

You'd be equally justified in seeing Zuckerberg's statement that "we do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties" as something of a cop-out.

This last statement can be taken in a number of ways. The first is that it is passing the buck. Why should Facebook be responsible for fact-checking everything that's posted on the social network? But equally, should the job be left instead to a team of neighbourhood watchers with nothing better to do? And trusted third parties? How can we be sure of impartiality?

These questions crop up before we even consider the concept of what truth actually is. In some instances, the truth is clear -- a square has right angles at each corner. But when it comes to the reporting of a political situation, a war, or just about anything else deemed newsworthy, things start to get a little murky. Without editorializing, it is damn near impossible to report any given event in a way that will be seen as presenting the complete truth by everyone who reads it. A story that said "Donald Trump is black" would, of course, be easy to disprove, but most claims are not so clear-cut.

With the US election, the fake news problem has focused mainly on the views of the right and the left, but things are rarely that binary. A right-wing newspaper might be ridiculed by a left-wing reader for writing utter nonsense, but does that mean that everything in said paper is untrue? And, of course, vice versa for a right-wing reader of a left-wing paper.

The problem is extremely complicated and highly nuanced. Despite this, Mark Zuckerberg has taken to Facebook to outline what he wants to do. His post is a little light on details, but it does set out a few areas that will be the focus of projects to weed out fake news:

- Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.

- Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.

- Third party verification. There are many respected fact checking organizations and, while we have reached out to some, we plan to learn from many more.

- Warnings. We are exploring labelling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.

- Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.

- Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.

- Listening. We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact checking systems and learn from them.
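To make the "stronger detection" point concrete: at its core it means predicting, from a story's content and signals, whether users are likely to flag it as false. Facebook has not published how its systems actually work, so the following is only a minimal sketch of the general idea, using a generic text classifier with invented headlines and labels:

```python
# Illustrative only: a toy "likely to be flagged as false" classifier.
# This is NOT Facebook's system; it just shows the shape of the approach.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: headlines plus whether users flagged them as false.
headlines = [
    "Pope endorses candidate in shock announcement",
    "Local council approves new library budget",
    "Miracle cure doctors don't want you to know about",
    "Transport strike to affect morning commuters",
]
was_flagged = [1, 0, 1, 0]  # 1 = reported as false by users

# TF-IDF text features feeding a logistic regression: a deliberately simple baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(headlines, was_flagged)

# Score a new headline for how likely users are to flag it before anyone actually does.
print(model.predict_proba(["Celebrity secretly replaced by body double"])[0][1])
```

A real system would obviously draw on far richer signals (sharing patterns, source history, previous flags), but the basic loop -- learn from past flags, score new stories before they spread -- is what the bullet describes.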

There remain many problems here. Any system of news flagging that relies on input from the community is clearly going to be open to abuse. It’s naïve to think otherwise. Making it easier to report stories as fake only exacerbates that problem. Using third-party fact-checkers might seem like a good idea -- and it gets Facebook off the hook a little -- but this will introduce a further element of distrust. Are the third parties on Facebook's payroll? Have they been selected because they hold a certain ideology? Will fact-checkers that hold extremist views be ignored or embraced?

Warning labels will have little to no effect on the spread of fake news. They ignore a very important factor: the stupidity and laziness of many people. Even if a story has a 'fake' label slapped on it, it is safe to assume that many people will not see it. They may ignore it. They may believe the label is wrong. They may believe the story has been labelled as part of a conspiracy. Stories that are labelled as fake will still be shared, and they will still spread. How will that be factored in?

What about the issue of satirical stories? There are numerous high-profile news satire sites that deal in fake news -- that's why they exist. Sure, it might be easy to whitelist the big names, but this could then be unfair to newcomers and smaller players. Again, the stupidity of readers comes into play here. An intelligent reader will consume a piece of satire and appreciate it for what it is. A less intelligent reader might read it and accept it as truth. Very few readers -- through a combination of stupidity and laziness -- would bother to check the veracity of a story if they were in doubt.

Facebook talks about interfering with the ad revenue of sites delivering fake news. This presents a problem for satirical and humorous sites too. It also means Facebook runs the risk of, essentially, playing editor of the news. The company has said time and time again that it is not a media company, so it needs to be very careful not to be seen acting like one -- and that is exactly what will happen when it starts tinkering even further with the presentation of news.

The truth is, after all, something of a rainbow. Or, to use a famous quote: this is my truth, tell me yours.

Photo credit: Alexey Boldin / Shutterstock

