Fake news allegedly plagued the U.S. Presidential election. Many of these stories were shared on Facebook, which has decided to take action against them.
In a nutshell, Facebook decided to:
- allow users to report stories they think are fake. One possible drawback is that people tend to report real news they don’t like as fake. Facebook seems quite cautious about this functionality (“We’re testing several ways to make it easier to report a hoax”)
- flag disputed stories as such, with independent fact-checkers providing the “disputed” status. Facebook will ask these organizations to fact-check articles, not the other way around
- demote disputed stories: “Stories that have been disputed may also appear lower in News Feed”, which means users will be less likely to come across them
- display a warning when users try to share a disputed story
Besides these functional ways to fight fake news, Facebook also decided to take financial action against fake news:
- disputed stories can’t be turned into ads or promoted
- spoofed domains are now banned
The real story behind fake news is that many of these stories aren’t shared by politically motivated organizations but by people who want to make a lot of (easy) money with click-bait stories, like teenagers in Macedonia.
To be honest, no one actually knows how to properly fight fake news. Facebook’s measures are interesting, but it will take some time to figure out whether they are effective. At the very least, it’s a good thing that Facebook decided to do something about fake news.