Facebook Fires Another Salvo In the Battle vs. Fake News

The social network is looking to disrupt economic incentives because most false news is financially motivated

Facebook’s latest offensive in its war against fake news is an educational tool to help users spot it, created in conjunction with nonprofit First Draft.

Vice president of News Feed Adam Mosseri introduced the new tool in a Newsroom post, saying that it will appear atop the News Feeds of users in 14 countries “for a few days,” and adding:

When people click on this educational tool at the top of their News Feed, they will see more information and resources in the Facebook Help Center, including tips on how to spot false news, such as checking the URL of the site, investigating the source and looking for other reports on the topic.

(Video: FacebookFirstDraftFakeNewsTool, from SocialTimes on Vimeo)

The new tool is also available in French, German and Italian.

In a separate Newsroom post, Mosseri also detailed the three key areas the social network is focusing on and steps it is taking in each of those areas.

Pointing out that most fake news is financially motivated, he wrote:

When it comes to fighting false news, one of the most effective approaches is removing the economic incentives for traffickers of misinformation. We’ve found that a lot of fake news is financially motivated. These spammers make money by masquerading as legitimate news publishers and posting hoaxes that get people to visit their sites, which are often mostly ads.

Mosseri said Facebook is attacking this problem by better identifying fake news through the social network’s community and third-party fact-checking organizations; making it more difficult for people responsible for posting fake news to buy ads on its platform; and updating the technology Facebook uses to detect fake accounts.

He also provided an overview of new products Facebook is developing:

  • Ranking improvements: We’re always looking to improve News Feed by listening to what the community tells us. We’ve found opportunities like the fact that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way. We’re continuing to test this signal and others in News Feed ranking in order to reduce the prevalence of false news content.
  • Easier reporting: We’ve always relied on our community to determine what is valuable and what is not. We’re testing ways to make it easier to report a false news story if you see one on Facebook, which you can do by clicking the upper-right corner of a post. Stories that are flagged as false by our community then might show up lower in your feed.
  • Working with partners: We believe that providing more context can help people decide for themselves what to trust and what to share. We’ve started a program to work with independent third-party fact-checking organizations. We’ll use the reports from our community, along with other signals, to send stories to these organizations. If the fact-checking organizations identify a story as false, it will get flagged as disputed, and there will be a link to a corresponding article explaining why. Stories that have been disputed also appear lower in News Feed.

Finally, Mosseri detailed the steps Facebook has taken to help its users make more informed decisions about the content they see, highlighting the formation of the Facebook Journalism Project in January and the social network’s participation as a founding funder of the News Integrity Initiative, which was announced earlier this week.

David Cohen is editor of Adweek's Social Pro Daily (david.cohen@adweek.com).