Facebook Tries to Tackle Vaccine Misinformation With New Updates

Site will not recommend groups, pages that spread false info

Facebook will try to curtail the spread of vaccine misinformation on its platform by no longer recommending groups and pages that post false information about vaccines, and by rejecting ads that do the same, the company announced Thursday.

In a blog post, Facebook VP of global policy management Monika Bickert said the company will also no longer allow advertisers to target individuals on Facebook based on their potential interest in “vaccine controversies.” Vaccine misinformation will no longer appear in recommended content or in search predictions, the company said, and anti-vaccination content will no longer be recommended on Instagram.

Vaccine hoaxes as defined by the World Health Organization and U.S. Centers for Disease Control and Prevention will serve as Facebook’s barometer for taking action, the company said.

Facebook has repeatedly been implicated in the spread of false information about vaccinations. In February, The Guardian reported on how easily anti-vaccine propaganda can spread on the platform. And earlier this week, an 18-year-old from Ohio testified before a Senate committee about how his mother declined to vaccinate him as a child because of anti-vaccination misinformation she had read in “social media groups.”

Facebook isn’t the only social media company grappling with the proliferation of vaccine misinformation. YouTube, which has come under fire for recommending videos containing false claims about vaccines, recently demonetized anti-vaccination videos over brand safety concerns. Pinterest has taken even more aggressive steps, banning boards from anti-vaccination groups and blacklisting a number of search terms related to vaccination conspiracy theories.

Last month, U.S. Rep. Adam Schiff, D-Calif., sent a letter to Facebook CEO Mark Zuckerberg and Google CEO Sundar Pichai, demanding information about how the companies address medical misinformation.

Bickert said Facebook would share more details in the future about how it may surface “more accurate information” about vaccines at the top of search results.

Facebook’s move pulls back the curtain on one intriguing aspect of the company: Over the last year, executives have insisted that they don’t want to be the arbiters of truth, instead leaving content-moderation decisions to low-paid contractors. Thursday’s blog post illustrates that when faced with mounting pressure, Facebook will police some forms of content. The calculus behind when Facebook intervenes to address potentially dangerous content on its platform—and when it chooses not to—remains hazy.


Kelsey Sutton (@kelseymsutton, kelsey.sutton@adweek.com) is the streaming editor at Adweek, where she covers the business of streaming television.
Publish date: March 7, 2019