8 in 10 Americans Don’t Trust Platforms to Moderate Content

Poll shows partisan divide on faith in social media companies to self-police

Most Americans don't trust social media sites to police themselves. (Illustration: Amira Lin)

Americans have little faith social media companies will appropriately moderate content on their platforms, according to a new poll from the Knight Foundation and Gallup.

According to the survey released today, 84% of Americans say they “do not have much or any trust at all in social media companies to make the right decisions about what people can post on their sites.”

Most respondents indicated social media companies are not doing enough to remove harmful content on their sites, though there was strong disagreement along party lines.

Among respondents, 71% of Democrats and 54% of independents said these companies are “not tough enough,” while only 32% of Republicans agreed.

In high-profile decisions, Facebook and Twitter recently made divergent calls on moderating President Donald Trump’s misleading and incendiary posts.

These divergent decisions were noted by John Sands, the Knight Foundation’s director of learning and impact, who told Adweek that there is a “significant trust gap” between Americans and social media companies.

“It remains to be seen which will be most effective at closing the trust gap that currently exists between these tech companies and the Americans who increasingly rely on their platforms for information and interaction,” Sands said.

Knight and Gallup also polled Americans on content oversight boards like the one Facebook recently introduced; 81% of respondents called such boards a “good idea” or a “very good idea.”

Additionally, the poll asked about Section 230 of the Communications Decency Act, an obscure statute that has provided liability protection to online platforms for the past quarter century.

Of those who responded, 66% said they favor keeping the law, which shields platforms from most lawsuits over user-generated content. However, 54% said the law has done “more harm than good because it has not made the companies accountable for illegal content on their sites and apps.”

Trump recently signed an executive order threatening Section 230 protections, though the order carries little legal weight since only Congress can amend the statute. Still, multiple bills introduced in recent months seek to change these protections.

With content moderation front and center amid the Covid-19 pandemic, and with the president’s posts repeatedly testing the limits of acceptable speech online, the largest players in social media are grappling with how to better police their own platforms.

Sands characterized these efforts as an “ongoing experiment” to win back user trust.

Scott Nover (@ScottNover, scott.nover@adweek.com) is a platforms reporter at Adweek, covering social media companies and their influence.