Facebook Detailed the Results of a Civil Rights Audit Led by Laura Murphy

The NAACP kicked off a seven-day boycott of the social network

Facebook is working with civil rights law firm Relman, Dane & Colfax.

On the same day that the NAACP kicked off a seven-day boycott of Facebook and Instagram over the platforms’ data and privacy issues and their impact on African Americans, the social network released the results of the civil rights audit it has been conducting over the past few months.

The NAACP is leading the #LogOutFacebook campaign, which starts Tuesday and runs for seven days, and the organization is encouraging its partners, social media followers and supporters to follow suit.

The NAACP said in a press release announcing the boycott that it returned a donation that it recently received from Facebook, and president and CEO Derrick Johnson added, “Facebook’s engagement with partisan firms, its targeting of political opponents, the spread of misinformation and the utilization of Facebook for propaganda promoting disingenuous portrayals of the African American community is reprehensible.”

Facebook chief operating officer Sheryl Sandberg introduced the audit results in a Newsroom post, writing, “In May, we accepted the call to undertake a civil rights audit. We asked Laura Murphy, a highly respected civil rights and civil liberties leader, to guide the audit. After speaking with more than 90 civil rights organizations, today Laura is providing an important update on our progress.”

Murphy said her team has interviewed leading advocates from 90 different organizations since this summer, with more to come, and they have also met with representatives from the policy, product and enforcement teams at Facebook.

She added that Facebook is working with civil rights law firm Relman, Dane & Colfax.

Murphy highlighted certain issues that were brought up in her group’s conversations with advocates and experts:

  • The use of Facebook to intimidate voters and suppress voter participation, particularly among minority groups, and the absence of a public-facing policy on preventing this from happening.
  • The implementation of protocols and a civil rights infrastructure to ensure that these issues are considered before products, services and policies debut.
  • Better content moderation and enforcement to both protect minority groups from hate speech and ensure that activists and civil rights advocates are not censored.
  • Ensuring that Facebook’s ad targeting features cannot be used to exclude ethnic and religious minorities, immigrants, older workers, LGBTQ (lesbian, gay, bisexual, transgender and queer) individuals, women, families with children and other protected classes.
  • Promoting greater employee diversity in all functions and at all levels of the company.
  • Ensuring that artificial intelligence tools—such as machine learning, facial recognition and algorithms—do not facilitate bias.
  • Developing privacy measures that both protect civil rights and prevent unlawful discrimination.
  • Increasing transparency in Facebook’s policy-making process, enforcement and operations.

Murphy also discussed the misuse of Facebook’s platform during the run-up to the 2016 U.S. presidential election and the urgency around the recent midterm elections.

She wrote, “It is now clear that Facebook was slow to understand the IRA’s (Russia’s Internet Research Agency) activities, a point the company has acknowledged. Eventually, Facebook did take a number of steps to address concerns that the platform had been used to spread misinformation designed to influence elections, intimidate voters and suppress voting. These steps included significant changes related to advertisers’ posting of political and issue ads, along with the creation of a searchable database, expanding the third-party fact-checking program, tightening restrictions on advertising and removing fake accounts that were used to spread misinformation and inflammatory and divisive ads.”

Murphy continued, “While these were important steps in the right direction, the civil rights community continued to express concern about the dangers of election-related misinformation spreading on Facebook. In particular, civil rights groups identified the possibility of false information regarding voter registration requirements or voting logistics. Before the rise of social media, flyers containing false voting information were posted in minority neighborhoods—on street corners or in churches, schools and public parks. Today, the same falsehoods may spread online as memes in Facebook’s News Feed, with the potential to reach millions of people in a short amount of time. Containing the spread of this type of content was and remains imperative for the company to address.”

Civil rights advocates and experts that Murphy and her team spoke with over the summer shared the following recommendations for Facebook prior to November’s election:

  • Prevent misinformation and voter suppression, and develop clear and public policies on that topic.
  • Increase transparency regarding technical aspects of Facebook’s election integrity efforts.
  • Seek guidance from voting rights experts and have them train Facebook staff on how to respond to voter suppression content.

Murphy outlined the steps Facebook took to attempt to safeguard the election:

  • Strengthened its policy prohibiting voter suppression in its community standards.
  • Launched initiatives to boost voter registration, along with reminders on Election Day.
  • Retained voting experts as outside advisers to its policy and operations teams.
  • Added new options for users to report incorrect voting information found on Facebook.
  • Established a specific reporting channel for state election authorities, enabling them to flag potential voter suppression content to the relevant policy and operations teams.
  • Partnered with voting rights and election protection groups, creating dedicated channels through which those groups could flag potentially suppressive content for review and action.
  • Set up a war room at its headquarters in Menlo Park, Calif., dedicated to safeguarding the election.
  • Continued its aggressive approach to taking down fake accounts.

Murphy also pointed out other steps Facebook has taken this year, including:

  • Released a more detailed version of its community standards in April.
  • Published a detailed report in May on content removed for violating its policies against adult nudity and sexual activity, fake accounts, hate speech, spam, terrorist propaganda and violence and graphic content.
  • Introduced the View Ads feature in June, which enables people to see all campaigns currently being run by an advertiser.
  • Removed thousands of ad targeting options for ads related to housing, employment, credit, insurance and public accommodations if those options could be misunderstood as describing groups on the basis of race, creed, color, national origin, veteran or military status, sexual orientation and disability status.
  • Implemented a new commerce discrimination policy that prohibits discriminatory language in users’ commerce-related posts on Facebook Marketplace and in buy-sell groups.
  • Convened a multidisciplinary team to study how to ensure fairness in tools such as algorithms and AI.

Finally, Murphy detailed the first two tasks she and her group will focus on in 2019:

  • Exploring various approaches to content moderation, as well as measures for dealing with censorship and potentially discriminatory content on Facebook’s platform.
  • Creating the civil rights accountability infrastructure she mentioned above in order to ensure that civil rights are considered “on the front end” when Facebook develops new products, features and policies.

She concluded, “For the past several years, civil rights groups have consistently expressed, both publicly and privately, their deeply held concerns about Facebook’s products, policies and practices and their implications on civil and human rights. The work that has been done over the past six months is an attempt to capture and consolidate their concerns to produce meaningful results. Given Facebook’s scope and scale, this continues to be a challenge. That being said, in the first six months of this audit, we have witnessed some progress and tangible results, including policy changes, improvements to enforcement, and greater transparency in certain areas. Importantly, Facebook has sought to deepen its engagement with the civil rights community through this process, and I want to thank my colleagues in the community for their advice, insight, and time, which directly led to these improvements. I fully realize that these leaders expect much more, and I am committed to working with them to help achieve the ambitious goals we have set, and which have the potential to positively impact Facebook’s global community. I look forward to issuing another update in 2019.”

David Cohen is editor of Adweek’s Social Pro Daily. david.cohen@adweek.com