Yesterday afternoon Facebook announced more detailed abuse-reporting features for its users. Facebook wants to protect users who experience or witness “bullying, harassment, unwanted contact or offensive behavior,” so the company is rolling out much more granular reporting options. When reporting an offending image, users can now select from “nudity or pornography, drug use, excessive gore or violence, attacks individual or group, advertisement or spam or infringes on your intellectual property.”
Videos, in contrast, can be reported under a different set of categories: “Not a personal video, nudity or pornography, drug use, excessive gore or violent, racist/hate speech, or targets me or a friend.” With more granular reports, Facebook can route issues to the right people within the company and respond more quickly. With over 300 million users, scaling content filtering has been a significant challenge, since it is handled by a relatively small team of employees.
As Jessica Ghastin writes in the company’s blog post, “The information you provide helps our international team of professional reviewers prioritize reports and know what they’re looking for when reviewing the content.” Protecting users matters for keeping them coming back to the site. It’s a small upgrade, but it underscores how important efficient filtering mechanisms are to Facebook.