7.8 Million YouTube Videos Were Removed During the Third Quarter

Over 224 million comments were also taken down

Spam and adult content were the most common reasons for removal

YouTube removed 7.8 million videos during the third quarter of 2018, the company revealed in its most recent community guidelines enforcement report. Machines originally detected 81 percent of those videos, and 74.5 percent of the machine-detected videos never received a single view.

The Google-owned video site said in a blog post that more than 90 percent of channels and over 80 percent of videos removed in September were pulled for violating its policies on spam or adult content.

A channel receives one strike when a video is removed, and channels with repeated violations are removed, as are those that “contain a single egregious violation,” such as child sexual exploitation.

YouTube said more than 90 percent of videos removed in September for violent extremism or child safety had tallied fewer than 10 views when they were taken down.

The company wrote in its blog post, “Over the past year, we’ve strengthened our child safety enforcement, regularly consulting with experts to make sure our policies capture a broad range of content that may be harmful to children, including things like minors fighting or engaging in potentially dangerous dares. Accordingly, we saw that 10.2 percent of video removals were for child safety, while child sexual abuse material represents a fraction of a percent of the content we remove.”

YouTube also provided an update on the combination of machine learning and human reviewers that it used to flag, review and remove spam, hate speech and other abuse in comments on its platform.

Over 224 million comments were removed for violating YouTube’s community guidelines during the third quarter, with the majority of those being removed for spam.

YouTube said the number of comments removed “represents a fraction of the billions of comments posted on YouTube each quarter,” adding that its daily users were 11 percent more likely to comment on videos than they were last year.

The video site introduced tools for creators in November 2016, enabling them to hold all comments for review or to automatically hold comments containing links or potentially offensive content.

YouTube said over 1 million video creators are now using those tools.


David Cohen (david.cohen@adweek.com) is editor of Adweek's Social Pro Daily.
{"taxonomy":"","sortby":"","label":"","shouldShow":""}