YouTube reveals it removed 8 million videos from the site in the first three months of 2018

Following a statement last year in which it promised users greater transparency into how it tackles “problematic content”, YouTube has published its first-ever Community Guidelines Report. To let users check the status of videos they’ve flagged for review, the site has also announced the Reporting History Dashboard.

The report shows that during the first three months of 2018, YouTube removed more than 8 million videos from its site. The vast majority of these – nearly 6.7 million – were flagged automatically by machines; a further 1.1 million were flagged by YouTube’s trusted flaggers, and 400,000 by ordinary users.

“The majority of these 8 million videos were mostly spam or people attempting to upload adult content”, YouTube summarised in a blog post.

Of the 6.7 million videos flagged by machines, the report claims 76% were removed before they received a single view, and YouTube claims its machine learning is especially effective at identifying and removing videos that contain violent extremism.

“At the beginning of 2017, 8% of the videos flagged and removed for violent extremism were taken down with fewer than ten views. We introduced machine learning flagging in June 2017. Now more than half of the videos we remove for violent extremism have fewer than ten views.”

In total, 9.3 million videos were flagged by humans for potentially violating YouTube’s community guidelines, meaning only around one in nine flags ultimately resulted in a video being taken down. India accounted for the highest volume of human flags, with the US in second place and the UK some way down the list in sixth.

“This regular update will help show the progress we’re making in removing violative content from our platform,” the blog post explained. “By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal, and policy removal reasons.”

The report should be well received by those who feel that YouTube doesn’t do enough to remove offensive or inappropriate content from its site. At the start of the year, the site removed Logan Paul from the Google Preferred programme after he posted a video depicting a dead body in Japan’s Aokigahara forest.

Now, when you manually report a video, you can check its status using the Reporting History Dashboard. If the content is deemed to violate YouTube’s terms, it’ll be removed and the dashboard will be updated accordingly.
