YouTube said it removed more than 9 million videos in the fourth quarter of 2017, with 6.7 million of those flagged for review by machines rather than humans.
The Google-owned video site said most of the removed videos were spam or adult content, and that 76 percent of the 6.7 million machine-flagged videos were removed before receiving a single view.
YouTube added in a blog post that at the beginning of 2017, only 8 percent of videos flagged and removed for violent extremism were taken down with fewer than 10 views; after the introduction of machine-learning flagging last June, that figure jumped to more than 50 percent.
These statistics come from YouTube's new quarterly report, introduced to provide transparency on how its community guidelines are enforced. YouTube said it plans to add data on comments, speed of removal and policy removal reasons by the end of the year.
The video site also introduced a reporting history dashboard that lets users who have flagged videos for review track the status of those reviews.
YouTube stressed in its blog post that automation has not replaced human moderators: “Deploying machine learning actually means more people reviewing content, not fewer. Our systems rely on human review to assess whether content violates our policies.”