The online video-sharing service announced that it removed 8.3 million videos from its platform between October and December 2017.
To catch such content, YouTube relies on two types of filtering: machine-learning algorithms and user reports. Approximately 6.7 million of the deleted videos were first flagged by machines, and 76 percent of those were removed before receiving a single view.
In its official blog post, YouTube credits machine learning with speeding up content removal, noting that the number of views such videos receive before deletion has dropped dramatically since June last year.
Even so, user reports remained significant: approximately 9.3 million videos were flagged by users for review during the same period.
Most of the deleted content involved pornography, spam, violence, or hate speech.
Users were also responsible for reporting nearly 95 percent of deleted videos involving piracy or content re-uploaded without the copyright holder's permission.
To make it easier to deal with inappropriate videos, YouTube also introduced a reporting dashboard that lets users check the status of videos they have flagged and reported. So far, the feature can only be used to track videos that a user has asked YouTube to review, including those placed under age restrictions.