YouTube is often criticized for seemingly not doing enough to combat inappropriate videos on its site, but according to the Google-owned firm’s first quarterly moderation report, it took down a massive 8.3 million videos in the last quarter of 2017.
The report arrives after YouTube last year promised more transparency in how it addresses abuse on the site. It said the quarterly review is an initial step in dealing with the problem and would “show the progress [it’s] making in removing violative content from [its] platform.”
“The majority of these 8m videos were spam or people attempting to upload adult content and represent a fraction of a percent of YouTube’s total views during this time period,” the company said.
Of the removed videos, 6.7 million were first flagged by machines and then forwarded to human moderators for deletion; 76 percent of these were taken down before receiving a single view. A further 1.1 million removals stemmed from flags by members of the Trusted Flagger program, while 402,335 came from flags by general YouTube users.
In total, humans flagged 9.3 million videos, meaning only around one in nine of these flags resulted in the video being removed. 95 percent of the flags came from YouTube users, with the rest from the Trusted Flagger program; most cited sexual content or spam. India flagged the most content, followed by the United States, Brazil and Russia.
Last year, the company was lambasted for allowing disturbing content on its YouTube Kids app. Just a few months ago, a conspiracy clip claiming that one of the students who survived the Florida school shooting was a crisis actor became the site's top trending video.
Back in December, YouTube promised to increase its number of human content reviewers to over 10,000 this year, up from the “thousands” that did the job in 2017. The site said it has already “staffed the majority of additional roles needed to reach [its] contribution to meeting that goal.”