(CNN) – YouTube is once again trying to clean up its platform.
On Monday, the Google-owned company said it had deleted more than eight million videos between October and December for violating its community guidelines. Most of the videos were spam or "adult content."
YouTube disclosed the figures in its first quarterly report on how it enforces those guidelines.
"This regular update will help show the progress we're making in removing content that violates our policies from our platform," the video-sharing site said in a blog post.
According to the report, computers detect most of the videos that end up being deleted. The company said 6.7 million videos were first flagged for review by machines, not humans. Of those, 76% were removed before they received any views.
YouTube has faced complaints from critics and advertisers who say the company has trouble handling offensive content on its site.
Last week, a CNN investigation found ads from more than 300 companies and organizations running on YouTube channels promoting white nationalists, Nazis, pedophilia, North Korean propaganda and other controversial or extremist content. In the past, ads have appeared on ISIS videos and other extremist and hateful content.
"I think it's an underlying problem with the [YouTube] business model," Nicole Perrin, senior analyst at eMarketer, told CNN last week. "For years [they] encouraged creators to put almost anything on the site, which led to an explosion of content."
YouTube has about a billion users who watch a billion hours of video every day, making it a difficult platform to police.
Daniel Ives, head of technology research at GBH Insights, said the company’s new report on its efforts to remove problematic videos is a step in the right direction.
"YouTube and Google face increasing pressures to step up their screening and monitoring efforts around inappropriate content," he said. "Transparency is key on this hot-button issue, and these quarterly blog posts are a sign that Google is focusing aggressively on this area for the coming years."
Google has committed to hiring 10,000 employees across the company by the end of this year to address content that violates its policies. YouTube said Monday it has filled the majority of the additional roles needed to meet its contribution to that goal.
YouTube also said it will add more details to the quarterly reports by the end of the year, such as data on comments, the speed of removals, and the reasons videos were taken down. It also announced a "Reports History" dashboard where users can check the status of videos they have flagged for review.
On Monday, Google's parent company, Alphabet, said profits reached $9.4 billion in the first three months of 2018, a big jump from the $5.4 billion it reported a year earlier.