Internet

YouTube removes more than eight million unsuitable videos in three months

YouTube has published a transparency report on the enforcement of its community guidelines, which prohibit content such as pornography, incitement to violence, harassment and hate speech, among other categories. Aided by its machine learning algorithms, the company removed almost 8.3 million videos from the platform between October and December 2017.

YouTube detects videos with inappropriate content faster and faster

According to the report, nearly 6.7 million of those deleted videos were first flagged by machines, and 76 percent were removed before anyone saw them. The use of advanced machine learning algorithms has dramatically reduced the response time for removing inappropriate content since these techniques were introduced in June 2017.

Beyond the machines, human users flagged more than 9.3 million inappropriate videos in the same period, most of them for content related to sexuality, spam, or hate speech and violence. Almost 95 percent of these videos were flagged by ordinary users, while much of the remainder came from a group of specialized users that YouTube calls Trusted Flaggers.

YouTube is also rolling out a reporting history dashboard, so users can track the status of the videos they have flagged while the company reviews them. The new dashboard is available now, and it also shows videos that were age-restricted, rather than removed entirely, after the review process. What do you think of this measure adopted by YouTube?

Source: The Guardian
