YouTube videos with thumbnails depicting women involved in various sexual acts with horses and dogs populate the top search results on the video platform, according to a report from BuzzFeed News. Some of these videos, which can be easily surfaced by YouTube's algorithmic recommendation engine after searches for innocuous phrases like "girl and her horse," have millions of views and have been on the platform for months.
Of course, YouTube videos that actually depicted such acts would be more easily caught by the company's algorithmic filters, its user reporting system, and its human content moderators. Harder to find and remove are videos that use obscene, graphic images as thumbnails, paired with clickbait titles, to attract viewers and generate more advertising revenue. None of the videos with bestiality thumbnails appears to actually show bestiality.
This is not an isolated problem, but one more example of how YouTube's fundamental structure can be exploited by bad actors, many of whom game the platform's rules to generate advertising revenue for click farms or for other nefarious purposes. Like Facebook and Twitter, YouTube has struggled in recent years to control the vast amount of user-generated content uploaded every day. Although YouTube has powerful algorithms that help flag content and has hired many thousands of human moderators, it seems that every week a new problem emerges showing how fragile and poorly equipped the company's moderation system is to deal with content that violates its rules or is otherwise illegal.
Thus, a moderation problem that began many years ago with copyrighted content has expanded to include terrorist propaganda and recruitment videos, child exploitation content, pornography and other explicit material, and millions upon millions of other videos unsuitable for advertisers. YouTube has made substantial changes to its platform to appease advertisers, stifle criticism, and improve the safety and legality of its product. Those changes include a promise to hire more human moderators, mass demonetization and banning of accounts, and updates to its terms of service and site policies.
According to BuzzFeed News, YouTube scrubbed its platform of the videos and accounts responsible for the bestiality-thumbnail content once the organization reported the problem. In a story published by The New York Times today, YouTube said it deleted a total of 8.28 million videos in 2017, with about 80 percent of those takedowns originating from a flag by its artificial-intelligence moderation system. However, as long as YouTube relies primarily on software to catch problematic videos, it will have to keep manually purging content like bestiality thumbnails, and whatever other dark corner of the internet surfaces next, from its public search results and its recommendation engine.