YouTube removes sexualised videos of children

Following a major backlash from advertisers, YouTube has removed sexualised videos of children from the platform. As part of the move, it has closed more than 270 accounts that purportedly promoted illicit sexual content.

The move responds to widespread criticism from the public and advertisers over content uploaded to the streaming platform, some of which attracted comments from suspected pedophiles.


In a statement, the Google-owned company said it had axed more than 270 accounts and removed more than 150,000 videos from the platform. It also turned off comments on over 625,000 videos that drew child predators.

"Over the past week we removed ads from nearly 2 million videos and over 50,000 channels masquerading as family-friendly content. Content that endangers children is abhorrent and unacceptable to us," reads YouTube's statement to VICE News.

YouTube has long been bombarded with complaints about unsuitable content. Despite its drive to take down such material, volunteer moderators have previously admitted that filtering comments from child predators is a tedious task, and an estimated 50,000 to 100,000 accounts were still posting inappropriate comments.


BuzzFeed first reported that YouTube's search autocomplete was surfacing disturbing suggestions. Typing "how to" into the search box, for instance, prompted autocomplete results such as "have s*x kids" and "have s*x with your kids".

YouTube says it is investigating the matter.

"Earlier today our teams were alerted to this awful autocomplete result, and we worked to quickly remove it. We are investigating this matter to determine what was behind the appearance of this autocompletion," reads YouTube's statement to BuzzFeed.