In a blog post, YouTube vice president of product management Johanna Wright announced that the platform is "toughening" its approach to protecting families and children after noticing that some content was able to pass as family-friendly when it was actually suitable only for adult viewing.
The Times investigation alleged that YouTube does not do enough to proactively check for inappropriate images of children, relying instead on software algorithms, external non-government groups and police forces to flag such content.
The company will also continue to work with the National Center for Missing and Exploited Children (NCMEC) to report the potentially illegal behaviour to law enforcement, said Wright.
Following several reports of disturbing videos featuring children's favourite characters reaching young viewers, YouTube moved to change its policies to stop this from happening.
A Mars spokesperson said in a comment to the news company: "We are shocked and appalled to see that our adverts have appeared alongside such exploitative and inappropriate content."

Since introducing the policy in June, YouTube has removed ads from 3 million videos and, after further strengthening its enforcement, from another 500,000 violating videos. To better understand how to treat this content, the company will grow the number of experts it works with and double the number of Trusted Flaggers it partners with in this area.

This week, Buzzfeed reported on "hundreds of disturbing videos showing children in distress". The media outlet also contacted YouTube regarding some of its verified accounts featuring such videos, each of which allegedly had millions of subscribers. Earlier this month, YouTube said that it would increase age restrictions on such content on its main YouTube service.

The company will remove inappropriate sexual or predatory comments on videos featuring minors and, starting this week, will take an even more aggressive stance by turning off comments entirely on videos of minors where these types of comments appear.
Children are reportedly stumbling across the videos as they surface as suggestions in apps such as YouTube Kids; the clips show children frightened, tied up or forced to perform actions.
In response, a YouTube spokesman said: "There shouldn't be any ads running on this content and we are working urgently to fix this".