YouTube Removes Ads From Videos That Are 'Demeaning'

Posted June 03, 2017

The new rules also clamp down on inappropriate use of family entertainment characters engaged in what YouTube calls "violent, sexual, vile, or otherwise inappropriate behaviour", and on videos found to include "gratuitously incendiary, inflammatory, or demeaning" content meant to shame or upset an individual or group. YouTube has now made the guidelines clearer: previously, for example, it said only that videos may be demonetized if they include "inappropriate language, including harassment, profanity and vulgar language".

YouTube has revised its guidelines to spell out more specifically for content creators which videos will be eligible for ad monetization and which will not.

YouTube has clarified its rules on "hate speech" just a day after a European Union report stated that it was lagging behind Facebook in combating inappropriate content. Notably, in the blog post YouTube refers to the new rules as "advertiser-friendly content guidelines", which focus mainly on what is suitable for advertising. "While it's not possible for us to cover every video scenario, we hope this additional information will provide you with more insight into the types of content that brands have told us they don't want to advertise against and help you to make more informed content decisions", Bardin says.

The ad exodus from YouTube has died down since its peak in March, but YouTube continues to update its guidelines to reassure advertisers and, in some ways, its creators.

Some YouTube fans have come out against the changes, arguing that the platform is undermining free speech. The new content categories were shared this week in a message from YouTube to creators.

It is also clamping down on what it calls "hateful" content that promotes discrimination against an individual or group of people on the basis of factors such as race, ethnicity, nationality, religion, disability, sexual orientation and gender identity.

As another commenter noted, the changes have begun to cause great concern among some in the YouTube community: "Being a YouTuber means you need a thesaurus to make a title now, since 'killing a Goomba' in Super Mario and 'killing a police officer' in real life are both flagged because of the word 'killing'."

They state that your video can't be monetized if it contains content in categories such as hateful content, inappropriate use of family entertainment characters, incendiary and demeaning content, or inappropriate language.

Hate preachers, rape apologists and anti-Semites appeared in the videos matched with the adverts, leading Matt Brittin, Google's head of Europe, the Middle East and Africa, to "apologise to our partners and advertisers who might have been affected by their ads appearing on controversial content".

I doubt many would mind a racist, sexist, hate-filled rant by someone screaming in front of their smartphone camera not making money. Well, after the June 1st update, we now know exactly what the criteria are for getting caught in YouTube's cross-hairs.