YouTube is taking steps to ensure that creators who post videos containing hateful content, inappropriate use of family entertainment characters or incendiary and demeaning content don’t earn any money from those videos.
Announcing the change, YouTube said: “While it’s not possible for us to cover every video scenario, we hope this additional information will provide you with more insight into the types of content that brands have told us they don’t want to advertise against and help you to make more informed content decisions. We know our systems aren’t perfect, and we’re also working to further improve your ability to appeal impacted videos.”
Bardin detailed the types of content impacted by the update to YouTube’s guidelines:
- Hateful content: Content that promotes discrimination or disparages or humiliates an individual or group of people on the basis of the individual’s or group’s race, ethnicity or ethnic origin, nationality, religion, disability, age, veteran status, sexual orientation, gender identity or other characteristic associated with systematic discrimination or marginalization.
- Inappropriate use of family entertainment characters: Content that depicts family entertainment characters engaged in violent, sexual, vile or otherwise inappropriate behavior, even if done for comedic or satirical purposes.
- Incendiary and demeaning content: Content that is gratuitously incendiary, inflammatory or demeaning. For example, video content that uses gratuitously disrespectful language that shames or insults an individual or group.
Bardin added: “While it remains the case that videos that comply with our terms of service and community guidelines can remain on the platform, our advertiser-friendly content guidelines focus on what is specifically eligible for advertising. Content that does not comply with AdSense Policies and our ad-friendly guidelines will not be eligible for advertising.”