Starting on Thursday, Google will police YouTube like it never has before, adding warnings and disabling advertising on videos that the company determines cross its new threshold for offensive content. YouTube isn’t removing the selected videos, but is instead setting new restrictions on viewing, sharing and making money from them. A note detailing the changes will go to producers of the affected videos on Thursday, according to a spokeswoman for the Alphabet Inc. company.
Videos flagged under the new policy won’t be able to run ads or host comments, and won’t appear in any recommended lists on the video site. A warning screen will also appear before the videos, which won’t play when embedded on external websites. YouTube will let video creators contest the restrictions through an appeals process, a spokeswoman said.
“YouTube doesn’t allow hate speech or content that promotes or incites violence,” the Thursday letter to YouTube creators reads, according to a copy viewed by Bloomberg News. “In some cases, flagged videos that do not clearly breach the Community Guidelines but whose content is potentially controversial or offensive may remain up, but with some features disabled.”
With commenting, sharing and even upvoting disabled, the audience for the remaining clips should plummet.