YouTube is making a policy change to its treatment of videos targeted toward minors and young children.
The video platform says it will now remove all content that contains “violent” or “mature” themes if it is targeted toward kids, either through the title of the video, its description, or the accompanying tags.
YouTube says this type of content “will no longer be allowed on the platform.” Prior to this change, YouTube was age-restricting such videos, but now it’s going a step further to help clean up the platform and make it a safer place for children amid intense regulatory scrutiny.
The policy change was announced on a YouTube Help community forum. The service says it will begin ramping up enforcement of this new policy over the next 30 days, to give creators a chance to become familiar with the new rules.
YouTube says it will remove videos that violate the policy, but it won’t be giving strikes to channels until the 30-day period is up. YouTube says it won’t be handing out strikes to videos uploaded prior to the policy change, but it still reserves the right to remove those videos.
YouTube advises creators to check the YouTube Kids guidelines if they want to specifically reach children with their videos, and it also advises creators to make sure their titles, descriptions, and tags target the right audience to avoid getting caught up in the ban.
The service also says it will age-restrict more content that could be mistaken for kid-friendly material, such as adult cartoons.
Edited from The Verge