Ars Technica reports:
YouTube is rolling out a new requirement for content creators: You must disclose when you’re using AI-generated content in your videos. The disclosure appears in the video upload UI and will be used to power an “altered content” warning on videos.
Google previewed the “misleading AI content” policy in November, and the questionnaire is now going live. Google is mostly concerned with altered depictions of real people or events, which echoes broader election-season worries about how AI can mislead people.
Google says the labels will start rolling out “across all YouTube surfaces and formats in the weeks ahead, beginning with the YouTube app on your phone, and soon on your desktop and TV.”
Read the full article.
YouTube will require disclosure of AI-manipulated videos from creators https://t.co/bxMpABUerE
— Truth in Advertising (@TruthinAd) March 20, 2024