YouTube is rolling out a new requirement for content creators: you must disclose when you're using AI-generated content in your videos. The disclosure option appears in the video upload UI and will power an "altered content" warning on affected videos.
Google previewed the "misleading AI content" policy in November, and the questionnaire is now going live. Google is mostly concerned about altered depictions of real people or events, reflecting broader election-season worries about how AI can mislead viewers. Just last week, Google disabled election-related questions in its "Gemini" chatbot.
As always, the exact rules on YouTube are up for interpretation. Google says it's "requiring creators to disclose to viewers when realistic content—content a viewer could easily mistake for a real person, place, or event—is made with altered or synthetic media, including generative AI," but doesn't require creators to disclose manipulated content that is "clearly unrealistic, animated, includes special effects, or has used generative AI for production assistance."