If you’re a frequent YouTube viewer like me, you’ve probably noticed the surge of AI-related content over the past year. From AI-generated thumbnails to AI-created voiceovers and even entire videos, the platform has been flooded with this kind of content.
YouTube has taken note of this trend and announced new tools intended to safeguard its creators.
One of these is an enhancement to the well-known Content ID system, which has long been a source of frustration for YouTubers worried about demonetization when copyrighted music is detected in their videos.
Now, Content ID is being expanded to include AI-detection capabilities. It will be able to identify AI-generated singing voices modeled after real artists. This new feature is being fine-tuned in collaboration with YouTube’s partners and is slated for implementation in 2025.
But what about AI-generated images or videos? YouTube is working on that too. The company is “actively developing” technology that will detect and address AI-generated faces of real people in videos, though there’s no specific timeline for when this feature will be available to users or partners.
YouTube is also taking steps to prevent its content from being scraped by AI systems for model training, an issue that has sparked significant debate.
Companies like Nvidia have reportedly used publicly accessible YouTube videos to train AI models, which could potentially violate YouTube’s terms of service.
As competition in the AI space heats up, particularly around video generation, YouTube and its parent company, Google, are key players.
However, individual users and creators are more concerned with unauthorized scraping designed to steal and replicate their likeness.
Tools that claim to train AI models using YouTube data are increasingly accessible, even on consumer-grade hardware.
YouTube has stated, “We’ll continue to employ measures to ensure that third parties respect [the terms of service], including ongoing investments in the systems that detect and prevent unauthorized access, up to and including blocking access from those who scrape.”
It’s worth noting that while YouTube’s terms of service prohibit third-party scraping, they do not stop YouTube or Google from processing videos on the platform for their own AI projects.
Although new rules require YouTube creators to disclose the use of AI-generated images, videos, and voices, a report in April revealed that Google allowed OpenAI to scrape YouTube content without mounting a legal challenge.
The reason? Google reportedly feared setting a legal precedent that could impact its own AI development efforts.