
YouTube Is Building Tools for Detecting AI-Generated Voices and Faces in Videos

  • Martina
  • 23 September 2024, Monday

YouTube is following in the footsteps of its competitors by developing new tools to detect AI-generated voices and likenesses of people in videos released on its platform.

New measures to combat the misuse of AI technology

YouTube announced the news in a blog post published earlier this month (September 5, 2024), stating that it’s working on building “synthetic-singing identification technology” that will allow its partners to automatically identify content simulating their singing voices.

The new technology will be integrated into the existing Content ID tool, which is designed to identify music published on the platform and help artists and music rights holders get compensated for unauthorized use of their work.

In addition to voice detection, YouTube is also developing a tool that will allow people from various industries, including artists, actors, professional athletes, and content creators, to “detect and manage” AI-generated content (AIGC) featuring their faces. Furthermore, the video giant emphasized that scraping content to create AI-generated material without permission violates its terms and conditions.

“As the generative AI landscape continues to evolve, we recognize creators may want more control over how they collaborate with third-party companies to develop AI tools,” YouTube noted in its post, promising more updates later in the year.

The music industry is expected to welcome YouTube’s latest innovation with open arms, as it has long advocated for measures to prevent the unauthorized use of people’s likenesses and voices in AIGC. Industry professionals, including artists, songwriters, and executives, previously applauded the introduction of the ‘No FAKES Act’ and ‘No AI FRAUD Act’ bills in the US Senate and US House of Representatives, respectively.

If passed, the ‘No FAKES Act’ would establish a legal right to one’s own likeness and voice under US federal law, while the ‘No AI FRAUD Act’ would allow individuals to sue when their likeness or voice is mimicked in AIGC without permission.

Other media platforms, such as TikTok, have previously implemented policies that strive to combat the misuse of AI technology. In June of this year, TikTok announced that it would automatically detect and label AIGC uploaded from other platforms. YouTube itself has previously unveiled new AI likeness and deepfake protection measures, allowing users to request the removal of unauthorized lookalike and soundalike content.
