In a previous article, we talked about the recent changes YouTube made to its monetization policy to better identify and address inauthentic, repetitive AI content, also called AI slop. YouTube is just one of many platforms making policy updates with the same purpose. But what exactly is AI slop, specifically across visual media and music? And why has it become such a hot topic online? Let’s explore it together!
The Surge of Mass AI-Generated Content, aka AI Slop
Over the past few months, a phenomenon called AI slop has been spreading across social media platforms. The term refers to a flood of mass-produced, low-quality AI-generated content that has been rapidly saturating online platforms. It is typically marked by its incredibly banal, formulaic, and sometimes factually incorrect nature, as well as an inherent lack of originality and deeper value. This type of content is also usually produced with little to no effort and at enormous volume.
Visually and thematically, there are no limits to what AI slop can look like. It can be pretty much anything: emotion-driven images meant to evoke joy, nostalgia, disgust, and everything in between; unreliable summaries of information; poorly designed and executed marketing materials; or those just plain weird, supposedly entertaining videos that keep appearing in your feed.
We’re sure that most of you have seen the clearly AI-generated animations of animals, the "which bed would you sleep in" content, the annoyingly bizarre dancing videos based on the Wednesday series, or the endless variations of pictures featuring Jesus. And let’s not forget about Tralalero Tralala, Tung Tung Tung Sahur, and other videos of the so-called Italian brainrot genre: surrealist AI-generated memes of anything one can imagine, carrying pseudo-Italian names.
As one can imagine, the term carries a negative connotation, similar to 'spam,' and the content it describes is generally seen as digital clutter that pollutes the internet, mainly on social media platforms and in search engines. The media initially began reporting on AI spam flooding TikTok and especially Facebook, where high-engagement posts attracted higher advertising rates and generated more revenue for creators.
Shortly after, mass-produced AI content became common on other social media platforms like YouTube, Instagram, and Pinterest. On all of them, the goal has been the same: to drive high engagement, go viral, and generate significant revenue.
AI Slop in Music
Naturally, with the rise of AI, slop has expanded beyond visual and written content and is not foreign to music either. In fact, AI-generated music is becoming increasingly prominent on platforms like Spotify, Apple Music, and others. As the technology improves, it also becomes more difficult to tell the difference between an authentic track and one created with AI. It’s therefore quite possible that you may encounter a song—perhaps the next one the algorithm plays—that you have no idea was generated by AI.
Recently, a band called The Velvet Sundown, which claims to be ‘saving modern rock’ in its Instagram bio, has gained significant attention on streaming platforms. Their song Dust on the Wind, released on June 5th, 2025, has accumulated over 2.5 million streams on Spotify alone. At the time of writing, the band has nearly 600,000 monthly listeners, a figure that reportedly topped one million at one point. Oh, and they also released three full albums over the span of a month and a half. How is that possible? The band is entirely AI-generated: its name, its story, its members, its image, its promotional pictures, and, most importantly, its music.
Initially, the band denied the accusations that they had used AI to create, well, everything. Later, the person behind ‘the project,’ operating under the alias Andrew Frelon, acknowledged that the band is completely AI-generated (though in a rather chaotic way: he first claimed to be a hoaxer who had merely been running a fake X account in the band’s name). Specifically, he had been using Suno, a well-known AI music generation tool, to create the music.
Later, the band’s Spotify bio was updated to reflect its true nature, calling it “a synthetic music project guided by human creative direction, and composed, voiced, and visualized with the support of artificial intelligence.” Whatever that’s supposed to actually mean.
Although few fully AI-generated bands or artists have garnered as much attention and coverage as The Velvet Sundown, the case clearly points to a trend in music that we can only expect to grow in the future.
Earlier this July, the US magazine Wired published an entire piece on AI-slop music, highlighting artists and tracks that have the word AI written all over them. These include BannedVinylCollection, a Spotify profile that releases explicit and erotica-themed songs; Nick Hustles, known for the song I Caught Santa Clause Sniffing Cocaine; and Vinih Pray, whose AI song A Million Colours has nearly 1.5 million streams.
What stood out to us most in the article was the story of musician and editor Andy Cush, who once heard instrumental music blasting in a New York park and was so captivated by it that he approached the person playing it on a stereo to ask what it was. As he recalled, he was particularly impressed by the guitarist’s skill. He later found out the music was AI-generated. “It was a weird experience and prompted somewhat of a crisis for me,” explained Cush, adding that it made him realize writing off AI music might be harder than he initially thought.
It generally seems that, even as it comes to terms with the fact that AI will inevitably be part of music (and beyond) in the future, the industry doesn’t appear prepared for the true impact of that technology, especially when it comes to generative AI.
In a video about AI slop and technology in music, released shortly after it was confirmed that The Velvet Sundown’s first album was 100% AI-generated, creator Nick Cesarz said he strongly believed the band merely wanted to gain some notoriety before releasing a ‘real’ second album and revealing themselves to be, in fact, a human band. As nice as that would have been, it didn’t turn out to be true. Maybe the truth is simpler: AI slop, whether in music or other audiovisual content, exists because it is an easy way to create monetizable content in today’s digital landscape dominated by social media and streaming.
How Does the Public Respond to AI Slop?
As with any generative AI topic, the public, along with experts and industry insiders, has generally opposed AI slop. The overall sentiment is that AI slop pollutes the internet and worsens our experience across platforms. Nesrine Malik, a Guardian columnist, sees slop as something that is actively “distorting our reality” and pushing us “deeper and deeper into subjective worlds rather than objective reality.”
Regarding music specifically, people seem to be even more resistant to the idea. While some call AI-slop music cheap and shallow, others go further, calling it a “slap in the face of artistic integrity.” One comment under a post on The Velvet Sundown’s Instagram profile simply reads, “Silence is better than AI music.”
“Generative AI creations are not art. Art is the product of influences filtered through the experiences, tastes, and abilities of a conscious mind. AI content is nothing more than past influences filtered through an algorithm. While AI tools can be useful in the process (...), if the bulk of the creative decisions aren't made by a conscious mind, it's just slop,” says one Reddit user.
However, there’s also another aspect to the conversation surrounding AI slop, and that is profitability. As we outlined earlier in this article, the driving force behind AI slop has ultimately been generating engagement and increasing profits.
In fact, research shows that low-quality, mass-produced AI content has become a lucrative venture for creators. AI makes content production more accessible and enables people to publish large volumes of posts, following the number one rule of platform algorithms for virality: “continuously share fresh and diverse content.”
It’s also been reported that people producing AI slop come from a wide range of backgrounds, including students, unemployed people, domestic workers, and individuals from economically marginalized communities. For them, making AI-generated content has become a way to earn a side income or cope with unemployment; for some, it also represents an alternative to more traditional forms of gig work.
The Social Media War against AI Slop
As mentioned at the very start of this article, YouTube has recently made headlines by introducing new updates to its monetization policies. The goal? To better detect and combat the flood of mass-produced AI-generated content. And YouTube is not the only platform doing this: social media platforms in general have started taking measures to fight AI slop.
For creators, AI slop might be an easy way to earn income; for platforms like YouTube, Meta, or TikTok, it is a practice they believe exploits their monetization systems. The policy updates implemented by these platforms are not only affecting creators of visual content; they also have implications for musicians and for how their music is delivered and monetized there.
For example, both TikTok and Meta recently updated their audio delivery policies by separating their audio libraries from “fingerprint rights.” The change aims to prevent so-called bad matches, or misidentifications caused by overly sensitive scanning technology. Previously, a track needed active fingerprint rights to be monetized in videos where it was used. TikTok’s and Meta’s platforms used fingerprinting technology to scan videos for background music, that is, cases where a track appears organically in a video rather than being selected from the in-app library, and both types of usage could be monetized.
In other words, tracks distributed to TikTok, Facebook, and Instagram would automatically qualify for both in-library use and background monetization through the fingerprinting technology. While the fingerprint-rights requirement has been removed to streamline the process, this also means that background uses of music are no longer monetized automatically.
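To give a sense of what “fingerprinting” involves here, the sketch below is a rough, simplified illustration of the general idea, loosely modelled on classic landmark-based audio fingerprinting. It is emphatically not TikTok’s or Meta’s actual implementation; all function names, parameters, and thresholds are our own assumptions.

```python
# Simplified, illustrative audio-fingerprinting sketch (landmark style).
# This is NOT TikTok's or Meta's actual system; names and numbers are assumptions.
import numpy as np

def spectral_peaks(audio, frame_size=4096, hop=2048, peaks_per_frame=5):
    """Reduce a mono audio signal (float array) to (frame_index, frequency_bin) peaks."""
    peaks = []
    window = np.hanning(frame_size)
    for i, start in enumerate(range(0, len(audio) - frame_size, hop)):
        spectrum = np.abs(np.fft.rfft(audio[start:start + frame_size] * window))
        for b in np.argsort(spectrum)[-peaks_per_frame:]:  # loudest bins in this frame
            peaks.append((i, int(b)))
    return peaks

def fingerprint(peaks, fan_out=5):
    """Hash pairs of nearby peaks into landmarks; time deltas make them offset-invariant."""
    landmarks = set()
    for idx, (t1, f1) in enumerate(peaks):
        for t2, f2 in peaks[idx + 1: idx + 1 + fan_out]:
            landmarks.add((f1, f2, t2 - t1))
    return landmarks

def likely_background_match(clip_audio, track_audio, threshold=0.2):
    """Flag a probable match if enough of the clip's landmarks appear in the reference track."""
    clip_hashes = fingerprint(spectral_peaks(clip_audio))
    track_hashes = fingerprint(spectral_peaks(track_audio))
    if not clip_hashes:
        return False
    overlap = len(clip_hashes & track_hashes) / len(clip_hashes)
    return overlap >= threshold
```

Real systems add far more robustness, such as offset-consistency checks, quantized frequencies, and inverted indexes over millions of reference tracks, which is also why overly aggressive matching can produce exactly the “bad matches” these policy changes are meant to reduce.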
Speaking with the Artist & Relations Manager at iMusician, we learned that artists who want to monetize their tracks as background music on TikTok, Instagram, or Facebook must meet strict eligibility criteria. These requirements can vary between distributors, depending on genre or rights ownership. Based on our daily work supporting independent musicians across Europe, North America, and Latin America, we’ve seen that tracks are often declined if they:
Contain public domain works or audio with public domain elements,
Include very generic/common sounds (applause, white noise, simple drum loops, ringtones, animal noises, spoken word, movie/TV dialogue not set to music, etc.),
Use non-exclusive beats or samples from sample packs,
Are DJ mixes, compilations, or full albums,
Are tracks licensed on a non-exclusive basis (e.g., production libraries, soundtrack licenses),
Resemble “soundalikes” or karaoke versions,
Fall into certain classical music categories, especially public domain works,
Are live versions that are nearly identical to studio recordings,
Are remixes, remasters, or derivative versions of songs where the artist doesn’t control the rights,
Include audio where the uploader doesn’t control the full rights to claim 100% of the delivered track.
If a track doesn’t meet these criteria, background music monetization requests will usually be declined. As our A&R manager notes, classical music has been a notable case: on Meta platforms, classical repertoire—including some contemporary works—has occasionally been affected more broadly than intended. Distributors, including iMusician, are currently in discussions with Meta to clarify these rules.
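To make the checklist above more tangible, here is a minimal, purely hypothetical sketch of how a distributor-side pre-screen for background-music eligibility could look. Every field name, rule, and the example below is our own illustration; it does not represent iMusician’s, TikTok’s, or Meta’s actual review logic.

```python
# Hypothetical pre-screen for background-music monetization eligibility.
# Field names and rules are illustrative only and do not reflect any
# distributor's or platform's actual review system.
from dataclasses import dataclass

@dataclass
class TrackSubmission:
    title: str
    contains_public_domain_material: bool = False   # public domain works or elements
                                                    # (classical public domain repertoire
                                                    #  is folded into this flag here)
    uses_generic_sounds: bool = False               # applause, white noise, ringtones, etc.
    uses_non_exclusive_samples: bool = False        # sample-pack beats, non-exclusive loops
    is_dj_mix_or_compilation: bool = False
    is_non_exclusively_licensed: bool = False       # e.g. production-library tracks
    is_soundalike_or_karaoke: bool = False
    is_live_duplicate_of_studio_version: bool = False
    is_uncontrolled_derivative: bool = False        # remix/remaster without controlled rights
    uploader_controls_full_rights: bool = True

def decline_reasons(track: TrackSubmission) -> list:
    """Return the criteria a track fails; an empty list means it may qualify."""
    checks = [
        (track.contains_public_domain_material, "contains public domain material"),
        (track.uses_generic_sounds, "built on generic or common sounds"),
        (track.uses_non_exclusive_samples, "uses non-exclusive beats or samples"),
        (track.is_dj_mix_or_compilation, "is a DJ mix, compilation, or full album"),
        (track.is_non_exclusively_licensed, "licensed on a non-exclusive basis"),
        (track.is_soundalike_or_karaoke, "soundalike or karaoke version"),
        (track.is_live_duplicate_of_studio_version,
         "live version nearly identical to the studio recording"),
        (track.is_uncontrolled_derivative, "derivative version without controlled rights"),
        (not track.uploader_controls_full_rights,
         "uploader does not control 100% of the rights"),
    ]
    return [reason for flag, reason in checks if flag]

# Example: a track built on a sample-pack beat would be flagged before delivery.
demo = TrackSubmission(title="Demo track", uses_non_exclusive_samples=True)
print(decline_reasons(demo))  # ['uses non-exclusive beats or samples']
```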
How Do Streaming Platforms Respond to AI Slop?
Compared to social media platforms, traditional music streaming platforms have generally been tolerant of the rise of mass-produced, AI-generated music and content. Apple Music and iTunes, for example, have not publicly introduced any major restrictions, and in general, most streaming platforms continue to carry AI-produced tracks without significant hurdles.
We could say that Spotify, in particular, has been operating with an “eyes closed, ears open” approach. In other words, if people listen and the platform earns revenue, it does not really matter how the music was created. While Spotify reportedly bans AI music that deepfakes actual artists, it does not have a system in place to label AI content or require artists to disclose it during the upload process.
Meanwhile, Deezer appears to be more proactive in establishing boundaries around AI content. In early 2025, the platform launched a “cutting-edge” AI detection tool capable of accurately identifying synthetic audio; the system not only flags but in some cases removes AI-generated content, and it keeps AI-flagged material out of its recommendations. In fact, Deezer seems to be the only major streaming service that gives listeners a way to actively block fully AI-created songs from surfacing through its recommendation algorithm.
What Do All These Changes Mean for Artists? And Who Is Benefiting From Them?
While the music industry isn’t fully unified on how good or bad mass-produced AI-generated content is, or on how to approach it, the trend across social media is moving in one direction: the era of easy monetization through derivative content is ending. For creators, this means shifting their focus toward originality, authenticity, and quality to keep earning from their content and to meet the new platform standards and monetization rules. Although the stated goal of these changes is to protect genuine creativity and stop system gaming, in practice they have also created more hurdles for independent artists and creators.
Artists who previously relied on the rather permissive approach of social media platforms may have to adapt quickly to the new rules in order to benefit from them as well. This might be especially difficult for those lacking the necessary resources.
Overall, it appears that the ongoing AI-related changes position the platforms themselves as the current winners in this reported battle against mass-produced AI content. For them, a focus on quality and authentic content also means lower monetization payouts to creators.