A survey published last week suggested 97% of respondents could not spot an AI-generated song. But there are some telltale signs - if you know where to look.
Here’s a quick guide …
- No live performances or social media presence
- ‘A mashup of rock hits in a blender’
A song with a formulaic feel - sweet but without much substance or emotional weight - can be a sign of AI, says the musician and technology speaker, as can vocals that sound breathless.
- ‘AI hasn’t felt heartbreak yet’
“AI hasn’t felt heartbreak yet… It knows patterns,” he explains. “What makes music human is not just sound but the stories behind it.”
- Steps toward transparency
In January, the streaming platform Deezer launched an AI detection tool, followed this summer by a system which tags AI-generated music.

Frequency.
A couple of months ago, I found a really cool remake of one of the songs from KPop Demon Hunters. Everyone was doing covers of those songs, many of them indie artists, and I was scrolling through them. So I found this video, and the video was just an image effect on the cover art, which looked very AI-generated. But it's just the cover image, right? Who cares about that? I asked them in the comments if they would release their stuff on Apple Music. They quickly responded: no, they're going to leave that money on the table and stay exclusive to YouTube.

Why would an artist choose to do that? Sure, a couple of artists pulled their music off all other streaming platforms when they made their own, or their friends did. Garth Brooks has never been on streaming (except Amazon; I think they're the only one whose ethics he agrees with, or something?). But most indie artists are on all the platforms. Maximise revenue. So these people said no, not only to Apple Music (maybe they didn't like Apple kissing up to Trump) but also to Spotify, Amazon, Deezer, and all the rest. Turns out most of those platforms are stricter when it comes to AI music.
But here's the thing: the songs are still by the original artist. They're just stripping out the original music and setting the lyrics to new music. And that new music is AI-generated. Or so I later learned. I looked more into the YouTube channel, and they say they will make you a cover of any song, in any style you like, for $200. And they have hundreds of uploads… in a few months. Each song may have five or six variants. The songs are still fine, but they have a generic, plastic, not-quite-real feel to them.
Of course, they also match the first point in OP's summary: no social media presence. They just have the sales site and the YouTube channel.
But maybe it's fine, or at least less bad, that they're taking existing songs and just remixing them with AI? Except they're claiming the covers are better, and they're monetising the videos, so they're getting paid for streams when that money should be going to the original artist. It would be fine if they actually covered the song and recorded it themselves, but having a computer do all the heavy lifting? Just seems scummy.
I'm not going to name and shame, but if you look up KPDH covers and see something that looks like AI slop with clickbait titles… you've probably found the right one. (They cover other stuff too, not just KPDH.)
Garth Brooks is available on every single music streaming service I know about. 🤨