I couldn't find a photo of Vulfpeck where they didn't look incredibly annoying but they pulled off possibly the first Spotify scam in 2014.

The Wild West

Scams in the streaming era.
An essay by Jon W. Cole
Last updated September 8, 2025

In 2014, funk band Vulfpeck released an album called Sleepify. It consisted of 10 silent tracks, 30 seconds each, & their fans were encouraged to play the album on repeat as they slept. This project was meant as a protest against Spotify, earning the band $20,000 before it was taken down. But while the media covered this as if they were sticking it to Spotify, in reality the money was coming out of the pockets of other artists.

Vulfpeck were well-meaning but misguided. The truth is that streaming royalty payout structures were broadly misunderstood at the time, & still are over a decade later. But there's a cottage industry of scammers who DO understand the system, & are manipulating it to loot the royalty pool to the tune of $2b per year.

General chicanery

Streaming fraud detection companies like Beatdapp monitor for suspicious play-count-inflating behavior. They believe that as many as 10% of all streams are fraudulent, based on discoveries like 10,000 different accounts all playing the same 63 tracks. And the vast majority of these fraudulent bot farm plays--upwards of 80%--aren't boosting legitimate accounts; they're mass streaming tracks that were illegitimately uploaded--sometimes stolen music, sometimes AI prompt songs--in order to loot royalties.
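Beatdapp's actual methods are proprietary, but one signal mentioned above--thousands of accounts all playing the same few dozen tracks--is easy to sketch. The function and thresholds below are my own toy illustration, not a real detection system:

```python
from collections import defaultdict

def flag_coordinated_accounts(plays, min_cluster=50, min_tracks=20):
    """Group accounts by the exact set of tracks they play.

    `plays` maps account_id -> iterable of track_ids. A large cluster of
    accounts sharing one identical, sizeable track set is a classic
    bot-farm signature (e.g. 10,000 accounts, the same 63 tracks).
    """
    clusters = defaultdict(list)
    for account, tracks in plays.items():
        key = frozenset(tracks)
        # Ignore tiny track sets; overlap there is normal human behavior.
        if len(key) >= min_tracks:
            clusters[key].append(account)
    # Return only suspiciously large clusters of identical listeners.
    return [accts for accts in clusters.values() if len(accts) >= min_cluster]
```

Real systems would fuzz-match overlapping (not just identical) track sets and weigh timing, device, and location data, but the grouping idea is the same.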

North Carolina musician Michael Smith, for instance, is facing federal charges for creating thousands of streaming bots to stream AI tracks, generating over $10m in a long-term streaming fraud scheme. He avoided detection by spreading the streams over thousands of songs, never drawing attention by inflating the play count of any one song to a noticeable degree. Apparently, it's not difficult to blend into the long tail--the millions of artist accounts that generate a thousand dollars or less per year--especially if you spread the scam across multiple services. Billions of plays across thousands of songs across numerous streaming services is not an easy puzzle to put together. And yet fraud detectors cracked the code.

Companies like Beatdapp monitor user behavior, looking for suspicious trends in plays but also things like repeated mouse movements within the app itself, suggesting that an automated bot is using the account instead of a human being. But streaming behavior monitoring is by definition reactive to the evolving strategies of scammers, who are training their bots to mimic real human behaviors & hosting them en masse on cloud servers to disguise their locations. Even worse, they're also streaming legitimate artists in order to hide who they're boosting, leading to an emerging problem of legitimate artists receiving takedowns, with little to no recourse.

So while it's better to monitor and catch this stuff than not, the more reliable solution is to reduce the incentive to participate in this type of fraud in the first place, which would also eliminate many false positives. Spotify recently took a step toward this by requiring 1,000 plays per year before a track qualifies for payouts. In theory, this should solve much of the billion-dollar fraud problem, & should make future monitoring much easier by shrinking the number of artist accounts that need to be monitored (on Spotify, at least) from 12m to a few hundred thousand. But that doesn't mean artists don't still deserve a direct line to customer service when their music is falsely flagged for illegal streams.
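The mechanics of a minimum-play threshold are simple enough to sketch. This is a toy model under my own naming--Spotify's real royalty accounting is more involved than a flat per-track cutoff--but it shows how a threshold shrinks the population that monitors have to watch:

```python
def payable_tracks(play_counts, threshold=1000):
    """Filter a {track_id: annual_plays} map down to payout-eligible tracks.

    Tracks below the threshold accrue no royalties, so a fraudster spreading
    streams thinly across thousands of songs earns nothing, and the set of
    accounts worth monitoring shrinks dramatically.
    """
    return {t: n for t, n in play_counts.items() if n >= threshold}
```

A track at 999 annual plays earns nothing; at 1,000 it enters the royalty pool--which is exactly why the long-tail-spreading strategy stops paying.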

But outright fraud isn't the only problem on Spotify.

Payola by any other name...

The act of paying radio DJs in exchange for plays was made illegal in 1960, but continued over the years in various forms. A major investigation by NY Attorney General Eliot Spitzer ended with record labels paying tens of millions of dollars in fines in 2005, but reporting indicates that this was still a problem as recently as 2019. Radio stations are regulated because they must license their frequencies from the US government, putting them under the purview of the FCC. The idea is that if payola is openly allowed, our public airwaves will simply become promotional channels for whoever has the most cash to spend, & that is contrary to the public good.

The internet, on the other hand, doesn't receive the same scrutiny, least of all the streaming oligopoly. Which has led to the rise of Spotify's Perfect Fit Content program. This program is Spotify's way of grabbing a slice of the 70% of revenue reserved to compensate artists. In exchange for paying a share of their royalties to Spotify, artists receive preferred placements in playlists & algorithmic recommendations.

Much of the controversy around this program has been related to so-called "Ghost Artists": musicians who create music for stock libraries, receiving for-hire payment in lieu of long-term royalties. This is actually a valid form of session work that's had a place in the music industry since at least the 1960s. In fact, so-called "library music" from the 1970s has played a critical role in shaping modern hip hop, psych, & soul production for years. It's not even a problem when these songs appear on Spotify. They are, in fact, critical pieces of music history.

The problem occurs when these stock libraries join Spotify's Perfect Fit Content program & displace royalty-dependent artists by undercutting them with lower royalty rates. The story first gained traction when David Turner reported that "according to the playlist analytics site Chartmetric, on February 23, 2017, Spotify’s Ambient Chill playlist switched out 16 tracks by Brian Eno, Bibio, Jon Hopkins and other well known electronic acts for 28 songs connected to [stock music library] Epidemic Sound, operating under fake names like They Dream By Day, LUCHS, and Silver Maple." This is clearly problematic when an ambient pioneer like Eno is replaced with work commissioned in the style of Eno. The tail is wagging the dog.

I think the moniker "Fake Artists," which often gets thrown around, is a misleading description of what's happening here. The problem is more that production houses are undercutting other artists for placements, & Spotify is benefiting financially from the arrangements. In essence, this is an evolution of the payola scandals from decades past.

But this isn't only happening with stock music. There are also "real" artists in this program who are displacing other artists. And that's just as bad, if less apparent.

The threat isn't realistically that the service will become overrun with these types of pay-for-placement tracks. It's instead that Spotify has the data on which tracks perform, so they can arrange these deals in ways that are anti-competitive in the marketplace. In other words, if you perform too well on a niche playlist, you may be in danger of being replaced by a similar song in the PFC program.

This is not dissimilar to how Amazon uses data from its third-party Marketplace to inform the Amazon-branded products it makes, displacing the third-party vendors--conduct for which it was rightly sued by the FTC. So if the FCC doesn't want to regulate this type of thing, perhaps the FTC will eventually step in with an antitrust lawsuit.

Algorithm seeding or manipulation?

The music & media industries recently erupted over an AI band called the Velvet Sundown, & it's been a pretty interesting case study on how the definition of art is evolving as we grapple with the definition of copyright & what role AI will play in the future of music creation. What's more, we still don't know what role humans played in the creation of these songs. When does a producer, for instance, become a co-writer in this context? Who owns these songs, if anyone? Could they legally be used in a truck commercial without paying royalties? These and more philosophical questions remain up in the air.

And I kind of get why someone might like them. Yes, the audio is low quality. And sure, the lyrics are just a string of shallow platitudes. It suffers from all the shortcomings of current-gen AI prompt music, & seems as boilerplate & trite as an average Lee Greenwood track. But it's better than a lot of human-made music, much of which is very bad. Maybe even better than a lot of Lee Greenwood tracks.

But I find the songs themselves & the hysteria surrounding AI as a replacement for human artists less interesting than how this band--AI or not, and without performing gigs or playing festivals or doing interviews or any of the traditional work of promoting an album--gained 500,000 monthly listeners & millions of plays.

The short answer is that they got a ton of free promotion by becoming the subject of a string of high profile rage bait articles intended to scaremonger about how AI is eating the music industry. But they had to get on the radar of these journalists first, by gaining a little bit of traction. Traction that many human artists struggle to achieve.

The paper trail has all but been washed away by the flood of listeners brought in by the media coverage. But before the surge, the data on their Spotify profile was rather interesting. Their "Fans Also Like" tab included obscure 1970s rock bands you've never heard of, like Gambler, Buckeye, & 805--artists with only dozens of monthly listeners & only thousands of total plays, but who were mostly on major labels & released albums in the late '70s or early '80s. This is unusual because a new artist's "Fans Also Like" tab tends to start empty & become populated as the algorithm learns about user behavior. So a tab that's full of really obscure acts seems like it has to be intentional.

It's impossible to say for sure without knowing the ins & outs of how Spotify's algorithm works, but I wonder if one method of algorithm manipulation involves using user accounts to stream your uploaded album alongside obscure, like-sounding bands to seed connections within a specific genre. This could've been a sophisticated effort.

Additionally, searches for "Velvet Sundown" yielded odd results, like a user named "Roy," who follows Guns N' Roses, Jethro Tull, Styx, & the Alan Parsons Project, & somehow had a single playlist entitled "Velvet Sundown," saved by 17 other users.

We know this type of data factors into which songs are auto-played during passive listening sessions & how Spotify generates dynamic playlists like the user-specific Discover Weekly, using collaborative filtering. But because of the media circus, we may never know what role this type of manipulation played in Velvet Sundown gaining their traction.
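Spotify's real recommender is far more complex and undocumented, but item-item collaborative filtering in its simplest cosine-similarity form can be sketched, and the sketch shows why seeding works: a handful of accounts that co-stream a new act alongside tiny-audience bands can dominate the similarity score precisely because those bands have so few listeners. All names and numbers below are hypothetical:

```python
import math
from collections import defaultdict

def item_similarity(listens):
    """Toy item-item collaborative filtering over binary listen data.

    `listens` maps user -> set of artist names. Similarity between two
    artists is the cosine of their listener sets: shared listeners divided
    by the geometric mean of the two audience sizes. Small audiences make
    the denominator tiny, so a few seeded co-listens produce a huge score.
    """
    audiences = defaultdict(set)
    for user, artists in listens.items():
        for a in artists:
            audiences[a].add(user)

    def sim(a, b):
        shared = len(audiences[a] & audiences[b])
        return shared / math.sqrt(len(audiences[a]) * len(audiences[b]))

    return sim
```

With five seeded accounts pairing a new act with an obscure band (six listeners total), the pair scores near the maximum of 1.0, while a 100-listener act with no overlap scores 0--exactly the shape of a "Fans Also Like" tab full of tiny acts.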

I find this interesting because it's not exclusive to AI... any artist seeking to fine-tune if or how their music is recommended can simply try to reverse engineer the algorithm. And I struggle to decide if this is bad or if it's just the way things are now.

Then again, it's possible that the seeding didn't even work, & that whoever concocted the band spilled the beans to the media themselves. Maybe controversy has always been the fastest shortcut to fame.

fin.

Who the fuck is Jon W Cole?

Selfie circa 2025.

To be honest, I probably shouldn't be the one writing these essays. It's just that no one else is. And it feels like someone probably should.

I'm not a journalist. I'm not an artist. I don't work in the music or streaming industries. I'm just a web developer. But I have a lot of friends who are artists. And so I know what the struggles are. And when I see the discourse online, none of it really seems to be pointing toward any real solutions that are going to make a better industry for my friends.

These essays are meant, first & foremost, to start constructive debates. And I would love to hear thoughts from folks who are more deeply plugged into the industry than I am. I certainly have blind spots. And I intend to update these essays over time based on feedback.

At me on Threads @jonwcole, or e-mail me at jon@jonwcole.com.

Cheers.