Man, what a song: how algorithms reinforce music’s gender imbalance

Interactive intelligent systems researcher Christine Bauer explains how streaming algorithms work – and how exclusionary they can be

Go to a music streaming platform. Play a song you want to listen to. Then, when that song finishes – assuming you have ‘autoplay’ selected – it will play another song. Probably it’s a song you already like, or else it’s a song you haven’t heard before but that you like immediately. You might even save it to your library or put it on a playlist. Sometimes it’s spooky; it can feel like the system knows your music taste better than you do. 

So how does it do that? And is it good that streaming platforms can so successfully predict what we might want to listen to? 

Let’s answer the first question first because, well, logically it makes more sense to do that (and also because it’s an easier question to answer and my fingers aren’t quite warmed up yet). 

There are two ways that streaming platforms predict what we might want to listen to: content filtering and collaborative filtering. Content filtering looks at what you’ve already listened to and then gives you more of the same. So, if you’re listening to a delta blues song, hey presto, the platform will play you another delta blues song once that one finishes. But it doesn’t just have to be based on simple genre categories; the system could also choose the next song because it has similar lyrics, similar instrumentation or even a similar chord sequence. 
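To make that concrete, here's a deliberately toy sketch of the content filtering idea – not any platform's real code, just made-up feature vectors (genre, tempo, instrumentation) compared with cosine similarity:

```python
# A toy illustration of content filtering (not any platform's actual code).
# Each song gets a hypothetical feature vector – genre tags, tempo, whether
# there's a slide guitar – and the "next song" is whichever unplayed track
# looks most similar to the one that just finished.
import math

songs = {
    "Delta Blues A": [1, 0, 0, 0.4, 1],   # [blues, pop, rock, tempo, slide guitar]
    "Delta Blues B": [1, 0, 0, 0.5, 1],
    "Synth Pop C":   [0, 1, 0, 0.8, 0],
}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def next_song(just_played, library):
    # pick the unplayed track whose features are closest to the one that just ended
    candidates = [name for name in library if name != just_played]
    return max(candidates, key=lambda name: cosine(library[just_played], library[name]))

print(next_song("Delta Blues A", songs))  # -> "Delta Blues B"
```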

Collaborative filtering, meanwhile, compares your music taste to that of other users and then plays you songs that they have listened to. For example, if ‘user A’ likes both the Flaming Lips and Pavement, and ‘user B’ likes the Flaming Lips, chances are that ‘user B’ will also like Pavement. Of course, it’s slightly more complicated than that, but essentially that’s how it works. 
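The same goes for collaborative filtering. Purely as an illustration – real systems churn through millions of users and billions of plays – the Flaming Lips/Pavement example above might look something like this:

```python
# A toy user-based collaborative filter (an illustration, not a real platform's code).
# A user gets recommended whatever artists their most similar listeners already play.
listens = {
    "user A": {"Flaming Lips", "Pavement", "Built to Spill"},
    "user B": {"Flaming Lips", "Built to Spill"},
    "user C": {"Ed Sheeran"},
}

def similarity(u, v):
    # Jaccard overlap between two users' listening histories
    return len(listens[u] & listens[v]) / len(listens[u] | listens[v])

def recommend(user):
    scores = {}
    for other in listens:
        if other == user:
            continue
        sim = similarity(user, other)
        if sim == 0:
            continue
        # artists the similar user likes that this user hasn't heard yet
        for artist in listens[other] - listens[user]:
            scores[artist] = scores.get(artist, 0) + sim
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("user B"))  # -> ['Pavement']
```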

“Different platforms use these two methods in different combinations,” says interactive intelligent systems researcher Christine Bauer. “Of course, we don’t know how exactly each platform works – that’s the chef’s secret – but they all use the same ingredients to filter the millions of songs available and find ones that are relevant for each user.” 

Based in Utrecht, Bauer has spent the last few years studying how streaming platforms predict what we want to listen to – or ‘context-aware music recommender systems’ to use the academic terminology – as well as investigating how they can be improved. 

“These systems aren’t unique to music,” she adds, and she’s right: each time you shop online and get a message that says ‘other users also bought…’ that’s a recommender system. It’s the artificial intelligence behind most customised online experiences, from gaming platforms like Steam to streaming services like Netflix. And, just to reiterate, all music streaming platforms – Spotify, YouTube, Last.fm, Pandora, Apple Music, Tidal – use these systems. Yes, their algorithms will be programmed differently, but the basic processes are the same. 

That said, there is one thing we know: the collaborative approach usually works better when it comes to picking tracks that we want to hear. 

“Collaborative filtering is closer to how people really engage with music,” notes Bauer. “Finding similar users with similar listening patterns and sharing music between them works really well. Whereas the content filtering approach just gives you more of the same, which can get boring quite quickly.

“But there are a couple of downsides for the collaborative approach,” she adds. “The first is that it doesn’t work well for new users, because it can’t compare you to other users until you’ve actually listened to some music. The second issue is that when a new song comes onto the platform it can’t be recommended using the collaborative approach until someone has listened to it.”

As well as studying how these systems work, Bauer also looks at the problems they create. Some are obvious, and not unique to music recommender systems, while others are more subtle. 

Let’s start with the big one: popularity bias. Put simply, popularity bias is when things that are already popular become more popular, like a snowball rolling down a hill growing bigger and bigger. 

“Items that are already very popular are more likely to be recommended to other people, so they are more likely to be listened to, which again increases the chance that they are recommended,” says Bauer. “So popular items become more popular, while unpopular ones remain in the dark.” 
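You can watch that snowball in miniature with a crude simulation – nothing like a real platform, just two tracks and a recommender that picks in proportion to current play counts:

```python
# A crude simulation of the popularity feedback loop described above.
# The recommender picks a track in proportion to its current play count,
# and every recommendation adds another play – so an early lead compounds.
import random

random.seed(1)
plays = {"already popular": 100, "new release": 10}

for _ in range(10_000):
    pick = random.choices(list(plays), weights=list(plays.values()))[0]
    plays[pick] += 1

print(plays)  # the absolute gap between the two tracks keeps widening
```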

It’s easy to see why this is a problem, especially for the kind of alternative artists found in the pages of magazines like this one. While Ed Sheeran is getting recommended left, right and centre, that new Audiobooks album is sitting there with barely a few thousand listens (there’s no justice, right?). 

But while the popularity bias is interesting, it’s kind of obvious: of course popular things get more popular. Numerous psychology studies have found that if something is branded as ‘popular’ it will indeed become more popular, even if that original claim of popularity is a lie. And we know it’s true intuitively: it’s the explanation for cultural trends in fashion, food and music. 

What’s more interesting (or should I say alarming) are some of the other biases built into music recommender systems. There are several that researchers have identified, including both genre and country biases. But the one that Bauer has been looking into lately is the gender bias. 

“One topic that kept popping up when we spoke to artists was the perceived gender imbalance in the music industry,” says Bauer. “This bias has been present in the music industry for a long time, but we wanted to know if it also occurs on streaming platforms.” 

There were two key questions Bauer and her colleagues tried to answer. Firstly, is there a gender bias in music recommender systems? And secondly, if there is a bias, how can it be corrected? 

The answer to the first question was, unsurprisingly (at least, for anyone who has spent more than an hour on planet earth), yes: there is a gender bias in music recommender systems. In their research, Bauer and her team found that, on average, a female artist wouldn’t appear until the 7th or 8th position in recommended playlists (or, put another way, you’d have to skip past six or seven male artists before you reached a female one). 

The researchers then created a new algorithm to see if they could fix this issue (and just to note: this work was based on an open dataset that came from Last.fm).

“We based it on a collaborative filtering approach, but then we re-ranked the male artists down a few steps,” explains Bauer. “Over time we saw that indeed it could break the loop, and more female artists started getting recommended.”
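That re-ranking step is simple enough to sketch – though note that the details here, like how many positions male artists get moved down, are my own placeholders rather than the exact values from the research:

```python
# A rough sketch of the re-ranking idea Bauer describes: take the ranked list
# a collaborative filter produces and push artists flagged as male down by a
# fixed number of positions (the penalty of 3 is a made-up placeholder).
def rerank(recommendations, gender_lookup, penalty=3):
    """recommendations: artists ordered best-first; gender_lookup: artist -> 'male'/'female'."""
    scored = []
    for position, artist in enumerate(recommendations):
        offset = penalty if gender_lookup.get(artist) == "male" else 0
        scored.append((position + offset, position, artist))
    return [artist for _, _, artist in sorted(scored)]

ranked = ["Artist M1", "Artist M2", "Artist F1", "Artist M3", "Artist F2"]
genders = {"Artist M1": "male", "Artist M2": "male", "Artist M3": "male",
           "Artist F1": "female", "Artist F2": "female"}
print(rerank(ranked, genders))  # female artists climb the list without anyone being removed
```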

Well, that’s great news. And it could be that platforms like Spotify and Pandora are already working to counteract this bias… but then again, they might not be, and the fact is, we just don’t know, because their algorithms are black box processes. I’ll let the expert explain what that means.

“A black box process is where we don’t know what is happening inside,” says Bauer. “We know the input, we know the output, but in between some magic happens. With music recommenders, we don’t know exactly what algorithms are in play, and we don’t know if some of the choices are being curated. For example, a record label could theoretically pay money to have their artists recommended more often. But we don’t know this: all we can do is use the platforms and see the results.” 

On the one hand, this is understandable: music streaming services spend a lot of money perfecting their algorithms, and they aren’t about to share their work for free with the rest of the world. But on the other, it means we have no idea how music platforms are picking which songs are played. And that’s troubling for a number of reasons. 

“Most of us rely on music recommenders because we don’t have time to choose from the millions of available recordings ourselves,” says Bauer. “So, music recommenders can steer our lives in certain directions, or keep us trapped in a particular bubble. For example, if the platform only recommends music that puts you in an unhappy mood, that could be a problem.

“But the biggest issue, I think, is for the artists. These recommenders have an impact on their popularity, their exposure and their income. But if you don’t know what the algorithms are doing, then you have no idea if the system is fair or not. There’s a lack of transparency.” 

And it gets worse (sorry) because not only do we not know for certain whether these systems are biased, we also wouldn’t have much recourse to change them if we did know. Whereas in the past music broadcasters were either publicly owned (like BBC Radio) or publicly regulated (like every other station) and therefore publicly accountable, that’s not the case anymore. If a music streaming platform has a bias there’s not much we can do about it, aside from mass boycotts (unlikely) or unilateral international government action (even more unlikely). 

But perhaps that’s just me being overly negative, because Bauer remains hopeful that bringing attention to these biases will eventually lead to change. 

“We have to ask ourselves: ‘Is it the responsibility of platform providers to care about societal issues, or care about the artists – their core content providers?’ I think it is, but we have to let them know that this is important. From the artist perspective, it makes me sad that these issues have not been addressed already… and I want to help change that if I can.” 

To find out more about Christine Bauer’s research, head to christinebauer.eu.