Algorithms were supposed to be the future for finding new movies and music. But now we’ve realized they’ll never replace your cool friend’s recommendation.

Algorithms can only recommend the same, bland things because they’re based on what we’ve already experienced instead of pushing us toward something truly new. Samantha Lee/Insider
  • Spotify’s music recommendation algorithm fundamentally misunderstands how taste develops.
  • As my friend said, “The way you find out what’s cool is someone cooler than you telling you it’s cool.”
  • These algorithms have flattened taste by mirroring your past behavior back to you, making it harder to find something truly new.
  • Drew is a Brooklyn-based writer who publishes the weekly newsletter Kneeling Bus.
  • This is an opinion column. The thoughts expressed are those of the author.

Nearly ten years ago, a friend and I were discussing the then-nascent phenomenon of algorithmically generated music recommendations from streaming services, and my friend said something I’ll never forget: “The way you find out what’s cool is someone cooler than you telling you it’s cool.”

Spotify had recently launched in the United States while products like Pandora and Last.fm were already more established. The recommendation technology still felt somewhat unrefined, but was poised to continue improving rapidly – Spotify in particular seemed like it would soon know each of us so well that it could present us with new favorite songs and artists we’d never otherwise find.

To my friend and me, who had spent most of our lives discovering music via more traditional methods – getting recommendations from other friends, reading album reviews, listening to the radio – the idea of an approach that sidelined our beloved human curators felt unappealing, if not disturbing.

Unlike other tedious tasks that artificial intelligence promised to automate, choosing our own music was actually fun, something we felt no need to outsource. And yet, it seemed inevitable that listeners would outsource it anyway, regardless of how much they enjoyed the process. Maybe the recommendations would eventually get so good that we’d even welcome the transition.

But Spotify, along with Netflix and many other digital products that provide personalized media recommendations, failed to realize what my friend already knew: Cool recommendations come from cool people. These algorithmic efforts to discern the precise nuances of users’ individual preferences, by iteratively feeding their in-app behavior back to them, reflect a belief that personal taste is a product of nature rather than nurture. In other words, from this algorithmic perspective, our taste is intrinsic and immutable rather than an ever-changing result of our ongoing interactions with our environment.

In music, art, and many other domains, our preferences are largely the result of the culture in which we have grown up – a multitude of people around us telling us what’s cool. A 2016 MIT study found that “musical preferences seem to be mainly shaped by a person’s cultural upbringing and experiences rather than biological factors.” The researchers detected a “profound cultural difference” in different cultures’ responses to consonant and dissonant sounds. “Your brain basically gets tuned to the environment around it,” McGill University neuroscientist Robert Zatorre said.

Taste is deeply social, but Spotify’s user experience pretends it’s not. We’re not waiting for anyone to tell us what’s cool, Spotify implies. We’ve already decided. We just need to look inward to understand what we like – and Spotify wants to help us with that.

Deliberate or not, this misunderstanding of how taste develops may explain why the algorithmic recommendations provided by apps like Spotify haven’t improved all that much, at least relative to expectations.

By minimizing the social context of music and placing each individual within their own algorithmic silo, Spotify undermines the development of taste itself. Users don’t encounter algorithmic recommendations in a vacuum; their preferences have already been socially shaped at the moment they begin streaming. But by replacing an individual’s context-rich cultural milieu, partially or wholly, the algorithm curtails the opportunities for discovering new music the way people always have: from a cooler friend.

Algorithms aren’t cooler than you

In a 2020 Mashable piece titled “An open letter to the most disappointing algorithms in my life,” Chris Taylor writes, “[Spotify’s] once-great Monday playlists have become a monoculture, focused on one kind of music entirely, and I fear it’s partly my fault.”

Taylor describes how, when he resumed running during the pandemic, he started listening exclusively to Drum & Bass music during his runs, even compiling his own 700-song Spotify mix, only to find that his Discover Weekly playlist had soon filled entirely with Drum & Bass – and Spotify’s selections were of lower quality than the playlist Taylor had carefully built.

Not only did Spotify fail to understand that running is a specific activity calling for specific music, it also couldn’t figure out how to recommend new, serendipitous music to Taylor – ostensibly the whole point of having algorithmic recommendations in the first place. In short, Spotify just didn’t seem to know him very well, and had no apparent way of learning more.

Netflix, similarly, has failed to meaningfully improve its user recommendations. In 2017, Netflix replaced its five-star rating system with a simpler thumbs-up and thumbs-down, which in turn generates a “percent match” score that shows Netflix’s estimated probability of a given user liking a movie or show.

In a 2018 Medium essay critiquing the rating system, Pascal Baspeyras observes that the percentage is not an assessment of how much we will like something, but just an estimate of whether we’ll like it at all (and give it a thumbs up). A low percent match score, in other words, indicates that Netflix isn’t really sure whether you’ll like or dislike something. Scores below 55% are not even displayed.

We just want someone to tell us what’s cool

Algorithmic recommendations appear to have a flattening and standardizing effect on taste, and that effect is more pronounced when the deeply social nature of taste is minimized and content is decontextualized. If you listen to Drum & Bass music, the algorithm simply serves you more of it – or perhaps an adjacent genre that you already knew about.

A human DJ, on the other hand, might introduce you to something completely different because they like it, and if you like the DJ – or are just enjoying yourself at a party – there’s a decent chance you’ll like what they select. In the most algorithmically mediated digital environments, like Netflix, there’s nobody cool around to tell you what’s cool – just machines mirroring your own past behavior back to you.

Technology critic Rob Horning has written extensively about the flattening effect of such algorithms, arguing that their goal is not actually to understand the user at all, but instead to “reshape users to experience desire on schedule and in pre-formatted ways.”

By eliminating the context of the outside world, replacing it with an environment that provides its own algorithmic context, and then encouraging us to spend more and more time in that environment, these platforms standardize our taste in a way that “makes it easier for platforms to sell their users on to advertisers.”

Digital advertising has long relied upon matching users to archetypes or “buyer personas” that inform which brands and products they’ll be targeted with – even better if those users evolve to resemble the archetypes more closely. Why expand their horizons if doing so only makes them less valuable? Today, Netflix ratings are almost beside the point: When a blockbuster show like “Squid Game” or “The Queen’s Gambit” appears on the main menu, everyone just watches it.

Interestingly, some platforms’ content algorithms seem to understand their users better, or at least facilitate the kinds of digital serendipity that expand our horizons (even if problematically). These typically happen to be social networks – sites that foreground the individuals making the content – suggesting that human interaction, even in its digital form, is the true engine of taste.

TikTok’s For You page famously serves up an endless stream of personalized videos that glues its users to the screen for hours on end. YouTube’s notorious “rabbit holes,” meanwhile, lead users from innocuous starting points toward increasingly provocative videos that are curated to engage them. And there is growing demand for content that is not algorithmically mediated at all, such as Substack, which delivers blog posts to one’s inbox and lets the user sort it out from there.

The enduring importance of the influencer in its many forms may owe something to this fundamental desire for human recommendations. When someone opens YouTube or TikTok, they’re frequently watching people they like or admire. Those performers, of course, know less about their individual viewers than the algorithms do (even if the algorithms mediate who those viewers see), but it turns out that doesn’t really matter: What we really want is someone cooler than we are telling us what’s cool.