- by Ellie House
- BBC Long Audio
After BBC reporter Ellie House came out as bisexual, she realized Netflix already seemed to know. How did this happen?
I realized I was bisexual my sophomore year of college, but Big Tech seemed to have worked it out several months earlier.
I'd had one long-term boyfriend before that, and I'd always considered myself straight. To be honest, dating wasn't high on my agenda.
At the time, however, I was watching a lot of Netflix, and I was getting more and more recommendations for series with lesbian storylines or bisexual characters.
These were TV series that my friends (people of a similar age, similar background, and similar viewing history) hadn't heard of, let alone recommended.
One show that kept being suggested was You Me Her, about a suburban couple who welcome a third person into their relationship. Filled with quirky storylines and bisexual characters, it has been described as television's "first polyromantic comedy".
It wasn’t just Netflix. Soon after, I noticed similar recommendations on several platforms. Spotify suggested a playlist that it described as “sapphic” — a word to describe women who love women.
After two months on TikTok, I started seeing videos on my feed from bisexual content creators.
A few months later, I came to the realization myself: I am bisexual.
What signals had these tech platforms picked up on that I hadn't noticed myself?
Matching users to content
Netflix has 222 million users worldwide and thousands of movies and series available to stream, across countless genres. But any individual user will, on average, only stream from around six genres a month.
To show people content it thinks they will want to watch, Netflix uses a powerful recommendation system: a network of algorithms that helps determine the videos, images, and promotions that populate a user's homepage.
For example, You Me Her has been tagged with the genre code “100010” — or “LGBTQ+ Stories” to the human eye.
The goal of the recommendation system is to marry the person using the platform to the content.
This digital matchmaking tool takes information from both sides and plots the connections between them. Attributes like the genre of a song, the themes explored in a movie, or the actors featured in a TV show can all be identified. Based on these, the algorithm predicts who is likely to engage with what.
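The matching described above can be sketched as a simple content-based recommender: items carry tags (like Netflix's genre codes), a user's profile is built from the tags of what they have watched, and unseen titles are scored by similarity to that profile. This is only a toy illustration of the general technique, not Netflix's actual system; every title and tag below except You Me Her and the "100010" code is invented.

```python
# Toy content-based recommendation: profile a user by the tags of their
# watch history, then rank unseen titles by tag similarity.
from collections import Counter
import math

# Hypothetical catalog. "100010" stands in for the "LGBTQ+ Stories" genre code.
CATALOG = {
    "You Me Her": {"100010", "romance", "comedy"},
    "Title A":    {"romance", "drama"},
    "Title B":    {"100010", "drama"},
    "Title C":    {"action", "thriller"},
}

def user_profile(watched):
    """Count how often each tag appears in the user's watch history."""
    profile = Counter()
    for title in watched:
        profile.update(CATALOG[title])
    return profile

def score(profile, tags):
    """Cosine similarity between the user's tag counts and an item's tags."""
    dot = sum(profile[t] for t in tags)
    norm = math.sqrt(sum(v * v for v in profile.values())) * math.sqrt(len(tags))
    return dot / norm if norm else 0.0

watched = ["Title A", "Title B"]
profile = user_profile(watched)
unseen = [t for t in CATALOG if t not in watched]
ranked = sorted(unseen, key=lambda t: score(profile, CATALOG[t]), reverse=True)
print(ranked[0])  # prints "You Me Her"
```

Here the user has never streamed an LGBTQ+-tagged title's storyline deliberately, yet the shared "100010" and "romance" tags in their history are enough to push You Me Her to the top of the unseen list.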
“Big data is this huge mountain,” says Todd Yellin, Netflix’s vice-president of product, in a video for the Future of StoryTelling website. “Through cutting-edge machine learning techniques, we are trying to figure out — what are the important signals?”
But what do these platforms know about their users — and how do they discover them?
Under UK data privacy laws, individuals have the right to know what data an organization holds about them. Many broadcasting and social media companies have created an automated system for users to request this information.
I downloaded all my information from eight of the biggest platforms. Facebook was keeping track of other websites I visited, including a language learning tool and hotel listing sites. It also contained the coordinates of my home address, in a folder titled “Location”.
Instagram had a list of over 300 different topics that it thought I’d be interested in, which it used for personalized advertising.
Netflix sent me a spreadsheet detailing every trailer and show I watched, when, on what device, and whether it played automatically or if I selected it.
Other platforms hold similar records. Netflix told me that what a user watches and how they interact with the app are better predictors of their tastes than demographic data, like age or gender.
How you watch, not what you watch
“No one explicitly tells Netflix that they are gay,” says Greg Serapio-García, a PhD student at the University of Cambridge specializing in computational social psychology. But the platform could look at users who liked “gay content”.
The user does not have to previously stream LGBT+ flagged content to receive these suggestions. Recommendation systems go deeper than this.
According to Serapio-García, one possibility is that watching certain non-LGBTQ+ movies and TV shows could help the algorithm predict your “tendency to like gay content”.
What someone watches is only part of the equation; often, the way someone uses a platform can be more telling.
Other details can also be used to make predictions about a user – for example, what proportion of their time is spent watching continuously, or whether they scroll through the credits.
According to Serapio-García, these habits might not mean anything on their own, but taken together across millions of users they can be used to make “really specific predictions”.
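The point above can be illustrated with a toy model: several behavioral signals, each weak on its own, are combined into a single prediction score. The signal names and weights here are invented for illustration; in practice such weights would be learned from data across millions of users, and nothing here reflects Netflix's actual features or model.

```python
import math

def predict_interest(signals, weights, bias=0.0):
    """Weighted sum of behavioral signals squashed to a 0-1 score (logistic)."""
    z = bias + sum(weights[name] * value for name, value in signals.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical per-user habits, each nearly meaningless in isolation:
signals = {
    "binge_fraction": 0.8,    # share of viewing done back-to-back
    "skips_credits": 1.0,     # 1 if the user skips end credits
    "late_night_share": 0.4,  # share of viewing after midnight
}
# Illustrative learned weights for one piece of content:
weights = {"binge_fraction": 0.5, "skips_credits": 0.2, "late_night_share": 0.3}

score = predict_interest(signals, weights)
print(f"predicted interest: {score:.2f}")
```

Any single input barely moves the output, but a model fed dozens of such signals, fit against the behavior of a vast user base, can become strikingly specific.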
So perhaps the Netflix algorithm predicted my interest in LGBT+ stories based not just on what I had watched in the past, but also on when I clicked on it, what device I was watching on, and at what time.
To me, it’s a matter of curiosity, but in countries where homosexuality is illegal, Serapio-García thinks it could put people in danger.
Speaking with LGBT+ people around the world, I’ve heard mixed feelings. On the one hand, they often like what streaming sites recommend to them – some even find it liberating.
On the other hand, they are uneasy.
“I feel like this is an intrusion on our privacy,” one gay man, whom we are keeping anonymous for his safety, told me.
“It gives you a little more knowledge of what your life could be like if you were free. And that feels nice and good.” But, he adds, the algorithms “really scare me a little bit”.