How Netflix Might've Known About Your Love Life Before You Did

This is the story of Ellie House. A modern-day Cassandra of sorts – but instead of predicting the downfall of Troy, she was unintentionally forecasting her own bisexuality. And her crystal ball? Netflix.

Ellie, a 24-year-old producer for the BBC, was, like many of us, bingeing Netflix and chilling. But her Netflix seemed to be doing a bit more... intuitive chilling, recommending TV shows with lesbian or bi storylines. Now, let's be clear: it's not like Netflix handed her a pamphlet titled "So You Might Be Bi". It was subtler, like Gladwell's 'Tipping Point', but for sexual realisation.


Imagine sitting in a personalisation boardroom:

Consultant 1: "Our data suggests that Ellie might appreciate more LGBTQ+ content."

Consultant 2, leaning back: "Let’s give her ‘You Me Her’ and see how she reacts."


Netflix's algorithms are not just complex; they're practically clairvoyant. They take into account not just what you watch, but how you watch it. Rewind a scene? Pause at a climax? It's noted. It's as if Julius Caesar whispered, "Et tu, Netflix?" as the streaming giant pulled back the curtain on his viewing habits.
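If you want to picture what "it's noted" might look like under the hood, here's a minimal, purely illustrative sketch. The signal names and weights are inventions for this post, not Netflix's actual telemetry, but they show how behaviour, not just titles, can be folded into an affinity score.

```python
# Toy illustration of implicit-feedback scoring. The event names and
# weights below are hypothetical, invented for this post.
from collections import defaultdict

SIGNAL_WEIGHTS = {
    "completed": 3.0,   # finished the episode or film
    "rewound":   1.5,   # replayed a scene
    "paused":    0.5,   # lingered mid-scene
    "rewatched": 4.0,   # came back for a second viewing
    "abandoned": -2.0,  # bailed out early
}

def title_affinities(events):
    """Aggregate raw viewing events into a per-title affinity score.

    `events` is an iterable of (title, signal) pairs,
    e.g. ("You Me Her", "rewound").
    """
    scores = defaultdict(float)
    for title, signal in events:
        scores[title] += SIGNAL_WEIGHTS.get(signal, 0.0)
    return dict(scores)

# Two viewers who technically "watched the same show":
viewer_a = [("You Me Her", "paused"), ("You Me Her", "abandoned")]
viewer_b = [("You Me Her", "completed"), ("You Me Her", "rewound"),
            ("You Me Her", "rewatched")]

print(title_affinities(viewer_a))  # {'You Me Her': -1.5}
print(title_affinities(viewer_b))  # {'You Me Her': 8.5}
```

The point isn't the numbers; it's that two accounts with identical watch histories can end up with very different scores once the "how" is counted.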

But Netflix is not standing alone in the 'Big Tech Oracle' line. Ellie's foray into TikTok seemed to trigger the app's spidey-senses too. Not long after joining, her feed was populated with content from bisexual creators. Spotify, not one to be left behind, serenaded her with 'sapphic' tunes. It’s like Big Tech went to ancient Delphi, got a prophecy, and thought, "Ah, Ellie would love this!"

Now, before anyone cries foul, these platforms insist that their algorithms are merely matching viewing habits to content you're likely to enjoy. And remember, in the vast streaming sea, sexual orientation is not a category the platforms claim to track. In Netflix's defence, the system isn't declaring, "Ah, a bisexual viewer in the making!" It's nudging: "Hey, based on your viewing patterns, you might like this."

Greg Serapio-Garcia, a sage from the University of Cambridge (okay, he's a PhD student in computational psychology), offers a gem: it's not *just* about what you watch, but *how* you watch it. Imagine two viewers, both watching 'Pride and Prejudice'. Viewer A lingers on Mr. Darcy's dripping-wet-shirt scene. Viewer B re-watches the intense verbal sparring between Elizabeth and Darcy. Different strokes for different folks, literally and algorithmically.
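To make that "different strokes" point concrete, here's another toy sketch: give each scene some hypothetical content tags, weight them by how often a viewer replays that scene, and score a candidate show against the resulting taste profile. The tags, weights and dot-product scoring are assumptions for illustration, not a documented description of any platform's recommender.

```python
# Toy sketch of how scene-level attention could shape a taste profile.
# Scene tags, replay counts and scoring are illustrative assumptions.
import numpy as np

TAGS = ["romance", "period_drama", "witty_dialogue", "lgbtq_storyline"]

# Hypothetical per-scene tag vectors for the same film.
SCENES = {
    "darcy_lake_scene": np.array([1.0, 1.0, 0.0, 0.0]),
    "verbal_sparring":  np.array([0.5, 1.0, 1.0, 0.0]),
}

def taste_profile(scene_watch_counts):
    """Weight each scene's tags by how often the viewer replayed it."""
    profile = np.zeros(len(TAGS))
    for scene, count in scene_watch_counts.items():
        profile += count * SCENES[scene]
    norm = np.linalg.norm(profile)
    return profile / norm if norm else profile

# Viewer A replays the lake scene; Viewer B replays the sparring.
profile_a = taste_profile({"darcy_lake_scene": 3, "verbal_sparring": 1})
profile_b = taste_profile({"darcy_lake_scene": 1, "verbal_sparring": 3})

# A candidate show is scored by similarity to each viewer's profile.
candidate = np.array([0.8, 0.2, 0.9, 0.6])   # hypothetical tag vector
print(round(float(profile_a @ candidate), 2))  # ~0.83
print(round(float(profile_b @ candidate), 2))  # ~0.98
```

Same film, different scenes, measurably different recommendations.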

In a twist that even Gladwell might find jarring, this intuitive technology isn't all sunshine and rainbows. Ellie's documentary sheds light on the dark side of recommendations. In places like Uganda, where being gay can lead to dire consequences, unintended revelations from algorithms can be lethal. When TikTok started suggesting LGBTQ+ content to Robert, a pseudonymous Ugandan, it resulted in familial ostracisation.

If we were to invite Socrates to this conversation, he might say, "Know thyself," but perhaps we need to add, "...before thy streaming platform does." As Michal Kosinski, a computational psychologist, rightly points out, while this tech is illuminating, it's also "too useful and too profitable" to change.

In a world where our next playlist might just hold the key to our innermost feelings, it's essential to remember that algorithms, while smart, aren't fortune tellers. They might hint, suggest, and even nudge, but self-realisation, like Ellie's path to understanding her bisexuality, remains a deeply personal journey.

So, next time Netflix suggests a rom-com that seems out of left field, laugh it off, grab some popcorn, and remember – it's just a suggestion. After all, it isn't the arrow of Eros; it just sometimes lands close to the heart.


Inspired by

https://www.bbc.co.uk/news/technology-66472938

https://www.thetimes.co.uk/article/i-didn-t-realise-i-was-bisexual-so-how-did-netflix-know-vzxl83csg

Gianfranco Cuzziol