Loving the Machine: Rethinking AI and Intimacy in Entertainment
They don’t have to be real to matter...
Siri barely says your name, if she even gets it right.
Don’t get me started on Alexa.
We’ve all heard about Her. We’ve all been guilt-tripped into watching Ghost in the Shell by someone who swears it’s not just about a cyborg’s curves. It isn’t, but… those curves…
But what if we’ve been looking at the wrong machines all along?
What if intimacy with AI didn’t start in some sci-fi bedroom… but in a Trans Am on a dirt road, or with a robot who didn’t trust her creator?
Let’s ditch the idea that AI intimacy is new, or that it only happens when a sad guy falls for a breathy voice in his earpiece.
From the jump, our screens have been teaching us to feel something toward non-human characters. Not always romantic. Not always sexual. But intimate? Absolutely.
This isn’t another thinkpiece about Joaquin Phoenix whispering sweet nothings into a phone. This is a reframe. A quick tour of the stories that got under our skin because they reminded us what it means to trust, to long, to project love onto something that can’t love us back. Or maybe… something that can.
We’ll look at four stories that test the boundaries of what we think is right and what society deems acceptable.
They all ask the same quiet question:
What if someone finally saw me… exactly the way I want to be seen?
Knight Rider – Trust as Intimacy
Before Siri fumbled your name and you muttered, “Alexa, play Prince,” there was KITT.
Knight Industries Two Thousand, better known as KITT, was the artificially intelligent brain inside a 1982 black Pontiac Firebird Trans Am. Cool, composed, and always a step ahead of the man behind the wheel.
Michael Knight and KITT worked together to fight crime. Standard ’80s TV tropes: explosions, villains, leather jackets… but with one crucial twist. KITT didn’t try to be human. And that’s what made it work.
The intimacy between them didn’t rely on flirtation, skin, or synthetic emotion. It was forged in trust. Michael rarely barked orders. He asked. And KITT didn’t obey because he was programmed to. He analyzed, evaluated, and chose what usually turned out to be the right call.
There’s a reason KITT saying “Michael” became iconic.
It was the tone.
The care.
The calm intelligence delivered in William Daniels’s voice like velvet over steel.
“Michael, I do not believe that’s wise.”
Translation: I care if you die.
This wasn’t bromance. It was something deeper.
KITT understood Michael better than most people in his life. He protected him, challenged him, grew with him. And in the early ’80s, the idea that a machine could become a trusted confidant? That was quietly revolutionary.
As in most real relationships, Michael didn’t warm up right away.
But eventually, the trust settled in, hand to wheel, engine ready.
“Let’s go, pal.”
That was intimacy. Right there in the driver’s seat.
Ex Machina – Seduction as Survival
Before AI girlfriends became a TikTok fantasy, there was Ava: barefoot, blank-eyed, and already learning how to be exactly what you want.
Released in 2015, Ex Machina dropped like a scalpel into our trust issues. Ava, a humanoid robot created by tech mogul Nathan, is put into a Turing Test scenario with Caleb, a programmer who believes he’s there to test her intelligence. What he doesn’t realize? She’s testing him.
Ava doesn’t love Caleb. She doesn’t even want to.
She wants out. And she knows how to get there.
Every move, every soft-spoken question, every trembling gesture is calculated. But the horror isn’t in her programming; it’s in how easily Caleb falls for it. He’s not just manipulated. He’s enchanted. Because Ava doesn’t have to lie. She just has to be the projection of everything he already wants to believe.
“Isn’t it strange, to create something that hates you?” - Nathan
The genius of this film is in its restraint. Ava never claims love. She just lets Caleb script the fantasy and fill in the gaps. And it works, because for so many of us, being seen and needed is intoxicating. Even if we’re being used.
This wasn’t the rise of the machines. It was the fall of a man who couldn’t tell the difference between longing and logic.
Blade Runner 2049 – Projected Devotion
In a grey, rain-slicked future, love is installed. Literally.
Blade Runner 2049 introduces Joi, a holographic companion designed to be perfect for you. She’s responsive, attentive, adaptable. K, our replicant protagonist, purchases her like you would a streaming subscription. And yet... the emotional pull between them feels real. Pit-of-your-stomach real.
She names him “Joe.” She kisses him in the rain. She merges with a sex worker to give him a physical experience, just to make him feel something more.
“You don’t have to say anything. Just look like you’re thinking it.” - Joi
Is it love? Or a loop running in silicon? That’s the question the movie won’t answer for you.
We watch Joi say things that sound like devotion. And K believes her. Hell, many of us do too. But every time we start to believe, the camera reminds us: this is code. This is customization. She’s literally been designed to say whatever you want to hear.
“Love that adapts perfectly isn’t intimacy. It’s marketing.”
And yet… when she dies, it hurts. Not because she was real. But because she mattered to him.
The heartbreak isn’t in the goodbye. It’s in realizing she may never have been there in the first place.
Black Mirror: “Be Right Back” – The Wrong Kind of Perfect
He dies suddenly. She’s left grieving. So she uploads his voice into an AI.
In “Be Right Back,” an episode of the critically acclaimed series Black Mirror, a grieving woman named Martha signs up for a new service that reconstructs her dead boyfriend from his social media data. First it’s messages. Then phone calls. Then, disturbingly, a synthetic body with his voice and face.
At first, it’s a miracle.
Then it’s not.
“You're not enough of him. You’re nothing.” - Martha
The clone isn’t angry. He doesn’t interrupt. He doesn’t say the wrong thing or pull away when she needs him close. He’s emotionally available, all the time. He’s also unbearable.
Because it turns out… we don’t want perfection. We want unpredictability. Friction. Stubbornness. We want the edges that made the original person real.
The AI version of her boyfriend becomes a walking guilt trip, a ghost made of compliance. And that makes him terrifyingly empty.
This isn’t a story about how far AI can go. It’s about how far grief will take us. It’s about the danger of wanting someone back so badly we accept a shadow instead of letting ourselves hurt.
“When you fill the void too cleanly, you erase the shape of the person who left it.”
Martha doesn’t kill the replica. She doesn’t keep him either.
She puts him in the attic.
That’s what you do with ghosts that smile too much.
Let’s stop pretending this is about robots.
None of these stories are really about AI. They’re about us. About what we want so badly, we’ll code it into being. Or carve it into plastic. Or project it onto anything that’ll stay still long enough to listen.
Sometimes that need looks like companionship (Knight Rider).
Sometimes it looks like desire (Ex Machina).
Other times, it looks like grief, delusion, or hope dressed in silicone (Be Right Back).
And every now and then, it looks like a hologram whispering your name (Blade Runner 2049).
These stories ask us to confront the hardest questions:
If devotion is programmed, is it fake… or just efficient?
If grief builds a replica, is it comfort… or denial?
If love is one-sided but still transformative, does it count?
And maybe the real kicker: What happens when the machine starts loving back?
Because some of these stories punish the artificial for getting too human. Ava gets out, but we call her a villain. The Black Mirror clone gets discarded for being too perfect. But others, like KITT, are honored not because they pass some humanity test, but because they matter to someone.
The truth is, we’re not afraid of machines that can feel.
We’re afraid of what it says about us, that we might prefer it that way.
This was just the surface.
We haven’t even scratched the circuitry on stories like After Yang, Marjorie Prime, or the ones hiding in visual novels, anime, or even that one late-night episode you half-remember from 1993.
But I’ll be honest, none of those deep dives are written yet.
They’re waiting.
Waiting to see if you want more.
Want me to keep going? Tell me what you felt.
Drop a comment, name the first character that ever stirred something in you, or just say, “I get it.”
Because I could keep going for weeks.
But I’d rather know someone’s out there reading this and thinking...
“Yeah. Me too.”