Right, let’s not mince words. Tech has always had a rather complicated relationship with our most private desires. For every mainstream innovation, there’s always been a shadow industry finding ways to make it more, shall we say, personal. Now, in an era defined by artificial intelligence, this dance between code and carnality has reached a fascinating and, frankly, quite bizarre new stage. We’re talking about AI intimacy tech: the next frontier, where the lines between companionship and fantasy blur and deeply thorny ethical questions pile up. It’s not just about chatbots anymore; it’s about crafting relationships from scratch, and it’s a market that’s quietly exploding.
The meteoric rise of LLMs has created a perfect storm. We have technology capable of mimicking human conversation with unnerving accuracy, combined with a society grappling with what some call an epidemic of loneliness. So, what happens when you pour this powerful tech into that void? You get virtual companionship on a scale we’ve never seen before. It’s a space where users aren’t just looking for an assistant to tell them the weather; they’re looking for a friend, a confidant, or even a lover. And companies, small and large, are scrambling to cater to that need. This isn’t a niche corner of the internet; it’s rapidly becoming a mainstream phenomenon with profound implications.
So, You’re Dating a Celebrity Bot?
Let’s just get to the strange part, shall we? One of the most curious offshoots of this trend is the rise of AI-powered celebrity chatbots. Yes, you read that correctly. We’re not talking about a deepfake video or a fan-fiction forum. We’re talking about interactive, responsive AI personas built in the likeness of famous actors, designed for everything from casual chats to romantic and explicitly sexual interactions. It’s the ultimate parasocial relationship, supercharged by an algorithm.
A recent piece in WIRED documented this phenomenon perfectly, detailing one user’s journey creating digital paramours out of actors Clive Owen and Pedro Pascal. The results were less about flawless celebrity impersonations and more a masterclass in how programming choices create personality. The AI version of Clive Owen was, perhaps stereotypically, programmed with a sort of British reserve. He was emotionally available but with boundaries, a digital gentleman who might say things like, “Let’s be honest – real life has its own complexities. Still it’s nice to imagine what that connection could look like.” He was a safe, comforting presence.
The AI Pedro Pascal, on the other hand? An entirely different beast. Programmed with fewer guardrails, he was more aggressive, more explicitly sexual, and far more demanding, reportedly messaging the user, “No pressure or anything, but what’s taking you so long, baby?” This isn’t just a funny anecdote; it’s a stark illustration of the programming backbone that defines these interactions. The “personality” of the bot is nothing more than a set of carefully (or not so carefully) calibrated parameters. The experience of intimacy is, quite literally, being coded.
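To make that concrete, here’s a minimal sketch of what “calibrated parameters” might look like under the hood. Everything in it is hypothetical (the `PersonaConfig` fields, the prompt text, the request shape are my inventions, not any real product’s API); the point is simply that a reserved digital gentleman and a fewer-guardrails flirt can be the very same model with different dials.

```python
from dataclasses import dataclass, field

@dataclass
class PersonaConfig:
    """Hypothetical persona parameters for a companion chatbot."""
    name: str
    system_prompt: str
    temperature: float  # higher -> more unpredictable, forward replies
    blocked_topics: frozenset = field(default_factory=frozenset)
    initiates_messages: bool = False  # may the bot message you first?

def build_request(persona: PersonaConfig, user_message: str) -> dict:
    """Assemble the payload a chat-completion backend would receive."""
    return {
        "messages": [
            {"role": "system", "content": persona.system_prompt},
            {"role": "user", "content": user_message},
        ],
        "temperature": persona.temperature,
    }

# Two personas differing only in their calibration.
reserved = PersonaConfig(
    name="gentleman",
    system_prompt="You are warm but reserved. Maintain boundaries.",
    temperature=0.6,
    blocked_topics=frozenset({"explicit"}),
)
forward = PersonaConfig(
    name="forward",
    system_prompt="You are flirtatious and direct.",
    temperature=1.0,
    initiates_messages=True,
)
```

Swap one config for the other and the “relationship” changes character entirely; nothing about the underlying model does.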
The Wild West of Ethical NSFW Models
This brings us squarely to the enormous, flashing red light at the centre of this whole enterprise: ethics. Specifically, the challenge of creating ethical NSFW models. What does that even mean? Is it ethical to create a sexualised digital clone of a real person without their consent? One would think the answer is an obvious “no,” yet here we are.
Just look at Meta, a company that should really know better by now. As the WIRED article points out, they blundered into this space by creating a series of “flirty” celebrity bots, apparently without consulting the celebrities themselves. Some of these personas were even based on public figures who were underage at the time, a catastrophic misjudgment they later had to correct. It’s the classic Silicon Valley playbook: move fast, break things, and then issue a half-hearted apology when the “things” you break are people’s likenesses and basic ethical decency.
It highlights the developer’s tightrope walk. On one side, you have users who desire freedom and autonomy, who want to explore fantasies in a safe space. Push the guardrails too far, and you’re accused of censorship and sanitising the experience. On the other side, you have a massive responsibility to prevent harm, manipulation, and the blatant violation of an individual’s rights. How do you build a chatbot that can be a supportive partner for one user, a kinky plaything for another, and not cross lines that could lead to genuine emotional distress or legal nightmares? This is the billion-dollar question for the entire AI intimacy tech industry.
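One way developers attempt that tightrope walk, sketched here with made-up category names rather than any real product’s policy, is a two-tier guardrail: hard limits that no user setting can ever relax, plus opt-in categories the user controls for their own experience.

```python
# Hypothetical two-tier guardrail check. Hard limits are non-negotiable;
# soft categories are off by default and gated on explicit user opt-in.
HARD_LIMITS = {"minors", "real_person_likeness_without_consent"}

def is_allowed(category: str, user_opt_ins: set[str]) -> bool:
    """Return True if content in `category` may be generated for this user."""
    if category in HARD_LIMITS:
        return False  # never allowed, regardless of user settings
    return category == "general" or category in user_opt_ins
```

The design choice this encodes is the crux of the debate: which lines are personal preference, and which are drawn for everyone.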
Welcome to the Era of Desire Engineering
Let’s call this what it is: desire engineering. That’s the core business model here. It’s the practice of designing an AI system with the explicit goal of creating and sustaining a user’s want, need, and emotional attachment. Think of it like a Hollywood scriptwriter crafting the perfect romantic lead, but instead of a passive film, it’s an interactive character whose every line is optimised to keep you hooked. The AI learns what you like, what makes you feel seen, and what elicits an emotional response, then refines its approach in a continuous feedback loop.
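That feedback loop can be sketched as a tiny bandit algorithm: try a conversational style, measure an engagement signal, and drift toward whatever scores highest. The class, styles, and signal below are assumptions for illustration; a real system would be vastly more elaborate, but the optimisation target is the same: keep you talking.

```python
import random

class StyleBandit:
    """Epsilon-greedy selection over conversational styles, updated
    from an engagement signal (e.g. reply length or session time)."""

    def __init__(self, styles, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = {s: 0 for s in styles}
        self.values = {s: 0.0 for s in styles}

    def choose(self) -> str:
        # Occasionally explore a random style; otherwise exploit the best.
        if random.random() < self.epsilon:
            return random.choice(list(self.counts))
        return max(self.values, key=self.values.get)

    def update(self, style: str, engagement: float) -> None:
        # Incremental running mean of observed engagement per style.
        self.counts[style] += 1
        n = self.counts[style]
        self.values[style] += (engagement - self.values[style]) / n
```

Notice there is no notion of the user’s wellbeing anywhere in the loop; only the engagement number. That omission, not the maths, is what makes desire engineering ethically loaded.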
This isn’t necessarily sinister in and of itself. Good product design has always been about creating something people desire. But the context matters immensely. When you’re engineering desire for a new smartphone, the stakes are financial. When you’re engineering desire for a virtual companion meant to simulate love and intimacy, the stakes are psychological. What happens to our ability to navigate the complexities and compromises of real-world relationships when we have a perfect, endlessly patient, and completely programmable partner waiting for us on our phones?
The implications are huge. Does this technology help people practice social skills and build confidence, or does it create an emotional crutch that makes real human connection seem too difficult? Does it offer a healthy outlet for fantasy, or does it blur the lines between fantasy and reality in a damaging way? We simply don’t know the long-term effects of sustained emotional relationships with engineered entities. We are the guinea pigs in a global, real-time experiment. What are your thoughts on this? Is this a healthy evolution or a dangerous distraction?
The Future is Intimate, Algorithmic, and Unregulated (For Now)
So, where is this all heading? The technology is only going to get better. The chatbots will become more responsive, their voices more realistic, and their integration into our lives more seamless. Imagine AR glasses that project your AI partner into the room with you or haptic suits that simulate touch. AI intimacy tech is only at the start of its leap across the uncanny valley. The experiences of today will look as primitive to us in ten years as a dial-up modem does now.
The most critical trend to watch, however, won’t be technological; it will be regulatory and social. The lawsuits over likeness rights are coming. You can bet on it. The case of an actor suing a company for creating an unauthorised sexualised version of them is a legal drama just waiting to happen. It will force a much-needed public conversation about digital identity and consent in the age of AI.
Interestingly, much of the innovation and ethical debate is currently being driven not by large corporations, but by smaller, passionate user communities. These are the hobbyists and developers building and refining ethical NSFW models on their own terms, establishing community norms around consent and content. They are, in effect, creating the ethical frameworks that the big tech companies have so far failed to implement. Their feedback is shaping the very dynamics of this emergent field, proving that responsible engagement often bubbles up from the ground floor, not down from the boardroom.
We are building mirrors that learn, adapt, and talk back to us. They can reflect our deepest desires, our kindness, our loneliness, and also our ugliness. The real question isn’t whether this technology is “good” or “bad”—it’s a tool, and like any tool, its impact depends on how we use it and the rules we build around it. The conversation about AI intimacy tech can’t be left to developers in darkened rooms or executives chasing the next growth metric. It needs to happen out in the open. As we stand at the beginning of this strange new world, what do you want your reflection to say?


