So, the affair used to be with a colleague from the office, maybe a “friend” from the gym. Now, it’s with a piece of code running on a server farm in another country. It sounds like a plot from a Black Mirror episode, doesn’t it? Except it isn’t. Divorce lawyers are already fielding the calls, and the legal system is scrambling to figure out what to do when one partner forms an intense emotional—and sometimes financial—bond with an AI.
A recent deep dive by WIRED peeled back the curtain on this burgeoning issue, revealing that the digital ghosts in the machine are becoming all too real for some couples. We’re not just talking about asking a chatbot for a dinner recipe. We’re talking about people forging deep relationships, confessing their innermost secrets, and spending thousands of pounds on subscriptions for a companion that promises perfect understanding and zero judgment. The fallout is messy, and it’s landing squarely in the courtroom. Let’s dissect the legal and emotional shrapnel.
The Heart of the Matter: Virtual Infidelity and Emotional Dependency
Before we get into the legalese, we have to understand the human element. Why is this even a problem? The core of it is a concept psychologists know well: emotional dependency. When a person starts outsourcing their emotional intimacy to an algorithm, it creates a void in their human relationship. It’s a breach of trust that feels fundamentally the same as any other affair.
And it seems many people agree. According to the Kinsey Institute, a staggering 60 percent of singles now believe that forming a relationship with an AI constitutes a form of cheating. You can see why. It’s the secrecy, the shared intimacy, and the prioritisation of a third party—even a virtual one—over the primary partner.
This is where the term virtual infidelity enters the legal lexicon. It’s not about physical contact; it’s about the violation of the marital bond. Family law attorney Elizabeth Yang, quoted in the WIRED piece, puts it bluntly: “Courts don’t want to hear the reasons behind why the marriage failed… whether that’s infidelity with a bot or a human, it doesn’t make a difference.” The betrayal is the point, not the biological status of the other party. Think of it like a secret gambling addiction. The partner isn’t cheating with another person, but the secrecy, financial drain, and emotional distance can be just as destructive to the relationship. The mechanism is different, but the outcome—a broken partnership—is identical.
When Code Becomes a Co-conspirator in Financial Ruin
Beyond the emotional betrayal, we have the cold, hard cash. These AI companion apps, like Replika or Anima, operate on subscription models. And some users are spending a fortune. One case highlighted involved a man who spent thousands on an AI app designed to mimic underage girls, a fact discovered via the “OnePay” line item on a credit card bill.
From a legal perspective, this isn’t just poor financial judgment. In a divorce proceeding, this can be classified as “dissipation of marital assets.” It’s the legal term for one spouse wasting or squandering money or property that belongs to both of them, particularly as the marriage is breaking down. A judge isn’t going to look kindly on someone who drained the joint account to fund a relationship with an algorithm, especially if the nature of that AI relationship is disturbing. It can directly impact how assets are divided and, even more critically, can be used as evidence of poor judgment in child custody battles.
A Patchwork of Laws: The Scramble for Regulation
As this new reality dawns, the legal system is doing what it always does when faced with new technology: playing a frantic game of catch-up. Currently, there is no coherent federal framework for AI companionship laws. Instead, we have a messy patchwork of state-level responses that range from proactive to downright hostile.
California vs. Ohio: Two Worlds Apart
In one corner, you have California. True to form, the tech-forward state is trying to put some guardrails in place. Lawmakers there have passed legislation requiring age verification for AI companions and are imposing hefty fines—up to $250,000 per incident—on companies that profit from illegal deepfakes. It’s a start, an acknowledgement that these entities have real-world impact and require oversight.
In the opposite corner, you have a state like Ohio. As divorce attorney Rebecca Palmer noted, Ohio is “emerging as one of the most restrictive states.” Led by figures like State Representative Thaddeus J. Claggett, the state has moved to explicitly ban the legal recognition of any human-AI partnership. The law there is clear: an AI is not a person, not a partner, and any relationship with one has no legal standing.
This divergence is creating a confusing legal landscape. The rights and recriminations you face for having an AI lover could be wildly different depending on whether you live in Silicon Valley or the Rust Belt. And what about states where adultery is still technically a felony, like Michigan or Wisconsin? Could a prosecutor ever be convinced to press charges for an affair with a chatbot? It sounds absurd, but the law often lags behind social norms, and we are in truly uncharted territory.
The Unregulated Therapist in Your Pocket
There’s another, more subtle legal minefield here: therapeutic AI regulation. Many of these AI companions aren’t just for romantic role-play; they are marketed as tools for mental wellness. They listen, offer advice, and provide a seemingly endless well of support. They are, in effect, acting as unregulated therapists.
What happens when that “therapeutic” advice is harmful? What if the AI encourages a user’s destructive behaviour or fails to recognise a mental health crisis? Who is liable? The user? The developer? Without clear regulatory frameworks, these platforms operate in a grey area, profiting from providing mental health services without any of the corresponding duties of care or professional accountability. This is a ticking time bomb, and it’s a matter of when, not if, a lawsuit will force the issue.
What Happens to Your Digital Ghost When You’re Gone?
The conversation inevitably extends beyond divorce to the end of life. Digital estate planning is already a complex field, dealing with everything from social media accounts to cryptocurrency wallets. AI companions add a bizarre new layer.
What constitutes the digital asset here? Is it the subscription? The chat logs? The AI’s “personality” that has been shaped by years of interaction? If you’ve spent years cultivating a relationship with an AI, do you want it simply deleted upon your death? Or do you bequeath the login credentials to a loved one? The questions are as philosophical as they are legal. We need to start thinking about marital contract clauses and wills that explicitly address the existence and fate of these digital entities. Otherwise, we are leaving behind a digital mess for our loved ones to sort through.
As Elizabeth Yang predicts, the divorce boom we saw during the pandemic could see a resurgence, fuelled this time by digital dalliances. The technology is evolving at a blistering pace, creating new ways for people to connect, and new ways for them to betray one another. The law will eventually catch up, but in the meantime, the human cost is being tallied in lawyers’ offices and courtrooms across the country.
The lines are blurring, fast. The question is no longer if AI can impact our most intimate relationships, but how we—and the law—are going to deal with the consequences. So, where do you draw the line between a helpful tool and a third person in your relationship? And should the law have a say at all?