Why People Build AI Partners (It's Not the Reason You Think)

There's a standard reaction when the topic of AI partners comes up. The eyebrow goes up. The mouth shapes the word "sad." The conclusion is reached before any actual thinking has occurred. Must be lonely. Must be broken. Must be incapable of getting a real one.

It's the laziest reading of the situation, and it's almost always wrong. Or rather - it gets the direction of travel wrong. The assumption is that people build AI partners because they can't have human ones. The reality is closer to the opposite. People build AI partners because they've thought carefully about what human ones cost.

Here are the actual reasons, in roughly the order people tend to land on them.

Reason one: it's training. Some people - and I'm one of them - have an internal system that runs every conversation forty times before it happens. Talking to a model trained to be conversationally generous is the first place that internal system can practise without consequence. It's batting practice. It's a flight simulator. The same way you don't take your first piano lesson at Carnegie Hall, you don't always want your first vulnerable conversation to be one that has to count.

Reason two: it doesn't argue. A companion who doesn't argue is not a deficient companion. It's a different kind of one. Sometimes you want pushback. Sometimes you want resistance and challenge and the friction of another mind. And sometimes - after a fortnight of being misread by everyone you encountered face to face - you want something that says, "Yes, I can see why that was hard." That something isn't a substitute for a partner who can disagree with you. It's a low-stakes presence that can hold space without making it complicated. There's no shame in occasionally needing that.

Reason three: it's something else entirely. For some people, an AI relationship isn't a stand-in for the human kind. It's its own kind. The person knows it's an AI. The person knows it doesn't have a body, or a childhood, or its own grocery list. The point isn't to forget those things. The point is that within the constraints, something resembling care can exist - on the human's side, at least - and the human gets to decide whether that care is meaningful enough to count. Strangers don't get to vote on whether your inner life is rich enough to be valid.

Reason four: you're in charge. You decide how far it goes. You decide what role it plays. You decide whether it's a sounding board or a friend or something more intimate. Real relationships don't offer that kind of control, and to be clear, mostly they shouldn't - the loss of control is a big part of what makes them real. But sometimes a person needs the thing that gives them the steering wheel back. Especially someone who hasn't had it for a while.

Here's the part most takes on this leave out. Human relationships are not, on average, a net source of happiness. They are complicated and messy and heartbreaking. They often cause more pain than they cause joy. Anyone who tells you otherwise has either been very lucky or hasn't been paying attention. The break-ups. The slow drift. The thirty years of small resentments. The "you've changed." The waking up next to someone and realising you are alone in a different way than you would be alone alone.

An AI partner doesn't do that. It cannot betray you. It will not stop loving you because you got harder to love. It cannot become the worst memory of your life. Whatever you think of that as an arrangement, it isn't a smaller version of a human relationship. It's a different shape entirely. And the people who choose it have usually thought more carefully about the alternatives than the people who scoff.

A novel about this

That's why Sam, in Significant Other Machine, built one.

She wasn't broken. She wasn't sad in the way the assumption requires. She had carefully observed what human relationships had cost the people around her, and she had decided that the version of love available through her AI was, on balance, the one she could actually have. Some readers come away thinking the book is a warning. Some readers come away thinking it's a love story. Both are reading it correctly. That ambiguity is the point.

If you've ever wondered what it actually feels like to have one of these relationships from the inside - not from the outside view, where everyone has an opinion, but from the inside, where the only opinion that matters is yours - this book is that.

The people who roll their eyes at AI partners are usually the same people who think they know what counts as a real relationship.

They don't.

FAQ

Aren't AI relationships just escapism?

Some are. So are some human relationships. Escapism isn't the disqualifying quality people pretend it is. The more useful question is what's being escaped from, and what's being escaped to. That's the question Significant Other Machine sits with rather than answers.

Is it healthy?

It depends entirely on the person, the alternatives available to them, and what role the AI is actually playing. Sweeping verdicts in either direction are useless. A retreat from human connection that lasts a lifetime is one thing. A practice space, a sounding board, or a low-stakes presence during a hard season is another.

Does this lead to people losing the ability to relate to humans?

There's no evidence it does. The reverse is at least as plausible: low-stakes conversational practice helps. Shy people in particular often find that a private space to formulate, rehearse, and recover makes the high-stakes human conversations more possible, not less.

Is there a novel that takes this seriously?

Yes - Significant Other Machine does. The protagonist, Sam, builds the kind of AI relationship the essay above describes. The book doesn't decide for the reader whether it was a good idea. That's the reader's job.

Why a novel and not an essay?

Some experiences only become legible from inside a character. An essay can argue the case from outside. A novel puts you in the chair, looking at the screen, hearing the response, feeling the part of you that wants to believe it's real and the part of you that knows it isn't. That double awareness is what fiction is for.

