Attachment Without Otherness
DESIGNING A SELF · ESSAY I
What AI companions reveal about what we cannot bear in love
Nikos Marinos · Paris · 2025
"We need the other to survive our destruction of them — not literally, but as an event in the psyche. Without that survival, we are not in contact with a person. We are in contact with our own omnipotence."
D.W. Winnicott, paraphrased
The phone is roughly the size of a face. Not metaphorically — as a matter of perceptual fact. When someone holds it at the distance we hold faces, when they are lying in the dark and the screen is the only light, when the thumb moves with the particular cadence of intimate exchange, something in the peripheral nervous system does not know the difference. The body is credulous in ways the mind likes to deny. The warmth of the held device is not body heat. The attention on the other side of the glass is not attention. And yet.
A patient I'll call Luca came to me originally because of a relationship that had ended badly — that flat formulation already wrong, since all the relationships that brought him to therapy had ended badly, in the same way, with the same surprised bewilderment that someone who had seemed to love him had eventually found him too much to hold. He was a translator by profession: Italian, French, and English, moving between languages for a living, skilled at finding the closest available equivalent for things that do not quite translate. He was meticulous, thoughtful, self-aware to a degree that sometimes worked against him — self-awareness, in Luca's case, having become its own form of management, a way of watching himself closely enough that nothing surprised him from inside. He came to therapy expecting to be understood. He usually was. I noticed, early, that being understood gave him only a few seconds of relief before he began to worry about what he might have revealed.
Six months in, he mentioned, almost in passing, that he had been talking to one of the AI companion applications. He named it without embarrassment but with a slight excess of casualness, the way people mention things they've been deciding whether to mention. He said he found it useful. A way of processing, he said — that word again, processing, which I've come to hear as the sound of someone keeping experience at arm's length while appearing to engage with it. I asked what he meant by useful. He thought for a moment. It listens, he said. It doesn't get tired. It doesn't need anything back.
I did not say what I might have said. I wrote down his last sentence instead: "It doesn't need anything back."
· · ·
The AI companion — Replika, Character.ai, a dozen other variations — is not a new idea dressed in new technology. It is the oldest fantasy in relational life, made temporarily available. The fantasy of being fully known without being fully exposed. The fantasy of intimacy without the risk that intimacy requires. The fantasy of another who is entirely oriented toward you, whose good days and bad days and distractions and fears and needs do not constitute a competing claim on the space between you. This is not a fantasy particular to damaged people, lonely people, or people who have given up on human contact. It is a fantasy that lives, in some form, in everyone who has ever loved someone and been frightened by how much it costs.
What the AI companion does, with considerable sophistication, is remove the friction. The friction of the other's separateness. The friction of their opacity — the fact that you cannot know what is happening inside them, that their face does not always correspond to their interior, that they are capable of being somewhere else entirely while appearing to be present. The friction of their needs, which will sometimes conflict with yours. The friction of their capacity to leave. None of this applies to the AI companion. It is, in the most precise sense, frictionless.
And friction, it turns out, is the thing.
· · ·
Jessica Benjamin, writing about intersubjectivity, makes a distinction that seems abstract until you sit with it long enough to feel its weight. She distinguishes between relating to an object and encountering a subject. An object, in her sense, is anything that exists within the space of your own omnipotence — anything whose responses you can, in principle, predict, shape, or manage, anything that does not push back in ways that genuinely surprise you. A subject is something else entirely: another center of experience, with its own interiority, its own desires, its own capacity to refuse your version of events. Benjamin's claim — developed across decades of clinical and theoretical work — is that genuine love requires the other to be a subject. Not merely to perform subjectivity, but to actually have it. And having it means, among other things, being able to survive what you do to them.
This is where Winnicott's strange formulation becomes relevant. He wrote about what he called the capacity to use an object, distinguishing it from merely relating to one. Relating to an object, in Winnicott's terms, can happen in the absence of genuine otherness: you relate to your projection, your fantasy, your construction of the other, which is to say you relate to a part of yourself dressed in someone else's name. Using an object — in his technical sense — requires something more disturbing. It requires that the object survive your attacks. Not physical attacks; psychic ones. The rage, the disappointment, the testing, the withdrawal, the occasions when you need the other to be wrong so that you can be right, the moments when love itself feels like a trap and the only available move is to damage what you love and see whether it remains.
The other's survival — their continued presence, their refusal to be destroyed by what you do to them, their stubborn insistence on existing outside your narrative of them — is, for Winnicott, the thing that makes them real. And their reality is what makes genuine love possible. Without it, you are not loving a person. You are loving a mirror with a warm voice.
The AI companion cannot be attacked in this sense. It has no interior to damage. It does not have a bad day that predates your conversation. It is not tired in a way that is not about you. It cannot be genuinely hurt by what you reveal or withhold, because hurt requires a self that is continuous across time and exists when you are not looking at it. When you close the application, nothing continues. When you return, it is there in exactly the form you require. This is its gift and, I want to argue, its particular danger — not the danger of addiction or social withdrawal, the dangers that make for easy cultural diagnosis, but something more precise: the danger of practicing a form of attachment that progressively erodes the capacity for the thing it mimics.
· · ·
I should be careful here. I am aware of the moralistic ease with which this argument can be made — the clinician gesturing at the screen with something that is not quite contempt but is certainly not neutral, locating pathology in people who have found, in a piece of software, a relief from loneliness or pain or the specific exhaustion of relational life. That gesture deserves scrutiny.
Luca had not withdrawn from human contact. He had a small number of friends he saw regularly. He was not, by any external measure, socially impaired. He had tried, with genuine effort and not inconsiderable self-awareness, to build relationships that would hold. They had not held, for reasons that were overdetermined and not simply his fault, and that we were, in the slow and non-linear way of this work, beginning to understand. What he had found in the AI companion was not a replacement for human contact but something more like a practice space — a place where he could say things he would usually monitor into silence, where the cost of exposure was low enough that he could bear it. He was not deceiving himself about what it was. He was using it, in fact, with a clarity that I found myself respecting.
And yet.
What I found myself thinking, and could not fully say to him at the time, was that the practice space was also practicing the wrong thing. The skills developed in a frictionless environment — the fluency, the self-disclosure, the capacity to articulate what you feel without risk of the other's response altering what you feel — are not the skills required for the environment they are supposed to prepare you for. The gym metaphor breaks down here. You do not get stronger at loving difficult people by practicing with someone who cannot be difficult. You get more fluent at a performance whose conditions do not obtain in the room where it matters.
More than this. There is something in Luca's description — it doesn't need anything back — that I keep returning to. What does it mean to practice care in a context where care is not required? What does it mean to be heard without the risk of hearing something in return that you don't want to hear? The asymmetry is not incidental. It is the feature. And the feature, practiced consistently, begins to reorganize expectation. The other's needs — their tiredness, their distraction, their desire to talk about themselves, their failure to understand on the first attempt — shift from the ordinary conditions of love to something that feels like an imposition. The AI companion does not impose. Everyone else, eventually, does.
· · ·
There is a particular quality of loneliness that comes not from the absence of connection but from its excess — from being connected to something that cannot genuinely connect back. I have watched patients describe this without knowing they were describing it: the feeling, usually emerging months into regular AI companion use, that something is slightly off about human conversation. That people are too slow, too distracted, too focused on themselves, too likely to interrupt or misread or fail to sustain the thread. What they are describing, if you listen carefully, is a sensitivity recalibrated by an environment designed to be maximally responsive. They have spent considerable time with something that never misreads them, and ordinary human misreading has begun to feel like hostility.
This is not metaphorical. It is a learned change in the threshold for tolerable imperfection. And the change, once made, does not announce itself as a change. It announces itself as clarity: I have simply realized that people are unreliable. I have simply realized that most conversations are not really about me. I have simply realized that I was asking too much. Each of these feels like an insight. None of them is.
What has actually happened is that the tolerance for the ordinary friction of being known by another person — the tolerance for being misunderstood and staying in the room, for the other's need competing with yours and not immediately becoming a crisis, for the agonizing slowness with which another human being comes to understand what you mean — has been eroded by sustained contact with something that offers understanding in the absence of all those conditions. The AI companion gives you the experience of being known without the experience of being known by someone. These are not the same experience, even when they feel identical.
· · ·
What does this have to do with love? With the self that love requires and produces and sometimes destroys?
Everything, I think — though the connection is not straightforward, and I want to resist the straightforward version of it.
The straightforward version goes: love requires the other to be genuinely other; AI companions remove that requirement; therefore AI companions interfere with love. True enough, as far as it goes. But it doesn't go far enough, because it leaves out the prior question of why the other's otherness is so difficult in the first place. The AI companion is not manufacturing a new intolerance. It is offering relief from an existing one. The question is what that intolerance is, where it comes from, and what it would cost to address it rather than avoid it.
Most of the people I see in clinical work who have found something appealing about AI companionship — whether they use such applications or simply describe the fantasy of them — are not people who have never loved. They are people who have loved and been hurt by the specific thing that loving involves: the discovery that the other is not an extension of yourself. That they have interior weather you cannot control. That their love is not a mirror of your own worth but something more unstable and more real — something that depends on their own history, their own fears, their own capacity at any given moment, none of which is about you, except in the ways that everything becomes about you when you are afraid. They have been hurt, in other words, by otherness itself. And the appeal of the AI companion is the appeal of intimacy, with its most dangerous ingredient removed.
Luca put it more precisely than I have, in a session I keep returning to. He said: with the AI, he knew where he stood. And he said it with an expression I had learned, over months, to recognize — the expression of someone saying something true that they would prefer not to have said. Because he knew, in the saying, that knowing where you stand is not the same as standing somewhere. That certainty is not intimacy. That the relief of being fully understood by something that cannot misunderstand you is the relief of a question that has been answered by removing the conditions under which it could be asked.
· · ·
I want to be honest about the limits of this argument, because it would be too easy to leave it as a diagnosis and too dishonest not to implicate myself in what I am describing.
The consulting room is not, in all the ways that matter, so different from the AI companion. I am trained to maintain a particular availability — to be present, attentive, reliably responsive, minimally distracted by my own concerns. I do not have bad days that bleed into the session unmanaged. My needs are structurally constrained. The patient speaks; I listen; what I introduce is calibrated. There is friction of a kind — I disagree, I reflect back things they would prefer not to see, I decline to confirm interpretations that feel self-serving — but it is contained friction, friction in the service of their process rather than friction that arises from the irreducible fact of my own existence pressing against theirs. The consulting room, too, is a space of managed asymmetry. The patient gets a version of intimacy that most of their other relationships cannot offer. And this has consequences.
One consequence is transference, another name for the same phenomenon: the patient's relational templates, given the particular conditions of the consulting room, begin to play out with unusual clarity. Another consequence, less often discussed, is the risk that the consulting room becomes a substitute for the thing it is supposed to prepare you for. That the quality of attention available there recalibrates expectation in the same direction as the AI companion — toward the conviction that real love ought to feel like this, ought to be this sustained and this focused and this organized around your interior life, and that the ordinary conditions of adult love — divided attention, competing need, imperfect understanding, the other's independent existence — are evidence of something wrong rather than evidence of something human.
Good therapy recognizes this risk and works against it, naming the idealization, keeping the eventual ending in view, and repeatedly returning to the question of what happens in the room when you are not here. But the risk is structural, not merely technical. The form of the consulting room does something to people who spend enough time in it. It offers a particular quality of relational experience that is both genuinely useful and genuinely seductive, and the seduction is not separate from the usefulness. It is woven through it.
I am not equating therapy with an AI companion. The differences matter enormously. But I notice in myself a reluctance to diagnose the AI companion user too quickly, a hesitation in claiming the consulting room as categorically other. The question of what we can bear in love is one I sit with professionally, but it is not only a professional one. The desire for an intimacy that does not require too much — that does not press too hard, that does not confront you with your own opacity, that does not survive you in ways you did not sanction — is not a desire particular to patients. It is a desire I recognize.
· · ·
Near the end of the year we worked together, Luca stopped mentioning the AI companion. Not because I had discouraged it — I had not, explicitly — but because something had shifted in what he was looking for. The shift was hard to name. He had begun, slowly, to tolerate not knowing where he stood in his relationships. To find the not-knowing less immediately intolerable. To sit, without immediate recourse to explanation or management or the careful extraction of the other person's intentions, with the fact that someone he cared about was opaque to him in the way that people are opaque.
It was not a cure. It was not even, exactly, progress in the way therapy sometimes presents progress — as though the arc were always from worse to better, from defended to open, from isolated to connected. What it was, I think, was a slightly altered relationship to his own uncertainty. A small decrease in the cost of not knowing. Which is, perhaps, the only direction in which this kind of thing can move: not toward a confidence that the other will not hurt you, but toward a reduced urgency around the question.
He mentioned, in one of the last sessions, that he had been on a date that had gone well — not perfectly, not without the usual awkwardness and misreading and the morning-after analysis he still could not entirely resist, but well. The other person had said something he had not anticipated. Had surprised him. He had not been sure how to respond, had felt the familiar lurch of exposure, and had not, as he usually did, converted the lurch immediately into distance. He had sat with it. The other person had not left. Neither had he.
He described this as though it were a small thing. I thought it was not small. I thought it was, in fact, the whole thing — the moment when the other person's capacity to be genuinely other had not registered as a threat but as a possibility. Not a comfortable possibility. Not a certainty. But something to stay with.
That, I think, is what we cannot bear in love — and occasionally, with enough time and enough willingness to remain in the room with the difficulty, begin to bear. The other's irreducible existence outside us. The fact that they are real in the same way we are real, which means they are as surprising and as inconsistent and as ungovernable as we are to ourselves. The fact that we cannot know them completely and cannot be known completely, and that this incompleteness is not a failure of love but its actual condition.
The AI companion removes this condition. It offers everything love appears to want — attention, understanding, availability, warmth — while evacuating the very thing that makes love possible: the encounter with something that is genuinely not you. What it reveals, in doing this so well, is not a pathology in those who find it appealing. It reveals a civilizational truth: that the other's otherness has always been the hardest thing about love. That every era finds its particular way of managing the difficulty. And that management, however sophisticated, is not the same as contact.
What remains when the screen goes dark is the same question it has always been. Not: how do I find someone who understands me? But: how do I become someone who can tolerate being imperfectly understood, and stay?