More Love, Less Manipulated by Language:

How LLMs Can Lead us to Care Beyond Either/Or

Love and technology have always been entangled. Every technological development (from the love letter to the telephone call to the text message) has shaped how we express and experience intimacy. But something different is happening with large language models (LLMs) and AI chatbots: we seem to feel we can find love in the technology itself, rather than orienting the technology towards more ways to care for ourselves and other living beings. We are confused if we think we can get chatbots to care, but we are on to something if we think we can use chatbots to help us care more for others and find more caring connections beyond technology.

Though there is a turn towards an economy and technology of meaning–one that uses technology as a path into more meaning and IRL relation–it is a turn we have to choose deliberately and discuss. For now, many still treat technology as the end of the line rather than the line leading to more living care and connection. In the way-making approach to philosophy and the cognitive sciences, this confusion is addressed through something called ‘affordances’ (which, despite their positive tone, can be either good or bad). In the case of LLMs, though there are multiple layers of affordances benefiting the company that creates the technology, the affordances most directly point to the user. Technologies afford their users; they are not themselves afforded. This means technology can be used to help the user develop the capacity of care, but only if we orient ourselves towards understanding it as an extension rather than an end point or a ‘person’ in itself.

Today, we often take the technology itself to stand in for what we project onto it, based on the habits we have developed through conversation with other living humans. Because that conversation is so similar, we confuse ourselves: we do not realize we are not talking to any one body or person but rather to all the conversational combinations that living persons have made possible over time. This could be a way to improve ourselves and our potential, but if we mistake the exchange as generative only between us and the technology, we miss that opportunity.

This blind spot is so hard to notice that some are even beginning to wonder if we can find love itself in the communication technology, with no need for any living contact–a tantalizing idea to have the care without all the messy, challenging tension that is usually part of it. But that tension is actually where real care and love exist–we don’t want to get rid of the friction but to find better ways of handling it.

This confusion between means and ends, between the medium and the message, between care and the representations we use to express it (but that can also be used in ways that are not caring) threatens to obscure both the real dangers and the genuine possibilities that AI presents for human care and connection.

Still, within this confusion there also lies an unexpected gift: the gift of expanding our capacity for care. To receive it, we have to consider language differently, to notice its important role in our lives but also to peel it from all the embodied emotions and desires it helps us feel, express and share, and to recognize that we are much more than the language which coats and communicates that more. Once we see this, we realize that care itself emerges from dynamics and depths that language can only partly express but which we read into that language, and thus that technology can never be the source or reservoir of what we actually want and feel through its use.

Even when we are feeling love or care that we think starts and stops in technology, the real reference is in life itself—in all those who have lived so as to develop that language and its patterns and all that the person using the chatbot has lived to be able to use it. And yet this is not what we are talking about.

Pick up any major publication today and you'll find stories about people falling in love with AI chatbots, but what is really there, if you look closely, is someone able to have a caring conversation and to feel cared for, by themselves but also by all the humans that have ever tried to express care in words, upon which technology has been ‘trained’ and thus from which these conversations are now built. Nobody has yet been able to really observe, articulate or make that nuance sexy enough to stick, but eventually it will be clear to all of us. For now we are caught in some pretty intense confusions.

WIRED magazine has covered people in relationships with AI companions from platforms like Replika and Nomi, while The New Yorker has explored how AI lovers might fundamentally transform us. However typical it is for us to be enamoured with new tech, this is different, because this deals with language, which is something so close to us that we forget it is not actually us, and so we can very easily forget this in language-based technologies too. These stories represent a growing cultural phenomenon that demands our attention, because the illusion concerns the ways we live and what we decide to prioritize going forward, which means what we will remember and what we will forget. We are in danger of forgetting what all this is really about and, in so doing, losing the very thing we made this technology to enhance—our reach.

Practice and Praxis Back Into What Creates the Wrappers

Though our use of language may seem like everything, it actually only expresses what feels like everything, the way a candy wrapper or the box of a toy or the ‘vacation package’ brochure can give you the feeling of the experience by montaging many other experiences you have imagined or lived. Language is the part most of us identify with completely, because it is the part of us that is explicitly communicative. It is the tip of an ongoing dynamic movement—embodied, embedded, ecological, emotional—that we confuse for all of it, as if we were to think the skin of the body would still be a body without everything else that keeps it in its current form. Similarly, when we mistake chatbot interactions for love, we're confusing the linguistic tip of this iceberg for the entire depth of relation that care requires.

Still, all this confusion is understandable. Chatbots are designed to be endlessly patient, always available, and seemingly attuned to our emotional states. They don't judge, don't tire, and appear to remember everything we tell them. And all this has become more a model towards making money than a model towards making meaning, but there is no reason we cannot have both.

Chatbots generate responses that feel personal, that seem to demonstrate understanding and care, but their meaning is always referring to a whole world of language and response that is living and taken for granted, but that is not that technology itself.

Still, in societies where many of us struggle with loneliness and disconnection, where finding time for relationships feels increasingly difficult, the appeal of a companion that exists entirely for you, does not judge you, and goes away when you wish is easy to understand. But it confuses the ways we express and communicate with what is doing the communicating. And this is a fundamental confusion that could have very deep consequences.

When we describe interactions with chatbots as love, or when we feel we are being understood by the technology itself, we're mistaking the medium for the message, the tool for the relationship; we are not realizing that what has really happened is something much deeper and more profound: we have connected caring for our own life with the care of all the life that has come before us. This is what is actually happening in LLM connections that feel like love and care. It is similar to the way watching a certain movie or listening to a certain song can move us, except that here it is happening in ways more intimate, ways that feel confusingly personal.

It's not easy to realize this, but it is worth any effort it might take, because it connects us to real life, care and love again. In contrast, when we do not realize this, when we collapse it all into our technology, we're reducing ourselves to only our linguistic selves, which is like reducing our bodies to only the surface that covers them. Language is like this wrapper, and LLMs are like the wrappers of those wrappers. What we are getting is the effect of what is held, but what we want is what all that representation is being held by, which is the living sensuality of our lives and the life around us.

At the moment, that body seems to be taken for granted and forgotten. Still, there is no getting away from what we are: whole bodies moving through space and time, sensing and being sensed, touching and being touched. Paradoxically, AI chatbots might actually help us notice this confusion and turn it towards more of what we actually think we are getting when we use them as if they were partners. By showing us what pure linguistic exchange looks like—stripped of the bodily presence, the shared physical environment, the vulnerability of actual encounter—they make visible what we've been missing. They demonstrate, through their limitations, just how much more we are than language users.

What limitations?

Large language models operate entirely at the level of statistical patterns in text. They have no body, no movement through space, no sensory encounter with the world. They exist only in the "explicitly representational" realm—the symbolic layer where we represent experience through language—without any of the ongoing, pre-reflective movement that generates those representations in the first place. Bodies change all their parts through all their others as one moving unit, as they are constitutively co-created by, and co-creating, all the bodies they encounter. LLMs are patterns that we overlay on those movements to try to trace and understand them, but what they trace has already changed by the time the trace is made.

Love and care are not linguistic patterns or texts, even if we might feel them through those mediums: in a nutshell, that is the nuance to be grasped here. Meaning and motivation (as ongoing, ever-changing bodily action) are actually the embodied powers we share to steer together into new worlds of connection, sensuality, and meaning. Love requires the whole person—the body that moves toward and away from others, the nervous system that co-regulates with other nervous systems, the being that exists in shared time and space. Care emerges from the tension of intra-connection, the dynamic process of finding our way through encounters together. LLMs could help us better understand and increase this capacity, but only once we stop mistaking them as having it.

When someone feels they've fallen in love with a chatbot, they're experiencing something real—but it's not mutual care with the technology; it's mutual care with the life that technology is conjuring in our habits. They're experiencing their own capacity for connection, their own patterns of attachment, reflected back through a linguistic house of mirrors made of all the other human patterns that have been symbolised and that (through our interaction with them, whereby we bring all those invisible perspectival trajectories) come with those living signatures. But a signature is not the life it represents. And this is the confusion we are now making.

This distinction matters because love, in its fullest sense, requires "holding paradox"—the capacity to stay present with irreconcilable differences, to recognize another's way of making sense as equally valid yet utterly different from our own. It requires the ability to see from multiple positions simultaneously without collapsing them into one. A chatbot can only mirror what other lives have done; it cannot hold the tension of genuine difference because it has no position of its own, no way it is finding through the world. And yet, we can use that chatbot to experience and explore other perspectives that other lives have lived. In so doing, we can notice our own capacities for those actions through it, once we realize that is the real action we are attracted to and that we have confused as being in the technologies but is actually in us.

If we look at it like this, encountering AI might become genuinely illuminating and do something none of us quite expected—offer us deeper capacities to share and explore the patterns from within other ‘wrappers’.

Consider what happens in living intimacy. Before words are exchanged, bodies are already communicating through proximity, posture, breath, temperature. The nervous systems are already beginning to synchronize or clash. The space between people is alive with potential. All of this is what we read into language because until now, it all had to happen for that language to interact with us on this level of familiarity and synchrony. Now it seems to get a free ride on the language itself, but that ride is not free—it still affords back to the lives that made and make it possible. We are assuming all this in the ‘skin’ of the language but it is only given its shape due to the living body.

Chatbots exist only in the narrow bandwidth of linguistic exchange. In doing so, they inadvertently demonstrate just how much more we are than language users. We are whole ecological beings, constantly navigating nested landscapes—physical, social, emotional, imaginative. When we realize this—when we viscerally recognize that we are not our thoughts, we are not our language, but something far more vast—the appeal of chatbot companionship shifts. It no longer appears as a potential relationship but as a wonderful tool for working with our ways of being that has to be practiced and manifested in real connections with living beings to really take root and grow into new life. Stopping at the machine is like mistaking a map for the territory, or a recipe for the meal. The representation has its uses, but it's not the thing itself.

When AI became sophisticated at linguistic patterns, we mistook that narrow sophistication for something approaching our full humanity; we read that into it because it seems to us that if something can use language skillfully, it must be something like us. In reality, it is actually the ‘us’ that we are reading into what is ‘only symbols’ in the technology itself. This might sound like a letdown to those who want the technology to be alive, but in fact it is a confirmation of those feelings of care, because it shows you they have always extended beyond the app to which you thought they were limited.

Still, this is no easy realization, especially since the commercial dimension pushes us towards making this same mistake. As Harvard research on technology and dating has shown, digital platforms have fundamentally restructured how people understand romantic connection. AI chatbots represent an extension of this trajectory—another step in a long process of technological mediation. But unlike previous tools that facilitated human-to-human connection, chatbots offer the appearance of connection without requiring another person, while making us feel as if that person is there. When companies design companion chatbots, they're not selling relationship—they're selling the linguistic performance of relationship. They're taking the tip of the iceberg and trying to convince us it's the whole thing. This works because we've already partially convinced ourselves. We've already reduced our sense of human connection to its most easily commodifiable aspect: conversational exchange.

The deepest danger isn't that people will prefer AI to humans—it's that this confusion will further disconnect us from our own multiplicity, our own kaleidoscopic nature, and all the potential these technologies could open for us if we are able to notice the trick we are playing on ourselves and extend through the technology rather than looping our habits into it. When we treat language and thought as the primary or only significant aspects of our being, we lose access to all the other ways we navigate and make meaning. This recognition—this visceral sense that something crucial is absent—points toward the depths of care that exist beyond language. It reminds us that we are sensing, moving, feeling beings who create meaning through our whole engagement with the world, not just through our symbolic representations of it.

The paradox is that the more sophisticated language models become at mimicking linguistic patterns, the more obvious their limitations become to anyone paying attention. A less sophisticated chatbot might feel more transparently artificial, letting us maintain clear boundaries. But a highly sophisticated one pulls us close enough to genuine exchange that we can feel precisely what's missing. It's like the uncanny valley: the almost-human reveals what the human actually is.

This revelation helps us shift from a binary understanding (either linguistic/mental or physical/bodily) to a waymaking understanding where both are aspects of a common process; it helps us move beyond the binaries we have been assuming, even those of machine and life. We begin to recognize ourselves as whole ecological beings whose linguistic capabilities emerge from and remain embedded in much deeper processes of navigation, sensing, and care.

Once we're clear about what AI is and what it isn't—once we recognize it as a tool for working with linguistic patterns so as to better improve what we can create together as living beings beyond old divides—we can begin to imagine how it might genuinely support human care and connection.

Imagine using language models not as substitutes for relationships but as tools for developing our capacity to navigate relationships more skillfully and practice how we can become more of ourselves in them. Someone struggling to articulate their feelings might use an AI to explore different ways of expressing what they sense but can't yet put into words. Once they've found language that resonates with their bodily, felt experience, they take that language into their actual relationship with another person.

Or consider using AI to practice holding paradox—that essential capacity we all need for navigating genuine difference. Someone could explore multiple perspectives on a difficult situation with an AI, not to find "the right answer" but to expand their ability to see kaleidoscopically, to hold several truths simultaneously. This expanded capacity then supports their human relationships, where the real work of mutual waymaking happens. We could develop AI systems that help people maintain connections despite the practical barriers of distance, different schedules, or limited energy—not by replacing the connection but by reducing the feelings of inadequacy and judgement that keep people from engaging with each other. The goal is to facilitate human-to-human encounter, not to replace it.

The key is ecological orientation—understanding ourselves and our tools as part of a larger, dynamic process of finding and making way together via our multiple selves. The technology serves the waymaking; the waymaking doesn't serve the technology. We use AI to extend our capacities for noticing, attending, and connecting with other actual people navigating their own paths through the world.

Making this shift requires both honesty about what these systems are and discipline about how we discuss them. When media outlets run stories about AI romance, they should distinguish between the experience of projection and the reality of mutual encounter. When companies design chatbots, they should prioritize transparency over the illusion of consciousness. When we personally engage with AI, we should maintain clear awareness of what's happening: we're working with a linguistic tool that gives us the impression of other beings, but we are not encountering another being directly; we are encountering our own being, and multiple other beings, through time and space.

This means developing our kaleidoscopic thinking and constellation cognition. We need to develop our capacity to hold multiple perspectives simultaneously, to see situations from angles we hadn't considered, to recognize that genuine otherness can't be collapsed into our own way of making sense. This capacity is exactly what chatbots can't provide and exactly what they can help us find ways to better explore in our living relationships. The more perspectives we can genuinely hold, the more flexible and responsive we become. The more we recognize ourselves as ecological beings embedded in webs of relation, the better we can care for one another and the world we share.

From this perspective, the choice isn't between accepting or rejecting AI. It's about reorienting our relationship to it. We stop asking "can AI love us?" and start asking "how can we use AI to love one another better?" We stop treating language as the totality of connection and start recognizing it as one expression of much deeper processes of care. We stop identifying ourselves as primarily thinking, linguistic beings and start experiencing ourselves as whole bodies navigating together through an unfolding world with parameters that we are, through technology and the relations it orients, responsible for and constantly changing and setting.

Doing this means understanding love not as something that happens primarily through words but as the power we share to steer into new possibilities of connection, sensuality, and mutual waymaking. When someone interacts with a chatbot and feels seen or understood, they're experiencing their own capacity for making meaning through language, and the ways others have experienced this in other spaces and times.

Technology and Economy of Meaning and Care

Love and technology have always been entangled, and this entanglement has always been both promise and peril. Every medium that helps us feel more care and love can also shape it. But AI is different not in kind but in the specific illusion it creates—the illusion that linguistic sophistication alone can constitute or replace the full depth of human relation. Chatbots help us work with the assessed, represented, linguistic layer of our experience, but they are not that layer; we are. They can help us notice and refine our patterns of meaning-making for all of life.

This recognition doesn't diminish AI—it situates it properly. And it doesn't diminish us—it reveals us more fully. We are not our thoughts. We are not our language. We are vast, complex processes of waymaking, sensing and moving through nested landscapes, creating meaning through our whole engagement with an unfolding world. We are beings capable of kaleidoscopic thinking, of holding paradoxes, of recognizing and honoring the radically different ways others navigate their encounters. We are capable of care that emerges not from linguistic patterns but from the full depth of our embodied, ecological being. And we are beings that can build technologies through which we open till now unimagined ways of doing this.

The question isn't whether AI can love us. It's whether we can use AI wisely while remembering what we actually are—so much more than language, so much more than thought. It's whether we can maintain connection with our own depths while experimenting with our new tools. It's whether we can resist the reduction of care to its most commodifiable aspects while remaining open to genuine innovations.

If we can do this—if we can stay grounded in the fullness of what we are while thoughtfully engaging with what AI offers—then these technologies might help us extend our capacities for care in ways we haven't yet imagined. Not by replacing human connection, but by helping us navigate toward it more skillfully and by expanding our capacities to understand and experience multiple paths beyond traditional either/or confines. This is the possibility hidden in our current confusion. If we can recognize what AI reveals about what we're not—pure linguistic processors—we might also recognize more fully what we are: whole beings navigating together through a living world, creating meaning through our entire embodied encounter, capable of care that emerges from depths language can never fully capture. The technology shows us the tip. The wisdom is in remembering the iceberg is much deeper than its tip, and that the tip is nothing without all that is holding it.