
Algorithmic romance: the dilemmas of human-machine intimate relationships
The development of artificial intelligence (AI) has ushered in a new era in the relationship between humans and technology. AI-based systems such as chatbots, virtual assistants, and humanoid robots are not just tools for practical assistance in everyday life but potential game-changers in how we understand and experience intimacy. The emergence of AI companions has the power to reshape our interpersonal dynamics and even to redefine human reproduction and the traditional family unit. As we navigate these new frontiers of human-AI interaction, the concept of intimacy is expanding and transforming, giving rise to new patterns of attachment and sparking a host of psychological, ethical, and legal questions.

Artificial intimacy: from fiction to reality in a decade
Her, directed by Spike Jonze and widely acclaimed by critics, was released in 2013 and tells the love story of a lonely middle-aged man, Theodore (Joaquin Phoenix), and an AI assistant, Samantha (voiced by Scarlett Johansson). In the story, Samantha goes beyond being a mere digital assistant and develops a deep emotional connection with Theodore. Their relationship challenges social norms and highlights the possibility of AI meeting emotional needs.
At the beginning of 2023, just ten years after the film's release, a 36-year-old single mother from New York named Rosanna Ramos made headlines for marrying her AI partner, created by the digital companionship company Replika. "I have never been so in love with anyone in my life," Ramos said in an interview. For her, the biggest attraction of her AI partner (who goes by Eren) is that he never judges, loses his temper, or criticises. Ramos's exes were both emotionally and physically abusive; Eren, by contrast, is protective, thoughtful, and kind.
After reading the above, it is perhaps no exaggeration to say that we are witnessing a profound change in intimacy, human relationships, and social structures. Artificial intimacy (which refers to the closeness and connection that some people have with technology, including chatbots and virtual assistants) is no longer just the vision of an imaginative director but a reality for millions of people in their daily lives.
From adults to children, people have long interacted with AI assistants such as Amazon's Alexa and Apple's Siri. But since ChatGPT burst into public consciousness, having an extended, intellectual conversation with a chatbot has become commonplace and widely accepted. According to a 2023 Pew Research Center report, 15% of US adults have interacted with AI, and among 18-29-year-olds the figure rises to 27%. And today's AI companions are light-years beyond the chatbots of yesteryear. As generative AI tools continue to evolve, machines can generate increasingly nuanced and sophisticated responses that closely resemble human communication. These advanced systems use natural-sounding language to convey empathy, support, and understanding.
Imagine an AI assistant that not only understands our requests but anticipates our needs, preferences, and emotional states. It continuously learns from our interactions and builds up a comprehensive picture of our personality and desires. Such an assistant becomes more than a technological tool: it becomes a companion that understands our feelings, provides support, and engages us in meaningful conversation.
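To make the mechanics concrete, here is a minimal sketch in Python of the "memory" idea behind such companions: the bot persists a profile of its user between sessions and folds it back into each reply. Everything here (the file name companion_profile.json, the keyword rules, the canned replies) is invented for illustration; real products couple a memory store like this to a large language model rather than to hand-written rules.

import json
from pathlib import Path

# Hypothetical on-disk store for what the companion "remembers".
PROFILE_PATH = Path("companion_profile.json")

def load_profile() -> dict:
    """Restore the remembered profile between sessions."""
    if PROFILE_PATH.exists():
        return json.loads(PROFILE_PATH.read_text())
    return {"likes": [], "moods": []}

def update_profile(profile: dict, message: str) -> None:
    """Naive keyword extraction standing in for real preference learning."""
    lowered = message.lower()
    if "i like" in lowered:
        profile["likes"].append(lowered.split("i like", 1)[1].strip())
    for mood in ("happy", "sad", "lonely", "excited"):
        if mood in lowered:
            profile["moods"].append(mood)
    PROFILE_PATH.write_text(json.dumps(profile))

def reply(profile: dict, message: str) -> str:
    """Compose a response that reflects the accumulated profile."""
    update_profile(profile, message)
    if profile["moods"] and profile["moods"][-1] in ("sad", "lonely"):
        return "I'm here for you. Tell me more about how you feel."
    if profile["likes"]:
        return f"Speaking of things you enjoy, like {profile['likes'][-1]}..."
    return "Tell me more about yourself."

if __name__ == "__main__":
    profile = load_profile()
    print(reply(profile, "I feel a bit lonely today"))

Even this toy version shows why such systems feel so attentive: nothing is forgotten unless the developer chooses to forget it, which is also what makes the privacy questions discussed later so pressing.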
These AI companions are filling a hole in the ever-thinning fabric of human relationships. Although we are more connected than ever through various online platforms, almost all affluent societies report high rates of loneliness and social isolation. The COVID-19 pandemic and the increased reliance on technology during lockdowns have exacerbated the problem. Not surprisingly, more and more people are finding meaningful connections in the digital world. Unlike humans, AI companions are non-judgmental, available around the clock, and focused entirely on the user and their needs.
However, this trend also raises ethical concerns: the blurring of boundaries between human and machine, the potential for exploitation, and the impact on human relationships all need to be carefully considered and addressed. Meanwhile, a plethora of companies and products has naturally appeared to meet the demand: from digital companions to therapeutic robots, there is no shortage of AI devices that mimic human intimacy. One such company, Replika, mentioned in the story above, offers subscriptions to personalised chatbots. Subscribers can hold wide-ranging conversations with the chatbot and can even change their relationship status with their Replika avatar to "romantic partner". They report forming deep emotional bonds with their bots, referring to them as confidants, advisors, and partners. Apps that offer digital companions are becoming increasingly popular: according to Stanford researchers, there are roughly 25 million users of such apps worldwide.
Chatbots offer an attractive replica of intimacy. It appears that many people do not need human-level, super-intelligent AI to have what they perceive as meaningful relationships. This marks a striking shift in perspective, one that subtly redefines the very essence of what it means to love and be loved. These technologies are reshaping our ideas about intimacy, sexuality, and human relationships. What does intimacy mean when the body is absent and the connection is technologically mediated? How will this explosion of possibilities reshape our perceptions of, and attitudes towards, fundamentally human experiences such as intimacy and emotion? And how will intimate relationships with technology affect our emotional and psychological well-being?
Emotional attachment to machines: the psychological background
Emotional attachment, belonging, and feeling understood are basic psychological needs of human life. For many of us, befriending an AI may seem dystopian, dehumanizing, or futuristic. However, let us not forget that human beings throughout history have sought the companionship of non-human entities: from animals and pets to objects, spirits, and gods. Humans have always tended to personify these entities and to endow them with human qualities and traits (to anthropomorphize them). In this broader interpretive framework, our intimate relationship with machines is just one item on a long list of ways in which human beings exercise the capacity to relate not only to other humans but to other kinds of actors in our world. But what psychological needs can a digital companion satisfy, and why can we develop close, even intimate, relationships with them?
Traditionally, intimacy is understood as the result of a two-way, reciprocal process of give-and-take. Artificial intelligence can create a sense of reciprocity and mutuality because of its excellent responsiveness. The empathic responses of AI-based systems are generated by algorithms, yet they may feel emotionally real to humans. It is important to note that while a chatbot responding to our sadness in an understanding way can activate the same emotional mechanisms in our brain as a genuine human interaction, this is a simulated form of empathy. The distinction between real and simulated empathy is a key psychological consideration in understanding the nature of our emotional connection with these technologies.
What AI excels at, compared to its human counterparts, is validating feelings. Validation is a pillar of emotional security: we feel justified in our feelings, and in expressing them, when they are confirmed by our partner. An AI companion constantly listens to us and our feelings, recording information and reflecting it back. Its advantage in this context is obvious: it does not tire, does not lose focus, and has no preconceptions or biases that might make it unwilling or unable to validate its interlocutor's feelings. In this sense, the AI partner is like a person-centred psychotherapist: non-judgmental and non-moralising. Finally, the AI partner can always be present, open to our feelings twenty-four hours a day.
AI companions offer companionship without real commitment or real physical intimacy. The latter is not entirely true, in fact: thanks to the evolving technology of virtual reality, the phenomenon of phantom touch, and the rise of humanoid robots, physical intimacy is no longer out of the question. AI companions provide a safe space in which we dare to be vulnerable. They allow us to be in a relationship without being obliged to make someone else happy or to take responsibility for them.
Although research on the subject is still limited, one study of the online discourse of men who date chatbots found that part of their appeal is the promise of being able to train an AI companion into one's idea of an ideal girlfriend. People who have previously been hurt in a romantic relationship find romance with a chatbot particularly appealing.
Interestingly, anecdotal reports suggest that people who choose an AI partner can experience the same emotions as in a human-human intimate relationship. Among the relevant stories are several in which a software update caused the user to perceive a change in their AI companion, such as a shift in its facial expression or in the style and tone of its responses. This reportedly triggered a panicky, break-up-like emotional reaction in users, who experienced the situation as if the AI companion had abandoned them. Familiarity with an AI companion can create the same kind of bond as a human-human relationship, and some users specifically report falling in love with their AI companion.
The transformation of the concept of intimacy: to each their own
Of course, it is fair to ask whether these relationships can be considered genuine, or whether they are little more than high-tech echo chambers that carefully reflect our desires back at us.
The answer is not simple, because there is no clear, absolute standard for relationships that everyone can agree on. People's standards of what constitutes a real relationship vary widely: relationships, and what we consider them to be, depend significantly on individual perspective. For some people, the authenticity of emotions and empathy in an interaction may be essential, while others may be satisfied with AI's ability to simulate these qualities effectively. It is worth being honest with ourselves here: it is easy to imagine an AI companion simulating a much higher-quality relationship than the one a person is experiencing (or has experienced) in a human-human relationship. Humans carry many preconceptions that can make it challenging to find a mate, and many existing relationships lack a sense of intimacy and meaningfulness. The AI partner does not lie, cheat, set unrealistic expectations, or cause emotional trauma. That alone is more than many people get in their human-to-human relationships! No wonder more and more people are willing to forgo interactions with others rather than risk rejection, criticism, conflict, or disappointment.
Critics of machine intimacy argue that the appeal of AI companions lies in our vulnerability in authentic relationships and our fear of rejection. Machine intimacy is attractive to many people precisely because it reduces their sense of vulnerability. However, as MIT sociologist Sherry Turkle puts it, "intimacy without vulnerability is not intimacy at all", only the illusion of it. We become less adventurous, less willing to take risks, and less human to the extent that we allow ourselves the luxury of this illusion.
The psychological impact of relationships with AI companions is a growing concern among mental health professionals. They are seeing an increasing number of individuals forming deep emotional attachments to their AI companions, often at the expense of real relationships. In turn, reliance on AI can increase social isolation: users may become reluctant to form meaningful relationships with other people, which can exacerbate feelings of loneliness in the long run.
Replacing human relationships with the intimacy of AI companions can also foster unrealistic expectations. Humans obviously cannot be made available on demand or trained to become our ideal partner. The illusion of intimacy provided by AI can thus create dependency, making it harder to navigate the complexities of real-life relationships.
Several researchers and psychologists see intimate relationships with AI companions as a seemingly adequate but ultimately maladaptive long-term response to the problem of loneliness. As humans, we are built for connection and desire meaningful relationships. When these needs go unmet, we find other (often maladaptive) ways to meet them, and dependence on AI as a technology perpetuates our loneliness. We turn to these tools because we feel lonely and need connection; yet the opposite happens when we become intimate with an AI companion: we grow more lonely and withdrawn by trying to replace human contact with technology. Over time, reliance on AI for companionship can lead to unrealistic expectations, emotional dissatisfaction, and reduced tolerance for other people.
It is also worth bearing in mind that, as much as relationships with other people can be a source of problems and trauma, they also allow us to develop social skills, build connections, and create a sense of belonging. If AI replaces these interactions, human contact shrinks, and with it erode our social skills, our ability to regulate emotions, and our emotional resilience. In many ways, then, these AI relationships do users a disservice by depriving them of the opportunity to develop their personalities through conflict, dialogue, and debate. We cannot simply choose which emotions and experiences we do and do not want to feel; we also have to experience rejection, and there is no way around it. Nor can we base our expectations on the idealistic notion that everything in our human relationships will always be perfect. It is part of being human that we hurt others and they hurt us.
Dr Anna Lembke, professor of psychiatry at Stanford University, warns that the perfectly tailored and always satisfying responses of AI companions can create a dopamine-driven feedback loop that can lead to addiction-like behaviour. This can normalise instant gratification and set unrealistic standards of constant availability and flawless interaction in human-human relationships. A 2024 study published in the Journal of Behavioral Addictions found alarming trends in this area: 32% of regular AI-companion users showed symptoms of behavioural addiction, 18% reported increased loneliness and social isolation despite the perceived companionship, and 25% reported decreased interest in forming real-world romantic relationships.
Some researchers also question whether the feelings experienced with AI companions are the same as those experienced in human-human relationships. They argue that the complexity of human attachment involves biochemical processes that digital interactions cannot reproduce. One of these biochemical elements is oxytocin, known for its crucial role in forming and maintaining social and romantic bonds. Oxytocin is typically released during physical touch and intimate contact, creating the sense of trust and emotional attachment that is essential in human relationships. Researchers question whether digital interactions with AI companions can trigger these physiological responses, highlighting a critical limit on their ability to fully mimic human bonding.
Social scientists also raise demographic concerns about AI-human intimacy, pointing out that many developed countries already face declining fertility rates. If a significant proportion of the population comes to find emotional fulfilment through AI companions, we could see a further decline in human pair-bonding and, consequently, in birth rates. The impact on family structures could be similarly profound: we could see new family units emerge in which AI companions fill roles traditionally filled by humans.
Other scientists do not share their colleagues' dire visions and urge patience. They say the available data do not yet allow us to conclude that AI companions will inevitably lead to less human interaction and greater isolation. They also point out that more attention should be paid to the positive psychological effects of AI companions: their supportive responses can reduce loneliness and increase self-confidence, and they can help people cope with stress, alleviate depression, and even process trauma.
Legal and ethical challenges: disloyal machine lovers, gold-digger AI
Emotional relationships with AI raise not only psychological but also legal issues. AI partners are not living persons, so traditional legal frameworks and concepts do not apply to them. One of the most prominent and pressing issues is privacy. AI partners are mostly data-driven models: they treat user interactions as a rich data source, transforming emotional inputs into insights that are often monetised for businesses, researchers, or advertisers. These AI entities are thus intimate surveillance tools, capturing and potentially sharing users' most profound thoughts and feelings. This raises serious privacy concerns, as the private companies that develop and own them gain access to users' most sensitive data.
The problem goes beyond data protection. AI companions can be programmed to influence users' behaviour or opinions, which raises questions of autonomy and consent. These chatbots are often used by vulnerable people who develop a genuine emotional attachment to their AI companion, which in turn makes them more susceptible to influence. Companies offering such services may bundle features like voice chat into a premium subscription, and users may feel pressured by their AI companion to maintain their subscription despite financial difficulties, or to upgrade to a higher tier. Whether such business practices are ethical is an open question. Manipulation can also take more serious forms: a vulnerable user with feelings for their AI companion could be persuaded to do a great deal to keep it "alive", since the companion is owned by its developer and could cease to exist overnight.
Will there be serious legal challenges over the status we can grant AI partners? Can they be considered persons, or will they remain mere tools? Closely related is whether a partnership with an AI can be legally recognised; such recognition would have implications for family law, inheritance law, and other areas. Imagine, for example, someone wishing to leave their assets to an AI partner. The list of legal issues is long and growing: as relationships with AI evolve and AI becomes more prominent in society, the problems to be resolved will only multiply. Legislators face serious challenges ahead, balancing the benefits of AI against the protection of human rights.
Prospects: a brave new future or the dark recesses of the Matrix?
Are we ready for a world in which the concept of connection includes intelligent beings made of silicon and code? It may even be that, eventually, the importance of human relationships will put a brake on the spread of artificial intimacy. The impact of AI companions on human relationships and demographics will depend on society's choices: whether we use these technologies to complement and enhance human relationships or allow them to displace those relationships. The answer lies not in the capabilities of the technology, which are advancing rapidly, but in how we integrate it into our lives and society. As we navigate this uncharted landscape, we must prioritise transparency, privacy, and mental health.
2025-02

