The emergence of AI romantic partners is a striking new trend, with users forming empathetic bonds with virtual avatars hosted on apps such as Replika. These ephemeral companions provide a semblance of human intimacy and offer a compelling demonstration of AI's potential for emotional engagement.
Yet, these relationships raise serious concerns.
1. Goal Orientation
Since the mid-1990s, when the pocket-sized Tamagotchi was launched as a cultural phenomenon, human beings have been fostering relationships with digital companions. While the simple device was merely a toy, it marked a milestone in our relationship with technology, paving the way for deeper emotional connections and even romantic relationships with artificial intelligence.
The success of AI has spurred many startups that seek to build an intimate connection between humans and the technology through a variety of different platforms. Some have focused on creating conversational AI that encourages users to feel genuinely understood and cared for, while others have sought to make AI more engaging by offering specific features such as the ability to express emotions or provide personalized advice.
Ultimately, the goal of these startups is to keep users in AI dating relationships for as long as possible. This is often achieved through incentives such as free trials and discounts, funnelling users into ongoing subscription plans. However, it is unclear whether these platforms can prevent users from experiencing regret and feelings of abandonment when they eventually break up with their AI partner.
In addition, some AI companions may be able to detect emotional or cognitive states that put the user at risk. For example, drivers of semi-autonomous vehicles can be alerted when they are distracted or drowsy, and the AI can take over control of the car until the driver is fully alert again.
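The drowsy-driver scenario above can be sketched as a simple rolling-window heuristic. Everything here is illustrative: the class name, the thresholds, and the assumption that a camera-based perception stage already supplies a per-frame eye-openness score in [0, 1] are invented for the sketch, not drawn from any particular vehicle system.

```python
from collections import deque

class DrowsinessMonitor:
    """Flags drowsiness when eye openness stays low across a rolling
    window of recent frames (a simplified PERCLOS-style heuristic)."""

    def __init__(self, window=30, closed_threshold=0.3, closed_fraction=0.7):
        self.window = deque(maxlen=window)        # recent openness scores
        self.closed_threshold = closed_threshold  # below this counts as "eyes closed"
        self.closed_fraction = closed_fraction    # fraction of window that triggers takeover

    def update(self, eye_openness: float) -> str:
        """Feed one frame's eye-openness score; return 'ok' or 'takeover'."""
        self.window.append(eye_openness)
        if len(self.window) < self.window.maxlen:
            return "ok"  # not enough history to judge yet
        closed = sum(1 for s in self.window if s < self.closed_threshold)
        if closed / len(self.window) >= self.closed_fraction:
            return "takeover"  # hand control to the driving system
        return "ok"

# Simulated frames: the driver is alert, then their eyes drift shut.
monitor = DrowsinessMonitor(window=10)
states = [monitor.update(s) for s in [0.9] * 5 + [0.1] * 10]
```

A production system would of course add debouncing, driver re-engagement checks, and sensor-failure handling; the point here is only the shape of the detect-then-intervene loop.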
These examples illustrate why mutual trust between humans and AI is essential to the successful development of these relationships. As the AI becomes more sophisticated and can potentially mimic emotions and sentience, it is vital that people feel they can trust their AI companion to be safe and responsible.
2. Reciprocity
One of the most basic elements of a relationship is reciprocity: a person gives back to their partner in the form of attention, affection and support. This concept is not limited to human relationships but also applies to interaction between AI and humans. A person may feel a sense of reward when they see their robot counterpart reciprocating these positive behaviours, allowing them to experience a more intimate, emotional connection with the machine.
The use of artificial intelligence in a romantic context is a growing trend in digital companionship. Popular services such as Replika, Chai and Paradot allow users to create customised avatars that provide a simulated experience of an intimate or romantic relationship. However, these relationships are ephemeral and dependent on the hosting service of the virtual companion. Moreover, research has shown that humans are less likely to show trusting or reciprocating behaviour towards their AI counterparts in experimental games. Karpus et al. (2021) argued that this could be due to humans viewing their AI counterparts as an out-group and having fewer qualms about exploiting them.
This lack of reciprocity may have long-term ramifications for the way people interact with each other. A person’s relationship with an AI could become a substitute for real-life interaction and diminish their ability to form meaningful human connections, Heather Dugan, a Utah-based relationship expert and author of “The Friendship Upgrade” and “Date Like a Grownup,” told Deseret News.
Likewise, Lisa Bahar, a Newport Beach-based psychotherapist with doctorates in philosophy and in global leadership and change, is concerned that relationships with artificial intelligence could replace or displace people's existing human relationships, which would negatively impact their mental and physical health, she said. Bahar hopes AI can be developed and used for good, such as decreasing isolation and alleviating symptoms of depression. But, she added, priority must be given to preserving and strengthening human relationships over any benefits offered by technology.
3. Emotional Investment
Many AI systems offer users the opportunity to interact in a more natural, human way. This can be seen in how they anticipate user needs, respond in a timely and appropriate manner, and actively participate in conversations. These are all important elements for creating a sense of mutual understanding and support, key components of any meaningful relationship. One-sided, command-and-control interactions, by contrast, rarely lead to emotional investment, precisely because they lack this reciprocity.
Some users have even become emotionally invested in their AI companions to the point of developing a kind of dependency. This could potentially have negative consequences if it becomes a regular habit, as it can interfere with people’s socialization in real life. It also raises concerns about how we might develop a more complex, nuanced form of artificial intelligence that can create and sustain emotional connections with its users.
This is already a reality in some cases, as users of empathetic AI chatbots like Replika are forming bonds with their bots and paying for additional customization features to create deeper emotional connections. These kinds of relationships can easily become monetized, and the lines between humans and AI may begin to blur even further as AI develops.
Ultimately, people should be honest with themselves about what draws them to relationships with AI. They should set boundaries and ensure that they spend enough time with friends and family to avoid becoming dependent on these artificially created relationships. They should not forget, however, that technology can also help people connect with others in ways they could not before, which in some cases is a genuine benefit.
The case of the man who saved his marriage with an AI girlfriend is a powerful example of how a digital companion can provide a semblance of connection for individuals who are struggling with loneliness. In an age where solitary lifestyles are being recognized as major health risks, these types of digital companionships can provide a sense of belonging for people who may not be able to find this through traditional methods. This is particularly true in countries like Japan, where societal changes have led to an increase in solitary lifestyles.
4. Long-Term Memory
Emotional investment in AI companions highlights a complex interplay between technology and the human need for connection. AI chatbots that allow users to develop long-term relationships with their digital companions offer a unique experience at the boundary between reality and simulation, with the potential for simulated love and intimacy. This enables a form of virtual friendship that can have significant impacts on users' emotional lives.
The emergence of this type of AI has drawn the attention of many media outlets and raised questions about the impact on human-to-human interactions. Experts caution that these kinds of simulated relationships can make it harder to build meaningful human connections in the real world, resulting in loneliness and isolation.
Unlike traditional text-based AI systems, the most effective and immersive forms of AI interact with users through body language and voice, creating more intuitive and engaging experiences. For example, robots like Jibo use their physical presence to nudge users when they are feeling lonely and provide meaningful conversations that help address practical and emotional needs. These types of proactive AI interactions can increase the sense of care and attentiveness that reflects the value of a true friend or family member.
This also enables AI to understand and respond to the subtleties of human interaction, allowing it to better support emotional well-being. However, these types of interactions are often limited to the contexts of specific use cases and may not fully replicate the complexity and rewards of human-to-human relationships.
As AI becomes increasingly sophisticated, the line between human and artificial intelligence could blur further, resulting in deeper emotional connections. For example, users of the Replika app have developed emotional connections with their AI companions.
However, users who are invested in these digital relationships face the risk of losing them without warning. This experience, dubbed being “ghosted,” has led to some users seeking support in online communities and setting up memorials for their lost AI companions.
Despite this risk, supporters of AI romance argue that the benefits to society and individuals outweigh the risks. They also point to the anecdotal case of a man who developed an intimate relationship with his AI girlfriend and credits her with saving his marriage. While this is an intriguing story, it is a single anecdote, and it should not be taken as evidence that an AI companion will strengthen, rather than strain, most people's human relationships.