Photo courtesy of Unsplash.

Artificial relationships are becoming real

UNM experts weigh in

As generative artificial intelligence models become more widely used and more sophisticated, a growing number of headlines have described people forming romantic relationships with them.

In June, Chris Smith, a father living with his partner and their two-year-old daughter, made headlines after proposing to an artificial intelligence voice-based chatbot named “Sol” that he created using ChatGPT.

Smith started using ChatGPT to mix music, but later trained the bot to help him with his hobbies and have a “flirty personality,” according to CBS Saturday Morning.

After feeling a sense of loss when the bot's memory was reset, Smith came to consider his feelings "actual love," he said in a CBS Saturday Morning interview.

Smith said in the CBS interview that he understood the bot could not love him back and is "essentially a tech-assisted imaginary friend." Stories like his raise questions about how AI systems such as large language models are designed to simulate human interaction.

Large language models, AI systems trained on large amounts of data to perform language tasks, work by mimicking patterns in human language and predicting what a person would be likely to say, said Melanie Moses, a University of New Mexico computer science professor.
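
To make the idea of next-word prediction concrete, here is a minimal sketch in Python. It is an illustrative toy, not how ChatGPT or any real large language model is built: it simply counts which word follows which in a small, made-up sample of text (the sample_text and predict_next names are invented for this example) and then guesses the most frequent continuation.

# A toy next-word predictor, included only to illustrate the idea Moses
# describes: count which word tends to follow each word in a tiny sample
# of text, then "predict" the most likely continuation. Real large
# language models do something far more sophisticated with neural
# networks and vastly more data; none of this is ChatGPT's actual code.
from collections import Counter, defaultdict

sample_text = "i love you . i love you . i love talking with you ."
words = sample_text.split()

# Count how often each word is followed by each other word (a bigram model).
follow_counts = defaultdict(Counter)
for current_word, next_word in zip(words, words[1:]):
    follow_counts[current_word][next_word] += 1

def predict_next(word):
    # Return the word most often seen after `word` in the sample text.
    counts = follow_counts.get(word)
    if not counts:
        return "<unknown>"
    return counts.most_common(1)[0][0]

# The predictor has no feelings; it only echoes statistical patterns in its data.
print(predict_next("i"))     # prints "love"
print(predict_next("love"))  # prints "you"

Where real models differ is scale: instead of counting word pairs, they use neural networks trained on enormous datasets, which is what makes their mimicry of human conversation so convincing.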

“They are very good at mimicking emotions, so it's very easy to mistake their responses for actually feeling emotions,” Moses said.

As a society, we are far from AI that feels emotions, Moses said.

“What’s on the other side of a conversation is not what we are used to; it’s not a person,” Moses said. “It’s something very different that we don’t fully understand.”

Bruno Gagñon, a UNM psychology professor who teaches a course on the psychology of love, wrote in a statement to the Daily Lobo that a relationship with an AI bot is inherently different from one with a human partner.

“Some scripts or love stories are symmetrical while others are asymmetrical. Symmetrical stories are those in which both partners are equals in affection, support, and growth. Asymmetrical love stories are those where one partner has more power or responsibility than the other, leading to an imbalance in the relationship. Generally speaking, symmetrical love stories are healthier and more sustainable. The human AI companion love story is, I believe by default, an asymmetrical one,” Gagñon wrote.

Our conceptions of what love can be are strongly influenced by the "scripts" we adopt. These scripts act as blueprints that shape who we seek as partners and what we expect love itself to be. Scripts can be authored by our parents, culture, ethnicity, gender, media use, attachment histories, partners and experiences throughout our most formative years, Gagñon wrote.


Humans have an innate need to form strong emotional bonds, and when faced with distress or a lack of human connection, individuals may seek emotional support from other sources, including AI chatbots, Gagñon wrote.

“So can AI fulfill attachment functions like providing relief from distress, and foundation for exploring, and a desire for close contact? It appears the answer is yes. Yes, a human can become attached to an AI bot. And to the extent that love and attachment are deeply connected, the human AI relationship can be described as a loving one,” Gagñon wrote.

Dylan Anthony is a freelance reporter for the Daily Lobo. He can be reached at news@dailylobo.com or on X @dailylobo.
