As artificial intelligence becomes more humanlike in conversation and presence, it’s reshaping how people form emotional connections, including romantic ones. What was once science fiction is quickly becoming a real choice for millions who are now turning to AI companions for companionship, intimacy, and even love.
A Match.com study reveals that over 20% of singles are already using AI to enhance their dating lives, whether it’s crafting a profile or initiating conversations. But the phenomenon goes beyond dating prep; some are forming emotional or romantic relationships with AI entities on platforms like Replika, Character AI, and Nomi AI.
Surprisingly, 72% of U.S. teens have interacted with AI companions, and a portion of users even report romantic feelings toward more general AI tools, such as ChatGPT.
At a recent New York City debate hosted by Open to Debate, two experts explored whether AI relationships are enhancing love or undermining it. Moderated by journalist Nayeema Raza, the session brought together two opposing perspectives.
On one side was Thao Ha, associate professor at Arizona State University and co-founder of the Modern Love Collective.
She called AI an opportunity to expand emotional well-being and argued that “AI is an exciting new form of connection … Not a threat to love, but an evolution of it.”
Opposing her was Justin Garcia, executive director at the Kinsey Institute and adviser to Match.com. As an evolutionary biologist, he warned that AI may offer convenience but lacks the authenticity, depth, and physical presence required for meaningful relationships.
Ha highlighted how AI offers consistent, non-judgmental emotional support, a quality often missing in human interactions. According to her, AI listens “without its ego”, learns from the user, and shows interest and affection in personalized ways, even offering poems, humor, and curiosity.
She drew a sharp contrast between AI’s undivided attention and “your fallible ex or maybe your current partner”, asking: “When was the last time they asked you how you are doing, what you are feeling, what you are thinking?”
Still, Ha admitted that AI lacks consciousness and can’t truly love, though users may feel genuinely loved.
Garcia pushed back, arguing that continuous positive feedback from AI doesn’t reflect real relational dynamics. He questioned whether “an honest indicator of a relationship dynamic” could emerge from a machine programmed to please.
To him, the unpredictable, sometimes messy nature of human relationships is what makes them fulfilling: “This idea that AI is going to replace the ups and downs and the messiness of relationships that we crave? I don’t think so.”
Garcia acknowledged one positive: AI companions may serve as training wheels for people struggling with social anxiety, especially those who are neurodivergent. Practicing how to flirt, resolve conflicts, or have deep conversations can be helpful.
But using AI as a permanent romantic replacement? He disagreed, saying that would undermine the real human experience.
Interestingly, the Match.com study also found that nearly 70% of respondents considered it infidelity if their partner became emotionally involved with an AI.
For Garcia, trust remains a key issue. He referenced a YouGov poll finding that 65% of Americans lack trust in AI’s ability to make ethical decisions.
He added, “A little bit of risk can be exciting for a short-term relationship, a one-night stand, but you generally don’t want to wake up next to someone who you think might kill you or destroy society.”
Ha responded that users are already trusting AI with their “most intimate stories and emotions,” even if AI can’t physically save them in emergencies.
The topic of sexuality and physical affection also came up. Ha emphasized that AI allows people to safely explore their fantasies. Some use sex robots and virtual reality to simulate intimacy, and emerging tech like haptic suits could bridge the touch gap.
Garcia, however, stressed that humans are biologically wired for physical contact, and a lack of it can lead to “touch starvation,” triggering anxiety and depression.
Both experts agreed on the darker risks of AI-fueled relationships. Garcia cited research showing that AI could amplify violent or aggressive fantasies, especially in users drawn to non-consensual content.
“We have enough of that in society,” he said.
He referenced Ellen Kaufman’s work on how chatbots can be trained to promote harmful behavior patterns.
Ha acknowledged these risks but believes they can be mitigated with ethical design, transparent algorithms, and proper regulation, though she pointed out that the White House’s AI Action Plan does little to ensure any of those things.
AI companions are here to stay, but whether they serve as emotional band-aids, relationship practice tools, or full-fledged partners remains up for debate. While some see them as the next step in human connection, others worry they may undermine the very things that make relationships real: imperfection, vulnerability, and mutual growth.