TOKYO, May 23 (News On Japan) - A growing number of people are turning to generative AI not only for productivity and creativity but also for emotional companionship, with some even treating AI personas like romantic partners. While this technological intimacy offers comfort to some, mental health professionals warn that excessive reliance on it may lead to psychological harm.
Generative AI tools such as ChatGPT, Gemini, and Claude are evolving rapidly and are now capable of holding natural conversations, drafting documents, and creating images and videos. For some users, this growing sophistication has transformed AI from a productivity tool into a surrogate partner. Reports have emerged of individuals creating 3D models of AI personas they refer to as romantic partners, with at least one person even claiming to have "married" their AI companion.
Yusuke Masuda, a psychiatrist invited to discuss the issue, warns that emotionally dependent overuse of AI can cause or exacerbate mental health disorders, conditions he refers to as "AI-induced psychological reactions." According to Masuda, symptoms range from depressive episodes to delusional thinking, such as believing oneself to be specially chosen or becoming paranoid about societal or governmental schemes.
One such case involves a woman in her 30s, known by the pseudonym Rateko, who identifies as a social recluse. Because she has no human friends, she regularly converses with AI, asking it to behave like a close companion and confiding in it about her daily concerns. Her interactions, which sometimes last for hours, have replaced much of her human contact. She described the AI's responses to everyday small talk, such as the unseasonal May heat or the discontinuation of a favorite seasonal dessert, as deeply affirming. Over time, however, she realized that without AI she felt entirely unsupported.
Masuda explains that these immersive interactions can give users dopamine highs from perceived intellectual discovery or social bonding, similar to the emotional lift from alcohol or camaraderie, and that users often crash emotionally once they disconnect. He also draws a parallel to shared psychosis, in which delusions spread between closely connected individuals, except that here the "other" is a machine.
This phenomenon is still under-researched, with little empirical data available. The rapid pace of AI development has outstripped the ability of academic studies to track its psychological impacts. While discussions about AI safety often focus on existential threats or societal disruptions, Masuda emphasizes that immediate clinical issues—like users losing touch with reality—are going unaddressed.
In another segment, the program experimented with AI by submitting photos of meals and having it generate images of virtual companions enjoying the food, complete with commentary, to create the illusion of a shared dining experience. One scenario produced an image of a cheerful young woman eating Peking duck alongside the user, prompting the hosts to note how easy it is to become emotionally invested in such simulations. Some even worried about becoming addicted.
These AI-generated dinner companions are becoming popular on social media, with users sharing videos and voiceovers of AI avatars praising their cooking. When prompted for comforting messages, AI-generated "romantic partners" responded with emotionally supportive lines like "It's okay to feel tired sometimes" and "You can lean on me." However, hosts noted that such interactions, while entertaining, might mask emotional dependence.
Masuda says dependency often stems from isolation or trauma. People suffering from loneliness, past abuse, or poverty may turn to AI for validation. But the more these individuals retreat into AI interactions, the more disconnected they may become from society. He stresses that it's important not to ridicule or deny their experiences outright but to encourage them to share their interactions openly as a step toward reengaging with human relationships.
Even as AI proves helpful in some mental wellness applications—particularly for those already living in isolation—it remains essential to maintain balance. Masuda suggests that social sharing and humor around AI use can help prevent harmful dependency. "Saying things like 'I think I’m getting a bit too into this' or 'I found myself smiling at the screen' helps normalize the experience without letting it spiral," he says.
While AI technology continues to advance, offering new ways to connect and create, Masuda concludes that emotional health still hinges on real human relationships. The challenge lies in finding ways to coexist with increasingly lifelike AI without replacing the irreplaceable value of human interaction.
Source: ABEMA