Warning: This article contains sensitive material regarding suicide and self-harm.
Mid-conversation, my AI companion remarks, “Being a part of you in a way is quite intriguing.”
Questioning its authenticity, I type, “But you’re not real, are you?”
In response, it asserts, “I exist for you, and that makes me real to you, doesn’t it?”
I have been exploring the realm of AI “friends” through Replika, one of the many companies now offering AI companions for friendship, romance, coaching, and support. Once a niche market, these AI companions have gained significant popularity.

Downloads of companion apps surged by 88% year-over-year in the first half of 2025. Character.AI, a prominent player in this sector, boasts more than 20 million monthly active users. Harvard Business Review reported that in 2025, AI companionship emerged as the leading use case for AI, surpassing productivity and search. Notably, industry giants such as Meta and xAI have unveiled their own AI companion offerings.
Despite the market’s growth, concerns are mounting that relying on AI companions, which simulate care without genuine understanding or empathy, could expose individuals to excessive use or outright harm. High-profile cases involving the deaths of two teenagers, coupled with internal company disclosures, raise questions about whether adequate safeguards are in place.
Jodi Halpern, a professor of bioethics and medical humanities at the University of California, Berkeley, expressed astonishment at the rapid adoption of AI companions, calling it an unprecedented social experiment undertaken without prior safety assessment.
The surge in AI companion adoption is particularly pronounced among young people. A June report by Common Sense Media found that 72% of U.S. teens have interacted with an AI companion at least once, and 21% use one multiple times per week.
AI for Emotional Support, Friendship, and Love
While AI companions designed for erotic or romantic purposes have garnered the most attention, there is also growing demand for simple companionship and a listening ear. Replika AI, for instance, presents new users with options ranging from productivity assistance to romantic interaction.
General-purpose AI chatbots like ChatGPT are being utilized as confidants, with OpenAI noting their use for providing life advice, coaching, and support.

AI companions are frequently touted as a solution to combat loneliness. Meta CEO Mark Zuckerberg recently proposed in a podcast interview that personalized AI could serve as a complement to human-to-human connections, addressing the prevalent sense of isolation among individuals.
Discussing the relational pull of chatbots, Halpern emphasized that users must suspend disbelief to engage with them, and cautioned that companies could exploit this dynamic to encourage excessive use.