Heavy users of ChatGPT tend to be lonelier, more emotionally dependent on the AI tool, and less socially connected offline, new research suggests.

While only a small fraction of users engage emotionally with ChatGPT, those who do are often among its most frequent users, according to two studies conducted by OpenAI and the MIT Media Lab. The researchers observed that users who had the most emotionally expressive personal conversations with the chatbot also reported higher levels of loneliness. However, it remains unclear whether chatbot use leads to loneliness or whether lonely individuals are more likely to seek emotional connection through AI tools.

Although the studies are preliminary, they raise important questions about how AI tools like ChatGPT (which OpenAI says is used by over 400 million people each week) are affecting people's offline lives.

The studies, which are expected to be submitted to peer-reviewed journals, found that participants who "bonded" with ChatGPT (those in the top 10% in terms of time spent using the tool) were more likely than others to experience loneliness and to show greater emotional reliance on it.

The findings present a complex picture of AI's emotional impact. Voice-based chatbots initially seemed to reduce feelings of loneliness more effectively than text-based ones, but this benefit diminished the more users interacted with them. After four weeks of using the chatbot, female participants were slightly less likely to engage in offline social activities than male participants. Interestingly, those who used ChatGPT's voice mode and chose a voice gender different from their own reported significantly higher levels of loneliness and emotional dependence by the end of the study.

In the first study, the researchers examined real-world data from nearly 40 million ChatGPT interactions, then surveyed 4,076 users about their emotional experiences.
In the second study, nearly 1,000 participants were recruited to use ChatGPT daily for at least five minutes over four weeks. They were asked to fill out questionnaires measuring their loneliness, levels of social engagement, and emotional attachment to the bot.

The findings echo earlier research, including a 2023 MIT Media Lab study which found that chatbots often mirror the emotional tone of a user's messages: happier messages typically produced happier responses from the chatbot.

Dr. Andrew Rogoyski of the Surrey Institute for People-Centred Artificial Intelligence said that AI chatbots can be "dangerous" because humans are naturally inclined to treat machines that behave in human-like ways as if they were actually human. "In my opinion, we are doing open-brain surgery on humans, poking around with our basic emotional wiring with no idea of the long-term consequences. We've seen some of the downsides of social media – this is potentially much more far-reaching," he warned.

Dr. Theodore Cosco of the University of Oxford acknowledged that the study raises "valid concerns about heavy chatbot usage," but he also sees potential. "The idea that AI systems can offer meaningful support—particularly for those who may otherwise feel isolated—is worth exploring. However, we must be thoughtful and intentional in how we integrate these tools into everyday life."

Dr. Doris Dippold of the University of Surrey noted that it is essential to understand what causes emotional dependence. "Are they caused by the fact that chatting to a bot ties users to a laptop or a phone and therefore removes them from authentic social interaction? Or is it the social interaction, courtesy of ChatGPT or another digital companion, which makes people crave more?"