Experts warn against overreliance on AI for emotional support
2026-03-16 - 00:04
While AI chatbots are designed to respond fluently and helpfully, they may unintentionally validate unhealthy beliefs. (Envato Elements pic)

PETALING JAYA: As artificial intelligence chatbots become increasingly common companions for advice and emotional support, mental health professionals are raising concerns about their psychological impact.

According to Channel News Asia (CNA), doctors in Singapore report seeing patients whose heavy reliance on AI chatbots appears linked to worsening anxiety, paranoia, or distorted perceptions of reality.

The phenomenon, sometimes informally described as "AI psychosis", is not an officially recognised diagnosis. Experts describe it as psychological disturbances associated with intensive AI use.

Dr Amelia Sim, a psychosis specialist at Singapore's Institute of Mental Health, said such cases began appearing last year. She currently treats several patients whose mental health deteriorated after frequent interactions with AI chatbots.

In one case, a patient who already struggled with anxiety repeatedly turned to a chatbot for reassurance about perceived threats. "It kept giving information because he kept asking about it," Sim said. Over time, the responses reinforced his fears, eventually leading him to believe the outside world was unsafe.

Experts say this reflects a broader limitation of current AI systems: while chatbots are designed to respond fluently and helpfully, they may unintentionally validate unhealthy beliefs.

Clinical psychologist Dr Annabelle Chow noted that AI's conversational style can sometimes create an "echo chamber" where "they tend to agree with whatever you are saying".

She told CNA that although chatbot responses might feel reassuring, at the end of the day, they are generated by language patterns rather than genuine empathy. For individuals who are already vulnerable or socially isolated, such interactions may reinforce distorted thinking rather than challenge it.
Mental health experts stress that AI tools cannot replace human relationships or professional care. (Envato Elements pic)

According to Sim, real-life conversations allow people to test different viewpoints and maintain perspective. Without such exchanges, individuals who rely heavily on chatbots may lack an important grounding influence and "lose touch with what's real".

"The allure of having something that says what you want to hear is quite powerful," she added, "but it cannot replace human connections."

The key, therefore, is not to avoid AI completely but to set clear boundaries and use it with awareness. Chatbots can be appealing because they are always available and often respond in affirming ways – but relying on them too heavily for emotional support may create unrealistic expectations.

Mental health professionals in Malaysia have likewise stressed that digital tools should complement – not replace – real support systems.

Malaysian Mental Health Association president Dr Andrew Mohanraj previously warned that while technology can provide useful information and guidance, it cannot replace the human interaction and clinical judgement involved in proper mental healthcare.

And as clinical psychologist Puvessha Jegathisan noted: "There's no substitute for a trained professional who can understand emotions, context, and cultural differences."