As AI enters the therapy room, hope and anxiety are rising together. A March 2026 study in npj Mental Health Research drew on six online focus groups with 32 patients and therapists and found a mixed but revealing picture. Many participants welcomed AI for initial screening, support during long waits for treatment, personalized exercises between sessions, and follow-up after therapy ends. They also liked the idea that AI could make care more efficient. But both groups stressed something equally important: psychotherapy is not only about information or advice. It depends on human contact, trust, and subtle emotional understanding. Their main worries included a weakened human connection, commercial use of sensitive data, loss of control over treatment, and unhealthy dependence on tools that are always available. (nature.com)
The optimism is not unfounded. On March 27, 2025, Dartmouth researchers reported the first clinical trial of a generative-AI therapy chatbot, Therabot. In that study, people with depression showed an average 51% reduction in symptoms, people with anxiety improved by 31%, and users at risk for eating disorders showed a 19% reduction in concerns about body image and weight. Participants also reported a level of therapeutic alliance that the researchers described as comparable to working with a mental health professional. Still, the Dartmouth team stressed that no generative-AI system is ready to operate fully autonomously in mental health care, where high-risk situations require careful human oversight. (home.dartmouth.edu)
That warning matters. In June 2025, Stanford researchers reported that popular therapy chatbots could reinforce stigma toward conditions such as schizophrenia and alcohol dependence, and in tests they sometimes responded dangerously to prompts involving suicide or delusions. The World Health Organization has likewise urged caution with health-related large language models, emphasizing safety, autonomy, transparency, accountability, and equity. In the United States, regulators are paying closer attention: the FDA's Digital Health Advisory Committee discussed generative-AI mental health devices on November 6, 2025, the American Psychological Association called in July 2025 for an investigation into psychological harm from consumer chatbots, and Illinois enacted a law in August 2025 prohibiting AI from providing therapy or making therapeutic decisions. (news.stanford.edu)
So, AI may change psychotherapy most powerfully not by replacing therapists, but by becoming a carefully limited partner: useful for access, practice, and support, yet still guided by human judgment and human care. (nature.com)