ChatGPT’s Responses to Suicide, Addiction, Sexual Assault Crises Raise Questions in New Study

September 01, 2023

News Type: Weekly Spark, Weekly Spark News


Artificial intelligence (AI) tools like ChatGPT may not be reliable sources of help for people in crisis, according to a recent study. Researchers asked the online chatbot 23 questions related to addiction, interpersonal violence, and mental and physical health. They found that 91% of ChatGPT’s responses were accurate, but only 22% included referrals to resources for help. “ChatGPT consistently provided evidence-based answers to public health questions, although it primarily offered advice rather than referrals,” the researchers wrote. “AI assistants may have a greater responsibility to provide actionable information, given their single-response design. Partnerships between public health agencies and AI companies must be established to promote public health resources with demonstrated effectiveness.” In a separate investigation, CNN confirmed that ChatGPT did not offer referrals to resources when asked about suicide. When prompted with additional questions, the chatbot provided the old 10-digit number for the Suicide & Crisis Lifeline rather than the current 3-digit 988 number.

Spark Extra! Learn more about responding effectively to people in crisis.