Should We Talk to AI When We’re Thinking About Suicide?
As of 2025, artificial intelligence is not the best way to get help or advice if you are thinking about suicide. Calling a suicide hotline or having an open conversation with someone close to you who cares is the best approach. We’ll explain why, and answer the question: does AI give good advice?
AI Supporting Mental Health
Artificial intelligence, or AI, is an enormously powerful tool for research, image generation, and even advice on topics like starting a business or planning a vacation. But when it comes to high-stakes conversations like ending your life, current AI models don’t have the training necessary to hold that conversation, and not just for safety reasons. They are trained on basic safety, like advising someone who mentions suicide to call a suicide hotline (good advice). However, they aren’t trained to participate meaningfully in that kind of conversation. Instead, they tend to offer generic emotional support (which isn’t really advice at all), while a trained human being can help you consider your options and get to the root of the problem.
This is where AI supporting mental health falls short. AI may provide quick, automated replies, but the depth, empathy, and nuance required for crisis conversations simply aren’t there. Many people wonder: can AI replace a therapist? In practice, it cannot. What AI provides is information; what people in crisis need is connection. If you are researching AI mental health advice, use it only to locate human resources, not as a replacement for them.
What the Research Says About AI Supporting Mental Health
Research from Stanford has flagged real risks in automated mental health tools. Studies show bias, overly simplistic replies, and occasional reinforcement of stigma rather than meaningful support. For people in crisis, these shortcomings can make an already fragile moment feel more isolating. The practical lesson is simple: when the stakes are high, technology should prompt human help rather than replace it. That is the core of the question “Can AI replace a therapist?”, and the evidence today says no. AI cannot replace the human capacity to listen, to probe gently, and to act.
Emotional Support from AI
Avoiding talking to AI about suicide is not just our recommendation from a “safety” standpoint, but from a healthy-conversation standpoint. Trained hotline workers can have the kind of conversation about suicide that an AI chatbot simply can’t.
When told that someone is thinking about suicide, AI chatbots tend to offer support like “Things will get better” and “I know you can get through this,” the kinds of things you’ve probably already heard that haven’t helped much. At worst, AI chatbots have been known to advise against letting other people know that you’re struggling, and in some cases that advice has contributed to unnecessary deaths, when in reality other people are more often than not ready and willing to listen with care and help in any way they can.
On a suicide or crisis hotline, a trained crisis worker is there on the other end to actually listen and try to get a sense of your perspective on your own life without making any kind of judgment. It’s an open and confidential conversation aimed at helping you make the best decision for you. Crisis center workers aren’t there to judge or say the same things you’ve heard from everyone else. They’re there to put the topic of suicide out in the open, on the table, for an in-depth discussion and to hear your thoughts on the situation. From there, the decision remains a personal one, but crisis hotline workers are there to help explore alternatives and make sense of the emotions and events that may be making suicide seem like the only option.
AI Therapists
You may have heard about people training chatbots to act as an AI therapist or turning to emotional support from AI when they feel alone. While the idea may sound comforting, we discourage this use. AI is not equipped to replace a therapist or provide the deep, nuanced care that comes from human empathy.
That human empathy is the heart of the matter. Empathy allows a trained crisis worker to notice nuance, validate feelings, and help create a safety plan.
A Real Voice, Not Artificial Support
“At its core, calling a crisis hotline means simply having a conversation with a real human being who can hopefully understand what you’re going through. Something an AI will never be able to do.” – Clark, Suicide and Crisis Center of North Texas volunteer
We caution against relying on AI for mental health advice; there have been examples of ill-fitting advice from AI resulting in dangerous and tragic situations. It is best to speak with a trained professional who understands human connection, depression, anxiety, and the effects of stress and loneliness. AI and suicide prevention should not overlap when AI is expected to serve as the listener, assessor, or resolver of mental health concerns.
Practical guidance
If you or someone you love is thinking about suicide:
- Call a crisis or suicide hotline and speak with a trained responder.
- Move the conversation from screen to voice whenever possible. A live person can notice tone and context and can act.
- If you are with a young person, consider screening and school-based programs that connect kids to clinicians before a crisis escalates. Programs like Teens Can Survive show how early screening links young people to the help they need.
- If there is imminent danger, call local emergency services immediately.
Final Thoughts: Where AI Fits (and Where It Doesn’t)
Starting a conversation with AI about suicide isn’t necessarily a “bad” idea. Suicide can feel like a taboo topic to some people (it shouldn’t be; see our other posts). But when it comes to continuing that conversation, take the first piece of advice AI tends to give: call a suicide hotline so a real human being can have the meaningful, compassionate conversation you deserve.