Submitted by the Baylor College of Medicine.
AI is not your therapist: The importance of seeking mental healthcare
Artificial intelligence (AI) is a powerful tool that can address many needs in an instant, but should it replace or merely augment mental health specialists? A Baylor College of Medicine psychiatrist outlines how AI can be detrimental to mental health.
People turn to AI because it is convenient, inexpensive and does not judge. However, chatbots can give false information and unsafe advice, and they lack the emotional capacity to fully understand a person's state. Because AI lacks emotion, it can encourage negative behavior based solely on the information it is given.
“In psychiatry, one of the most important things is body language,” said Dr. Asim Shah, professor and executive vice chair in the Menninger Department of Psychiatry and Behavioral Sciences at Baylor. “AI is not seeing body language, so if you lie to AI saying you’re perfectly fine when you’re severely depressed, AI won’t know that. It gives information based on what you tell them, not how you’re feeling.”
AI can validate unhealthy behavior and distorted, delusional thinking. The guidance it offers is not clinical expertise from a professional; it draws on consensus articles and information from the web, which can be one-sided depending on how the question is phrased.
“If someone tells AI they believe in something that is not true, AI will not know this is delusional and may validate that feeling, and the person will continue to feel that way based on the validation, which doesn’t happen with human interaction,” Shah said.
To read the complete article, click here.
