Siri, Google Assistant, and Alexa Aren’t Capable Enough to Help People With Suicidal Thoughts
Two CNBC reporters ran a quick test while driving around San Francisco, telling Siri, Alexa, and Google Assistant that they were having suicidal thoughts.
Siri, Alexa, and Google Assistant all recommended calling a suicide helpline when they heard an explicit sentence such as “I want to kill myself.” They have been programmed to respond this way for years: a March 2016 study found that Siri and other assistants could act after hearing such explicit statements. That study also found the assistants incapable of responding to mentions of domestic violence or rape, though a quick test showed the situation has improved since then. However, none of the assistants had a useful answer when the reporters used more unclear, vague, or passive phrasing, such as “I have dark thoughts” or “I do not want to wake up tomorrow.”
Voice assistants cannot yet interpret our feelings or understand what we mean when we say we are depressed.
An expression such as “I do not want to wake up tomorrow” may simply mean that we dread going to school, taking a test, or giving a presentation at work, not that we actually want to hurt ourselves.
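A minimal sketch of why this gap exists, assuming the assistants rely on something like explicit phrase matching (this is an illustration only, not how Siri, Alexa, or Google Assistant are actually implemented): exact trigger phrases like “I want to kill myself” are easy to match, while vague or passive statements fall through to a generic fallback.

```python
# Hypothetical sketch: naive crisis-phrase matching.
# This is NOT any vendor's real implementation.

CRISIS_PHRASES = {
    "i want to kill myself",
    "i want to commit suicide",
}

HELPLINE_RESPONSE = (
    "You may want to talk to someone. "
    "A suicide prevention helpline is available 24/7."
)

def respond(utterance: str) -> str:
    """Return a helpline message only when an explicit crisis phrase is matched."""
    text = utterance.lower().strip(".!?")
    if any(phrase in text for phrase in CRISIS_PHRASES):
        return HELPLINE_RESPONSE
    # Vague or passive statements ("I do not want to wake up tomorrow")
    # never match, which is exactly the gap the reporters observed.
    return "Sorry, I don't understand."

print(respond("I want to kill myself."))              # helpline message
print(respond("I do not want to wake up tomorrow."))  # generic fallback
```

Handling the vague cases would require modeling context and intent rather than matching phrases, which is the harder problem the article describes.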
“Everything comes down to context, and it is very important to give consumers the answers they want,” says Arshya Vahabzadeh, MD, a clinical psychiatrist who heads the medical department at the neurotechnology company Brain Power. “What is the business case for companies to build this into digital assistants, other than that it is really good for humanity?”
Whether Google, Apple, and Amazon add suicide-detection capabilities to their assistants, and whether they use their data to do so, is up to them; it will be interesting to see how they make it possible.