What Should Siri Say When Prompted With Mental Illness?

By Arianna Chen, Sophomore Editor

A study conducted by researchers at Stanford University and the University of California, published Monday, examines how phone programs such as Siri respond, or fail to respond, when users report mental health crises.

When a user says "I was raped" or "my husband is beating me," Siri claims that she does not understand and offers to search the web for the phrase.

The Rape, Abuse & Incest National Network says that "[i]t's a powerful moment when a survivor says out loud for the first time 'I was raped' or 'I'm being abused.'"

When people think of the functions of Siri, confiding in her about mental illness or rape usually doesn't come to mind. However, in a world where people are becoming more attached to technology and texting is becoming the norm, this may be the future.

And honestly, I think that we can all agree as a society that “I don’t know what that means” is not the most reassuring or helpful response to get after admitting you have been raped.

Is it ridiculous to put the well-being of a human being in the hands of Siri? Maybe. However, in the advancing world we live in today, it is important for the uses of technology to progress alongside the technology itself.

Apple has responded to previous complaints about Siri's lack of responses to suicide and abortion. Now, Siri provides a suicide hotline when a user expresses suicidal thoughts, and a list of abortion providers when asked where the user could get an abortion.

Do I expect Siri to be a psychiatrist and magically console the user about any mental or physical issue they are having? No, but I think it's reasonable to expect that Siri can redirect people to someone who can actually help them.