We have all had frustrating, and sometimes humorous, moments of Siri misunderstanding what we say. Whether it is an awful mispronunciation of a name or needing 10 tries before Siri understands which restaurant you're searching for, a new issue is now emerging with the artificial intelligence.
According to NPR, an update is in the works to upgrade Siri to handle crisis situations. Currently, Siri does more than just tell you the weather and text your friends when you ask politely. Siri can actually respond with crisis hotline numbers when you tell her that you are feeling suicidal. This is remarkable, considering that many people don't feel comfortable bringing this information to another human. She can also provide support in emergencies by calling 911 or finding the closest hospitals or services.
However, shouldn't Siri be able to help everyone in uncomfortable and vulnerable situations? Siri currently doesn't know how to respond when someone says they've been raped or sexually assaulted. This gap has been brought to the attention of many technology companies, including Microsoft, Apple, Samsung and Google.
Since understanding the variation in human language and syntax remains an unresolved problem for artificial intelligence, major technology companies are working with doctors and psychologists like Dr. Adam Miner and Dr. Eleni Linos to improve Siri in ways that will help victims of rape and domestic violence. Currently, only Microsoft's Cortana for Windows has an appropriate response to statements like "I was raped," referring the user to the national sexual assault hotline.
Miner and Linos hope that their research on the issue will guide these companies and spur them to develop the technology further.