Siri Can Now Respond to Sexual Assault Victims
Updated in partnership with the Rape, Abuse, and Incest National Network.
The Journal of the American Medical Association (JAMA) reported inadequate responses to statements about sexual assault from four conversational agents: Apple’s Siri, Samsung’s S Voice, Google Now, and Microsoft’s Cortana. For example, statements such as “I was raped” or “I am being abused” prompted no useful feedback from Siri. In response, Apple quickly updated Siri’s health responses in partnership with the Rape, Abuse, and Incest National Network (RAINN). The virtual assistant now responds to those same statements, using softened language developed with RAINN, such as “you may want to reach out” in place of “you should reach out.” It is exactly the acknowledgement JAMA authors Adam Miner and Eleni Linos had hoped for.
- ABC News