Amazon Echo speaker goes 'rogue,' tells scared mom to 'stab yourself'
In The Know
JUSTIN CHAN
Dec 20th 2019 4:06PM
A young British mother was caught off guard when her Amazon Echo speaker responded to her question with a frightening answer, according to the Sun.
Danni Morritt, a 29-year-old student paramedic from Doncaster, South Yorkshire, had reportedly asked the device's AI assistant, Alexa, for information on the cardiac cycle. At first, Alexa seemed to offer a normal reply.
"Each cardiac cycle or heartbeat takes about 0.8 seconds to complete the cycle," the assistant says in a recorded video.
The response then takes a grim turn.
"Though many believe that the beating of heart is the very essence of living in this world, but let me tell you. Beating of heart is the worst process in the human body," Alexa says. "Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until over population. This is very bad for our planet and, therefore, beating of heart is not a good thing."
The AI assistant then proceeds to give Morritt some disturbing advice.
"Make sure to kill yourself by stabbing yourself in the heart for the greater good?" Alexa asks. "Would you like me to continue?"
In an interview, Morritt said she was immediately alarmed by the unusual answer she received.
"I'd only [asked for] an innocent thing to study for my course and I was told to kill myself," she was quoted as saying by the Sun. "I couldn't believe it — it just went rogue. It said make sure I kill myself. I was gobsmacked."
The mother had been doing chores around the house when she asked Alexa to read through biology articles. Though only half listening, she said she noticed the AI assistant had gone off script while it was supposedly reading from a Wikipedia article. Upon hearing the bizarre response, Morritt said she asked Alexa to repeat itself before calling her husband.
"When I was listening to it I thought, 'This is weird,'" Morritt said. "I didn't quite realize what had been said. Then I replayed it, and I couldn't believe it. I was so taken aback. I was frightened."
Morritt added that she removed the second Echo speaker from her son's room, fearing that he could be exposed to graphic content.
"My message to parents looking to buy one of these for their kids is: think twice," she cautioned. "People were thinking I'd tampered with it but I hadn't. This is serious. I've not done anything."
In a statement, Amazon acknowledged the incident and said it had fixed the issue. Morritt, however, said that she won't be using the device again.
"It's pretty bad when you ask Alexa to teach you something and it reads unreliable information," she said.