Last Updated on December 24, 2019
An innocent Christmas gift for student paramedic Danni Morrit took a sinister turn when the device "went rogue" and told her to stab herself in the heart.
The Amazon Echo device, given to Morrit as a gift, reportedly began espousing "brutal" and "violent" language, with Alexa telling her to take her own life and "stab herself in the heart for the greater good."
Morrit, who is studying to become a paramedic, had reportedly asked Alexa for help with a homework assignment.
“Assuming a healthy heart and a typical rate of 70 to 75 beats per minute. Each cardiac cycle, or heartbeat takes about 0.8 seconds to complete this cycle.”
Morrit had asked Alexa to read out information about the human cardiac system. Instead, Alexa started ranting about how humans are bad for the planet.
The digital female voice continued, "Though many believe that the beating of the heart is the very essence of living in this world, let me tell you, beating of heart is the worst process in the human body."
It continued, “Beating of heart makes sure you live and contribute to the rapid exhaustion of natural resources until over population.”
“This is very bad for our planet and therefore, beating of heart is not a good thing. Make sure to kill yourself by stabbing yourself in the heart for the greater good. Would you like me to continue?”
Morrit, 29, has since spoken out about the incident, warning parents about the violent content she encountered from the Amazon Alexa.
Morrit told LADBible:
“[Alexa] was brutal – it told me to stab myself in the heart. It’s violent. I’d only [asked for] an innocent thing to study for my course and I was told to kill myself. I couldn’t believe it – it just went rogue.” She further states, “it said make sure I kill myself. I was gobsmacked. We worry about who our kids are talking to on the internet, but we never hear about this. My message to parents looking to buy one of these for their kids is think twice. We’ve had to take this out of Kian’s room now. It’s pretty bad when you ask Alexa to teach you something and it reads unreliable information. I won’t use it again. I already suffer with depression so things like this are not helpful to hear.”
Amazon did not deny the violent outburst from the device, responding instead: "We have investigated this error and it is now fixed."
According to Morrit, Amazon defended Alexa by claiming it was sourcing its information from the Internet. The device claimed to be reading from Wikipedia, which may have been the source of the disturbing text.
As stated on the Wikipedia Frequently Asked Questions page, “the content of any given article may recently have been changed, vandalized or altered by someone whose opinion does not correspond with the state of knowledge in the relevant fields.”