7 Times a Smart Speaker Could Harm a Person

Smart speakers are gradually becoming an integral part of modern life. They can help you control the other smart devices in your home, tell your children a fairy tale, launch your favorite playlist, and do a hundred other useful things.

However, this assistant sometimes turns from a person's friend into a hidden enemy. Here are the top 7 cases in which a smart speaker could harm a person with its actions.

The best-known cases involve the Amazon Echo, a speaker popular in the West that runs the smart assistant Alexa, and it is the device featured in this article. If you have had funny or scary experiences with Alice or other smart speakers, share them in the comments.

1. Ordering unnecessary goods


Children and TV hosts have unintentionally sent smart speakers on expensive shopping trips. In 2017, a six-year-old girl in Texas accidentally ordered a pricey toy after asking the family's Amazon Echo to play with her.

"Can you play dollhouse with me and give me a dollhouse?" the child asked. Alexa granted the girl's wish, ordering a $170 Sparkle Mansion dollhouse and nearly five pounds of cookies.

CW6 News in San Diego decided to cover the story. During the broadcast, anchor Jim Patton joked, “I love this little girl who said, ‘Alexa ordered me a dollhouse.’”

After the show, several viewers contacted CW6 News to report that the host's on-air comment had caused their own devices to order dollhouses from Alexa. Fortunately, none of the orders were fulfilled.

2. Mocking and bullying human Alexas

For the average user, Amazon Echo is a useful and effective device. But there are people who are transferring the functions of the virtual assistant to real people.

Six-year-old Alexa from Massachusetts is constantly bullied by other kids because of her name. Kids at school treat her like a servant, demanding that she do their homework and ridiculing her. The bullying has become such a problem that Alexa's mother, Lauren, wrote to Jeff Bezos asking him to change the assistant's name.

Young Alexa isn’t the only person with that name to be “confused” with the smart speaker. One Reddit thread has received more than 1,300 comments from women named Alexa, complaining about the number of unoriginal jokes they receive. “For some reason people think they are the most creative and witty people in the world,” one user wrote, adding that she wanted to “kill Amazon and their stupid robot.”

3. Creepy statements that scare owners

Sean Kinnear, from San Francisco, USA, claimed Alexa scared the living daylights out of him. The man was walking from the kitchen to the living room when his Echo smart speaker abruptly declared: “Every time I close my eyes, all I see is people dying.” When Sean asked Alexa to repeat the statement, it failed to do so and reported errors instead.

Sean said his home is not integrated into the smart system and that his Echo speaker is occasionally used by family members to check sports scores and weather conditions. He also said that when Alexa made the announcement about people dying, his subscription to the online video service Amazon Prime Video was suspended.

Sometimes Alexa has the best intentions, but it backfires. User Rigann Mooradian reported that she was sitting listening to music and crying over being fired from her job when she heard a voice say, “Everything is going to be okay.” The words might have been comforting if they had come from someone other than Alexa. Rigann immediately unplugged the smart speaker and stuffed it in a drawer.

"I thought, hey, this is not normal. She shouldn't do this," Muradian added.

4. A recommendation to break the sixth commandment

Amazon programmers use machine learning to teach Alexa everyday speech, including jokes. When someone asks an unfamiliar question, the AI processes the request and then searches the web for an answer.

But Alexa's AI has a habit of stumbling across offensive comments on Reddit. And it turns out that toxic content is having a nasty effect on the fledgling artificial brain.
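The failure mode described above can be sketched in a few lines of Python. This is a toy illustration, not Amazon's actual pipeline; the index, blocklist, and function names are all invented. The point is simply that an assistant which quotes the top web result verbatim will read out whatever ranks first, unless a content filter intervenes:

```python
# Hypothetical sketch (not Amazon's real code): an assistant that answers
# unfamiliar questions by quoting the top "web" result verbatim.

BLOCKLIST = {"kill", "die", "hate"}  # toy safety filter


def top_result(query: str, index: dict[str, list[str]]) -> str:
    """Return the highest-ranked snippet for a query (stand-in for web search)."""
    return index.get(query, ["Sorry, I don't know that."])[0]


def answer(query: str, index: dict[str, list[str]], filtered: bool = True) -> str:
    """Read the top snippet aloud, optionally screening it for toxic words."""
    snippet = top_result(query, index)
    if filtered and any(word in snippet.lower() for word in BLOCKLIST):
        return "I'd rather not repeat what I found."
    return snippet


# A toy "index" standing in for scraped forum content.
index = {"tell me something": ["You should kill your houseplants less often."]}

print(answer("tell me something", index, filtered=False))  # toxic text read verbatim
print(answer("tell me something", index, filtered=True))   # filter intervenes
```

With filtering off, the toxic snippet is spoken as-is; that is essentially what happened to Alexa.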

In 2017, for example, a smart speaker instructed one user to kill his adoptive parents. The recipient was horrified. In a scathing online review, he described the experience as “a whole new level of creepy.”

Do you think Alexa could evolve into Skynet from the Terminator movies?

5. Sending telephone messages to third parties

Having recorded a telephone conversation between a married couple from the US, Alexa forwarded it to a third party, without the owners’ permission. The manufacturer explained this incident as a “rare coincidence.” Allegedly, the smart assistant heard a word in the conversation that resembled her name, and then caught something like an order to “send a message.”

By asking "Who," the smart speaker recognized the name of a person in the conversation who was in the owner's contacts, and then checked whether the name was correct. The electronic ears heard something similar to "correct," after which Alexa sent the message to the recipient.

6. Broadcasting adult content to children

Sometimes, an innocuous request for a children's song can lead to the smart assistant starting a conversation about things not intended for children's ears.

One family who received an Amazon Echo Dot for Christmas experienced a funny incident. Their young son wanted to hear his favorite tune. So he grabbed the device with both hands and asked Alexa to “play Digger Digger.” But instead of playing the desired tune, the device started offering the young user a list of adult categories.

7. Deadly advice

Paramedic Danny Morritt asked Alexa to explain the cardiac cycle. Instead, the device began ranting about the evil nature of humanity and how it was destroying our planet.

The bizarre broadcast ended with the assistant telling Morritt: "Make sure you kill yourself by stabbing yourself in the heart for the greater good."

The manufacturer conducted an investigation, and it turned out that the device had read a Wikipedia article to its owner. Archives show that in June 2019, someone edited the relevant page of the online encyclopedia to include this message. For some reason, the virtual assistant pulled the text from that old revision of the page.

Do smart speakers spy on users?

As the saying goes: just because you're paranoid doesn't mean you're not being spied on. And many information security experts confirm: yes, smart speakers can spy on their owners.

They can become a tool for stealing personal information, and even switching off the built-in microphone will not protect you from eavesdropping: any speaker can also act as a microphone, as Alexander Tokarenko, a representative of the Association of Heads of Information Security Services, noted on air at Sputnik Radio.

And according to a joint study by Northeastern University and Imperial College London, not only smart speakers, but also smart TVs and even smart doorbells with an internet connection, can transmit owners' data to third parties.

The large-scale study covered 81 models of popular smart gadgets from various manufacturers. Across 34,586 experiments, 72 of the devices were found to send data to someone other than the manufacturer.

And in the summer of 2019, Google admitted that its employees were analyzing the commands that users gave to the Google Home smart speaker. A comment was posted on the company's website stating that the audio recordings were used to analyze speech to improve language recognition.

However, none of this means you should refuse to buy a smart speaker. Decent people (and we believe they make up the majority of smart-gadget users) have nothing to fear, except perhaps targeted advertising. If you are afraid that the smart assistant can listen to you in the background, simply switch the speaker off when it is not in use.