The serious mistake that many continue to make with their smart speakers

While the smart speakers in our homes make our lives much easier when we want to look something up without reaching for our phone or computer, they can also become a serious security hole if we neglect them, and a recent study makes this very clear.

Your smart speaker has probably, at some point, heard a word or phrase it should not have, and that can become a serious privacy problem, and a breach of your security, if that audio has also been heard by a human engineer on the other side of the server.

As you probably know, there is little mystery to how smart speakers work: the device first waits for an activation word or phrase, and only then do we ask our question. That question travels to the servers, where the device's automatic algorithm processes it and sends back an answer. The problem is that, in order to refine this algorithm, these questions often also reach the ears of human engineers.
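The flow described above can be sketched in a few lines of Python. This is purely illustrative, with made-up names (no vendor's real API works like this): the device listens locally until it spots a wake phrase, and only what comes after that phrase is forwarded to the server.

```python
# Hypothetical sketch of the always-on wake-word loop, working on a text
# transcript instead of real audio. Nothing leaves the "device" until a
# wake phrase is detected; only the words after it are sent on.

WAKE_PHRASES = {"hey google", "alexa", "hey siri", "cortana"}

def handle_utterance(utterance: str) -> str:
    """Simulate one pass of the local listening loop."""
    words = utterance.lower().split()
    for i in range(len(words)):
        # Check one- and two-word sequences against the wake set.
        for n in (1, 2):
            candidate = " ".join(words[i:i + n])
            if candidate in WAKE_PHRASES:
                query = " ".join(words[i + n:])
                # Only this part would travel to the servers.
                return f"SENT TO SERVER: {query}"
    # No wake phrase: everything stays on the device.
    return "IGNORED (no wake phrase heard)"

print(handle_utterance("Hey Google what time is it"))
print(handle_utterance("I was just reading about the weather"))
```

The point of the design is that audio is discarded locally unless the wake phrase fires, which is exactly why a false positive matters: it flips the device into the "send to server" branch without the user intending it.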

Now a study by Ruhr University Bochum and the Max Planck Institute for Security and Privacy has found more than 1,000 words and phrases that Alexa, Siri and Google Assistant frequently misidentify as "activation commands", something known as false positives.

An activation command, or wake word, is the phrase that makes our smart speaker or virtual assistant "wake up and actively listen" so it can answer our questions; you surely know phrases like "Hey Google", "Hello Cortana" or "Hey Siri", among others.

What this study points out is that there are words similar enough to these activation commands to confuse the device, making it activate without us noticing and then potentially record what we are saying.


According to the mentioned study, these are some false positives:

  • Alexa: “unacceptable,” “election” and “a letter”
  • Google Home: “OK, cool,” and “Okay, who is reading”
  • Siri: “a city” and “hey jerry”
  • Microsoft Cortana: “Montana”

For example, imagine that instead of "Cortana" the device hears "Montana" and switches into listening mode, and your next phrase happens to be your bank card's PIN as you close a reservation in Montana. That information then reaches the device's servers, with a small but real chance of being heard by an engineer.
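Why does "Montana" trip "Cortana"? Wake-word detectors work on a similarity threshold: close enough counts as a match. Real assistants compare acoustic features, not spelling, but the same effect can be illustrated with a simple text-similarity stand-in; the threshold value here is invented for the demo.

```python
# Illustrative only: difflib's ratio stands in for the acoustic similarity
# score a real wake-word detector would compute from audio features.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity score between two strings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

THRESHOLD = 0.7  # hypothetical trigger threshold

for heard in ("Cortana", "Montana", "banana"):
    score = similarity(heard, "Cortana")
    verdict = "TRIGGERS" if score >= THRESHOLD else "ignored"
    print(f"{heard!r} vs 'Cortana': {score:.2f} -> {verdict}")
```

"Montana" scores above the threshold while "banana" does not, which mirrors the study's finding: lowering the threshold reduces missed wake-ups but multiplies false positives, and vice versa.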


That is why the study offers a series of recommendations to help us prevent these activation-command failures:

  • Change your smart speaker's activation word or phrase whenever possible, choosing one that is less prone to errors.
  • On Google Home, there is an option to reduce the activation sensitivity.
  • Certain devices also let you mute the microphone.
  • Don't leave your smart speakers switched on if you're not really using them regularly.
  • Delete all recordings and update the security options in each of these devices' accounts so that audio is never saved, listened to or shared with engineers.

In this way, using the tools these speakers put at our disposal along with some common sense, we can avoid falling into one of the most common mistakes still being made with smart speakers.

[Source: Ars Technica]
