Alexa, Google Assistant, Siri Vulnerable To Inaudible Commands

Smart assistants, especially those in smart speakers, have to always be listening. There is no other way to have them ready to answer your questions or obey your commands at a moment's notice, even when a trigger phrase is required. Unfortunately, this also means they are constantly receiving and analyzing any and all audio in order to pick out commands. The problem, according to researchers, is that these smart assistants might follow even commands that you can't hear.


The paper authored by U.C. Berkeley's Nicholas Carlini and his colleagues is just the latest in a string of research papers and demos proving the efficacy of these kinds of "subliminal" attacks on smart speakers and other devices with always-listening smart assistants. The theory is simple enough to understand. Almost all of these devices are designed to receive and analyze audio, including spoken commands, across a wide range of frequencies, some of them inaudible to the human ear. In other words, someone could "play" a practically soundless command that gets the smart assistant to do things the user never asked it to.

This theory was put into practice last year when Chinese researchers built a device from off-the-shelf parts to send such inaudible commands to virtual assistants. Their prototype was crude and required close proximity to the target phone or speaker. Since then, however, more and more researchers have improved on that work, even producing one that worked from 25 feet away. Carlini's group came up with an even more ingenious method: embedding the commands inside normal audio, like recorded speech or even music.
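To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of ultrasonic trick those early devices relied on: amplitude-modulating a recorded voice command onto a carrier above the range of human hearing, so a nearby microphone's imperfect response can demodulate it back into the audible band. The file names, sample rate, carrier frequency, and modulation depth are assumptions for illustration, not details taken from any of the papers.

```python
# Conceptual sketch only: shift a recorded voice command onto an
# ultrasonic carrier via simple amplitude modulation.
import numpy as np
from scipy.io import wavfile

SAMPLE_RATE = 96_000   # high enough to represent ultrasound (assumed)
CARRIER_HZ = 25_000    # above the ~20 kHz limit of human hearing (assumed)

def modulate_command(command_wav: str, output_wav: str) -> None:
    """Amplitude-modulate a voice command onto an ultrasonic carrier."""
    rate, voice = wavfile.read(command_wav)
    if voice.ndim > 1:                       # mix stereo down to mono
        voice = voice.mean(axis=1)
    voice = voice.astype(np.float64)
    voice /= np.max(np.abs(voice))           # normalize to [-1, 1]

    # Crude nearest-neighbor resample up to the ultrasonic-capable rate;
    # a real pipeline would use a proper resampler.
    duration = len(voice) / rate
    t = np.arange(int(duration * SAMPLE_RATE)) / SAMPLE_RATE
    idx = np.minimum((t * rate).astype(int), len(voice) - 1)
    voice_up = voice[idx]

    # Classic AM: carrier * (1 + m * message). Played through a tweeter,
    # this is silent to a person, but a microphone's nonlinearity can
    # recover the original command.
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    modulated = carrier * (1.0 + 0.8 * voice_up)
    modulated /= np.max(np.abs(modulated))

    wavfile.write(output_wav, SAMPLE_RATE, (modulated * 32767).astype(np.int16))

if __name__ == "__main__":
    # Hypothetical file names used only for the example.
    modulate_command("ok_google_command.wav", "inaudible_command.wav")
```

Carlini's approach is different and subtler, hiding adversarial perturbations inside ordinary speech or music rather than moving the command out of the audible band, but the sketch above captures why "inaudible to you" does not mean "inaudible to the microphone."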


The three smart assistant makers naturally insisted that their respective AIs can't be so easily exploited. Both Amazon and Google say they verify that it is the user's voice giving a command before acting on it. Apple, on the other hand, limits the actions Siri can perform by voice and requires iPhones and iPads to be unlocked first for some of them.

Nonetheless, the growing number of papers on this topic should be cause for concern. Carlini hopes their paper will serve as a wake-up call for both users and software makers, and fears that criminals may already have a head start. It is also an area where the law has to catch up: there is very little legislation against sending subliminal messages to humans, and none at all against sending inaudible commands to other people's machines.

SOURCE: New York Times

