Published: Sat, May 12, 2018
Research | By Raquel Erickson

Someone is tricking voice assistants with white noise and secret commands

The problem, according to researchers, is that these smart assistants may obey commands that you cannot hear at all.

Researchers in the US and China have discovered ways to send hidden commands to digital assistants, including Apple's Siri, Amazon's Alexa, and Google's Assistant, that could have serious security implications. The assistants, they found, can be manipulated using white noise and commands that the human ear doesn't register.

The researchers say criminals could exploit the technique to unlock doors, wire money or buy things online, simply with music playing over the radio. It's probably only a matter of time before these types of attacks trickle out into the wild: if students at universities are working on this sort of thing, it's likely that bad actors are doing the same.

"My assumption is that the malicious people already employ people to do what I do," said Nicholas Carlini, a fifth-year Ph.D. student in computer security at U.C. Berkeley and one of the paper's authors. Major smart speaker manufacturers like Amazon, Google and Apple say that they have safeguards in place to prevent their assistants from being hijacked. In the case of the Apple HomePod, the device has been designed to "prevent commands from doing things like unlocking doors". Amazon and Google use technology to block commands that cannot be heard.

There is no American law against broadcasting subliminal messages to humans, let alone machines. For its part, the Federal Communications Commission (FCC) has discouraged the practice, calling it "counter to the public interest". While DolphinAttacks require the transmitter to be placed in close proximity to the smart device receiving the hidden message, last month researchers were able to send these subliminal messages via ultrasound from 25 feet away.

As Sheng Shen of the University of Illinois at Urbana-Champaign points out, the commands don't even need to be audible; they can be ultrasonic. In their demonstration, the Urbana-Champaign team showed that though the commands couldn't yet penetrate walls, they still had the potential to control smart devices through open windows in buildings.
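To see roughly how an ultrasonic attack works, consider a minimal sketch of the underlying idea: a voice command is amplitude-modulated onto a carrier above human hearing, and the microphone hardware's own nonlinearity later demodulates the envelope back into the audible band. The code below is an illustrative simplification using NumPy, not the researchers' actual method; the sample rate, carrier frequency, and the stand-in tone used as a "voice" signal are all assumptions for the demo.

```python
import numpy as np

FS = 192_000         # assumed sample rate, high enough to represent the carrier
CARRIER_HZ = 25_000  # ultrasonic carrier, above typical human hearing (~20 kHz)

def modulate_ultrasonic(voice: np.ndarray, fs: int = FS,
                        carrier_hz: int = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic carrier.

    A human listener hears nothing, because all the energy sits above the
    audible band; a microphone's nonlinear response can recover the envelope.
    Simplified sketch of the DolphinAttack concept, not real attack code.
    """
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Standard AM: normalize and offset the signal so the envelope stays
    # non-negative, then ride it on the carrier.
    envelope = 0.5 * (1.0 + voice / (np.max(np.abs(voice)) + 1e-12))
    return envelope * carrier

# Stand-in "voice": a 400 Hz tone (a real attack would use recorded speech).
t = np.arange(FS) / FS
voice = np.sin(2 * np.pi * 400 * t)
signal = modulate_ultrasonic(voice)
```

In the resulting spectrum, the dominant component sits at the 25 kHz carrier with sidebands at ±400 Hz, which is why the transmission is inaudible even though it carries an ordinary audio envelope.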

Many people have grown accustomed to talking to their smart devices, asking them to read a text, play a song or set an alarm.

You can hear the audio files on Carlini's website.

The team was also able to embed the same command within a four-second segment of Verdi's Requiem.

But Carlini explained that their objective is to flag the security problem, and then try to fix it.
