Can Voice Assistants Be Hacked?

By Richard Dahl on November 11, 2019

"Hey, Siri. Order my usual pizza."

So convenient, yes? Instead of having to tap in or phone in your order, you let Siri do the work. Millions of Americans use Siri, Alexa, and other voice assistants not only to order food but also to buy household items and book travel.

But are they really secure? Could a hacker find a way to gain control of your voice assistant and make purchases on your credit cards?

Mounting evidence suggests that they probably can. The security safeguards built into voice assistants and the smart speakers that host them are apparently more vulnerable than we are led to believe.

Hijacking Voice Assistants: A Few Examples

CNET has shown how easily the Amazon Echo's voice-recognition function can be fooled into granting access to an unauthorized user, giving them free rein to ring up credit card purchases. Multiple news stories have reported, for instance, on Alexa or Siri letting children order hundreds of dollars' worth of toys.

More troubling, though, are recent findings from researchers who have discovered that hackers don't even need to be in the room to take control of your voice assistant.

Researchers in the U.S. and China have shown that they can send commands to voice assistants at ultrasonic frequencies, inaudible to human ears, directing the devices to dial phone numbers or open websites. Known as "dolphin attacks," the technique "could be used to unlock doors, wire money, or buy stuff online," the New York Times reported last year.

Then, on Nov. 4 of this year, researchers in Japan and at the University of Michigan revealed that they had discovered a way to gain access to Apple's Siri, Amazon's Alexa, and Google Home by shining laser pointers at them from hundreds of feet away. "We show how an attacker can use light-injected voice commands to unlock the target's smart-lock protected front door, open garage doors, shop on e-commerce sites at the target's expense, or even locate, unlock and start various vehicles if the vehicles are connected to the target's Google account," the authors summarized.

In the case of the laser attack, the hackers would have to be physically within a few hundred feet and have a direct line of sight to the device, so the way to protect your voice assistant is to make sure it isn't visible from outside your dwelling. (Laser beams can pass through glass windows.)

Initially, the dolphin attacks mentioned above were effective only in close proximity to the voice assistants in a lab. But last year, researchers at the University of Illinois conducted successful attacks from 25 feet away. That range might not sound like a big risk today, but it is worrisome to consider what could happen if hackers develop more sophisticated versions of these ultrasonic attacks.

Users Should Take Precautionary Steps

"Right now the dangers of voice-command hijacking seem mostly theoretical and isolated," Rafael Lourenco, executive vice president of retail fraud prevention company ClearSale wrote recently on the website VentureBeat.com. "But the recent past has shown us that fraudsters adapt quickly to new technology."

Manufacturers say they are working to make these devices more secure, but in the meantime, owners should take protective steps as well.

Lourenco recommends the following:

  • Use strong passwords on all devices.
  • Don't leave your phone unlocked when you're not using it.
  • PIN-protect voice-assistant tasks that involve finances, home security, personal data, or health records — or just don't link that information to the voice-command devices.

And don't forget to pull the shades. Who knows? There could be a laser-wielding hacker living right across the street.
