Crimeware

Secret Voice Commands Can Hijack Devices

Perhaps you have seen the stories about a little girl using her family’s Amazon Echo to order a dollhouse. Or maybe you heard that news reports about the incident caused the Echos of television viewers to try to order the same dollhouse. And then there was the Google Home Super Bowl commercial: as people in the ad gave commands to turn on the lights or turn off the music, the Google Home devices in some viewers’ homes tried to execute those commands. Although these occurrences are mildly annoying (or even amusing to those of us who are not affected), voice commands open up more dangerous possibilities. Researchers discovered that voice commands unintelligible to human ears could still be recognized and executed by devices running smart assistants such as Siri or Google Now. As noted by PCWorld, “It might not work every time, but it’s a numbers game.” The article addresses the damage that malicious voice commands can do and the ways users can secure their devices against unauthorized voice commands, whether those commands come from a hacker or a 6-year-old girl who wants a dollhouse.

Link: http://www.voanews.com/a/mht-google-super-bowl-ad-activates-google-home-devices/3708026.html

Link: https://www.wired.com/2015/10/this-radio-trick-silently-hacks-siri-from-16-feet-away/

Link: http://www.pcworld.com/article/3092493/security/heres-how-secret-voice-commands-could-hijack-your-smarthphone.html (demo video)