Siri and Alexa are vulnerable to ultrasonic voice commands
Researchers in China have uncovered a relatively simple technique, dubbed DolphinAttack, that can hack every voice-activated personal digital assistant they tested, including Apple’s Siri, Amazon’s Alexa, Microsoft’s Cortana and Samsung’s Bixby. The technique worked on iPhones, Apple Watches, MacBooks, Galaxy phones, PCs running Windows 10 and even car-based systems. Each could be defeated with standard voice commands converted into ultrasonic frequencies that are inaudible to humans but still detectable by the microphones in the affected devices.
To carry out an attack, the researchers used a smartphone with roughly US$3 worth of additional hardware, including a tiny speaker and an amplifier. Some attacks had to be launched from within a few inches of the target, but other devices, including the Apple Watch, could be hacked from several feet away. In either case, the victim would be completely unaware of the attack. While humans can’t hear frequencies above 20,000 Hz, some microphones can detect sounds at up to 42,000 Hz. The researchers were able to initiate basic commands like “Hey Siri” or “Ok Google,” along with more complex requests like dialling a phone number or opening a malicious website.
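The core trick is to shift an ordinary voice command out of the audible band by amplitude-modulating it onto an ultrasonic carrier; nonlinearities in a device’s microphone circuitry then demodulate it back into the audible range, where the assistant hears a normal command. A minimal sketch of that modulation step is below; the 30 kHz carrier, 192 kHz sample rate and modulation depth are illustrative assumptions, and a pure tone stands in for recorded speech:

```python
import numpy as np

def ultrasonic_modulate(command, fs=192_000, carrier_hz=30_000, depth=0.8):
    """Amplitude-modulate a baseband 'voice' signal onto an
    ultrasonic carrier (illustrative parameters, not the paper's)."""
    t = np.arange(len(command)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Standard AM: the audible spectrum is shifted up to sidebands
    # around the inaudible carrier. A nonlinear microphone front end
    # can later demodulate it back into the audible band.
    return (1 + depth * command) * carrier

# A 1 kHz tone standing in for a spoken command.
fs = 192_000
t = np.arange(int(0.1 * fs)) / fs
voice = 0.5 * np.sin(2 * np.pi * 1_000 * t)
signal = ultrasonic_modulate(voice, fs=fs)
```

After modulation, all of the signal’s energy sits near 30 kHz, well above what a human can hear but within range of many device microphones.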
While the obvious fix is for companies to reject voice commands above certain frequencies, it is not quite as easy as it sounds. Many voice-activated assistants need to detect a wide range of frequencies in order to properly decipher a user’s commands. Further, some companies deliberately use near-ultrasonic frequencies: Google’s Chromecast, for example, pairs with an Android phone using tones at around 18,000 Hz.
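In principle, that fix amounts to low-pass filtering the microphone input before it reaches the recognizer. A minimal sketch of the idea, using a brick-wall FFT filter with an assumed 20 kHz cutoff (real mitigations would be more careful, and, as noted above, could break features like Chromecast pairing):

```python
import numpy as np

def suppress_ultrasound(samples, fs, cutoff_hz=20_000):
    """Zero out spectral content above the audible range.
    Cutoff and brick-wall approach are illustrative assumptions."""
    spectrum = np.fft.rfft(samples)
    freqs = np.fft.rfftfreq(len(samples), 1 / fs)
    spectrum[freqs > cutoff_hz] = 0.0
    return np.fft.irfft(spectrum, n=len(samples))

fs = 192_000
t = np.arange(fs // 10) / fs
audible = np.sin(2 * np.pi * 1_000 * t)      # legitimate speech band
ultrasonic = np.sin(2 * np.pi * 30_000 * t)  # attack carrier
cleaned = suppress_ultrasound(audible + ultrasonic, fs)
```

After filtering, the 30 kHz component is gone and only the audible signal remains, which is why a hardware or firmware cutoff would blunt this class of attack.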
Until a solution is developed, the researchers suggest that the best defence against a DolphinAttack is to turn off the always-on listening setting in any personal assistant you use.