Hackers can take control of Siri, Cortana and other digital assistants with ultrasonic commands
Security researchers have discovered that digital assistants, including Alexa, Siri and Cortana, are vulnerable to hacking via inaudible voice commands. Known as the DolphinAttack, the exploit involves the use of ultrasonic commands that cannot be heard by humans.
Researchers from China's Zhejiang University have detailed the attack technique in a paper, but it comes with so many limitations and caveats that the vulnerability is not something most people need to worry about.
The DolphinAttack -- so-called because dolphins use ultrasonic frequencies to communicate -- could be used to control a large number of smartphones, computers, cars and other smart devices, and there is certainly potential for dangerous malicious activity. The researchers' report is fairly detailed, and it demonstrates how the team was able to use frequencies above 20 kHz to deliver inaudible commands that target devices then acted upon.
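In broad strokes, the attack works by amplitude-modulating an ordinary voice command onto an ultrasonic carrier: the transmitted signal sits entirely above the range of human hearing, but nonlinearities in a device's microphone recover the audible envelope. The sketch below illustrates that modulation step in principle; the sample rate, carrier frequency, and the 400 Hz tone standing in for a spoken command are illustrative assumptions, not the researchers' actual parameters.

```python
import math

# Assumed parameters for illustration only -- not the paper's exact setup.
FS = 192_000         # sample rate high enough to represent ultrasound
CARRIER_HZ = 30_000  # ultrasonic carrier, above human hearing (~20 kHz)

def am_modulate(baseband, fs=FS, carrier_hz=CARRIER_HZ, depth=1.0):
    """Amplitude-modulate a baseband (voice) signal onto an ultrasonic carrier.

    Every transmitted sample is inaudible, but a microphone's nonlinear
    response can demodulate the envelope, recovering the original command.
    """
    return [
        (1.0 + depth * s) * math.cos(2 * math.pi * carrier_hz * n / fs)
        for n, s in enumerate(baseband)
    ]

# Toy "voice" signal: a 400 Hz tone standing in for a spoken command (10 ms).
t = [n / FS for n in range(FS // 100)]
voice = [0.5 * math.sin(2 * math.pi * 400 * x) for x in t]
tx = am_modulate(voice)
```

In a real attack the baseband would be synthesized speech ("Hey Siri, call ..."), and the modulated signal would be played through an ultrasonic transducer, which is where the cheap hardware mentioned below comes in.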
The attacks were carried out using $3 worth of audio hardware, and you can see how it works in the video below:
In practice, the attack is limited by a number of factors, including background noise and proximity. An attacker would not only need to be very close to a target device, but would also need a very quiet environment to successfully exploit the vulnerability.
Despite the unlikelihood of a successful attack, the researchers advise manufacturers to place an upper limit on the frequencies to which digital assistants react. But with both Google Chromecast and Amazon Dash buttons (and probably many other devices) relying on ultrasonic frequencies for the purposes of pairing, it's unlikely that such a limit will be introduced.
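The suggested mitigation amounts to low-pass filtering: attenuate ultrasonic energy before the audio reaches the speech recognizer. As a minimal sketch of that idea, a single-pole filter with an assumed 20 kHz cutoff passes a voice-band tone almost untouched while noticeably damping a 30 kHz one (a production filter would use a much steeper roll-off):

```python
import math

FS = 192_000        # assumed sample rate, chosen to represent ultrasound
CUTOFF_HZ = 20_000  # suppress content above the human-audible range

def lowpass(samples, fs=FS, cutoff_hz=CUTOFF_HZ):
    """Single-pole RC-style low-pass filter -- a minimal sketch of the
    proposed defense, not a production-grade anti-ultrasound filter."""
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    dt = 1.0 / fs
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)  # exponential smoothing toward the input
        out.append(y)
    return out

def tone(freq_hz, n):
    return [math.sin(2 * math.pi * freq_hz * i / FS) for i in range(n)]

hi = lowpass(tone(30_000, FS // 10))  # ultrasonic "attack" tone
lo = lowpass(tone(1_000, FS // 10))   # voice-band tone
```

Comparing the steady-state amplitudes of `hi` and `lo` shows the ultrasonic tone emerging noticeably weaker than the voice-band tone, which is the whole point of the proposed frequency limit.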
You can read through the full paper -- entitled DolphinAttack: Inaudible Voice Commands -- for full details.