Dolphin Attack: How hackers can take control of your smart device using Siri, Google Assistant and Alexa

Security researchers have discovered how cyber criminals can hijack smart assistants, raising security concerns about the increasingly popular voice-activated devices.

The researchers said that Amazon’s Alexa, Apple’s Siri and Google’s Assistant, as well as similar products from Huawei and Samsung, are affected by the so-called Dolphin Attack.

The assistants can be hijacked by hackers using sounds that are inaudible to the human ear.

The sounds enable the hackers to take control of the devices and instruct them to download malicious software or even open the front door of a connected ‘smart’ house.

Researchers from Zhejiang University also said that they were able to use the hack to take control of iPhones and MacBooks running Siri, as well as Windows 10 PCs running Microsoft’s assistant Cortana.

In a video uploaded to YouTube, the researchers demonstrated how ultrasonic frequencies, inaudible to humans, can be used to control the devices.

Using the method, the researchers were able to instruct a Nexus 7 and a MacBook to open a malicious website and tell an iPhone to call the number “1234567890”.

The researchers said that the hack is made possible because the attackers modulate ordinary voice commands onto an ultrasonic carrier: the microphone hardware picks up and demodulates the ultrasonic signal, while the speech recognition software is unable to distinguish the recovered commands from genuine human speech.
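The principle can be illustrated with a short sketch. Assuming the technique described in the paper, a recorded voice command is amplitude-modulated onto an ultrasonic carrier (for example around 25 kHz), so all of the transmitted energy sits above the range of human hearing. The Python snippet below is a minimal, hypothetical illustration of that modulation step, not the researchers’ actual tooling:

```python
import numpy as np

SAMPLE_RATE = 192_000  # high enough to represent an ultrasonic carrier

def modulate_command(voice, carrier_hz=25_000):
    """Amplitude-modulate a baseband voice command onto an ultrasonic
    carrier. The carrier and its sidebands all sit above ~20 kHz, so
    the signal is inaudible to humans, yet a microphone's non-linear
    response demodulates the envelope back into the audible band."""
    t = np.arange(len(voice)) / SAMPLE_RATE
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Classic AM: unit carrier plus voice-scaled sidebands; `voice`
    # is assumed to be normalised to the range [-1, 1].
    return (1.0 + voice) * carrier

# Example: a 1 kHz tone standing in for a recorded voice command.
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
command = 0.5 * np.sin(2 * np.pi * 1_000 * t)
inaudible = modulate_command(command)
```

Played through a speaker capable of ultrasonic output, such a signal carries no audible sound, but the device’s microphone recovers the original command.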

In total, the researchers identified 16 devices that can be hacked using the Dolphin Attack.

The devices include iPhone 4 to iPhone 7 Plus, iPad mini 4, MacBook, Apple Watch, Nexus 7, Samsung Galaxy S6, Huawei Honor 6, Amazon Echo and the Audi Q3.

The researchers said the attack does have a limited range: a hacker would need to be within 1.7 metres of the device in order to exploit the vulnerability.

“We validate DolphinAttack on popular speech recognition systems, including Siri, Google Now, Samsung S Voice, Huawei HiVoice, Cortana and Alexa,” said Guoming Zhang, who led the research team from Zhejiang University in China.

“By injecting a sequence of inaudible voice commands, we show a few proof-of-concept attacks, which include activating Siri to initiate a FaceTime call on iPhone, activating Google Now to switch the phone to the airplane mode, and even manipulating the navigation system in an Audi automobile.

“We propose hardware and software defense solutions.

“We validate that it is feasible to detect DolphinAttack by classifying the audios using a support vector machine (SVM), and suggest to re-design voice controllable systems to be resilient to inaudible voice command attacks.”
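The quoted defence amounts to a binary classifier over audio features. Below is a minimal sketch of that idea using scikit-learn’s SVC with a hypothetical one-dimensional feature (the share of spectral energy above the normal voice band, since demodulated ultrasonic commands tend to leave high-frequency residue) and synthetic stand-in data rather than the researchers’ recordings:

```python
import numpy as np
from sklearn.svm import SVC

SAMPLE_RATE = 44_100

def high_band_energy_share(audio, cutoff_hz=8_000):
    # Hypothetical feature: fraction of spectral energy above cutoff_hz.
    power = np.abs(np.fft.rfft(audio)) ** 2
    freqs = np.fft.rfftfreq(len(audio), d=1 / SAMPLE_RATE)
    return [power[freqs >= cutoff_hz].sum() / (power.sum() + 1e-9)]

# Stand-in training data: "genuine" speech as low-frequency tones,
# "attack" audio as the same tones plus broadband high-frequency residue.
rng = np.random.default_rng(0)
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
genuine = [np.sin(2 * np.pi * f * t) for f in (200, 300, 400)]
attack = [g + 0.3 * rng.standard_normal(len(t)) for g in genuine]

X = [high_band_energy_share(a) for a in genuine + attack]
y = [0] * len(genuine) + [1] * len(attack)  # 1 = inaudible-command attack

clf = SVC(kernel="linear").fit(X, y)
print(clf.predict([high_band_energy_share(attack[0])]))  # -> [1]
```

In practice the classifier would be trained on labelled recordings from the target microphone, but the structure, extract a spectral feature and separate the two classes with an SVM, matches the defence the researchers describe.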
