Hackers can take control of your phone with ‘DolphinAttack’
WHILE voice-activated assistants like Siri and Alexa were designed to make our mobile lives easier, hackers can turn them into a complete nightmare by using ultrasonic sounds to take command of your smartphone.
Cybersecurity researchers from China’s Zhejiang University recently discovered a phone-hijacking method that allows hackers to make calls, send text messages and browse malicious websites on your phone via voice-controlled assistants like Apple’s Siri and Amazon’s Alexa. Essentially, cyber criminals would be able to take control of your device.
The hacking method has been called DolphinAttack: two teams of researchers found that phones could be hijacked through their AI assistants using commands broadcast at frequencies above 20kHz, sounds audible to animals like dolphins but inaudible to humans, the BBC reported.
Voice-controlled assistants, now available on most smartphones, work by responding to a “wake word” that activates them to take orders from the user.
Using a loudspeaker to broadcast voice commands at ultrasonic frequencies, the researchers said they were able to activate the voice-controlled assistant on a range of Android and Apple devices from several feet away, close enough to be practical in a real-life scenario. The attack could even unlock doors if you use a smart lock for your home.
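At its core, the published DolphinAttack technique amplitude-modulates an ordinary voice command onto an ultrasonic carrier; the microphone hardware’s non-linearity then demodulates the envelope back into the audible band, so the assistant “hears” a command no human did. The sketch below is purely illustrative, not the researchers’ code: the 192kHz sample rate, the 25kHz carrier, and the stand-in 1kHz tone used in place of a real “Hey Siri” recording are all assumptions.

```python
import numpy as np

FS = 192_000          # assumed sample rate, high enough to represent ultrasound
CARRIER_HZ = 25_000   # illustrative inaudible carrier above 20 kHz

def modulate_ultrasonic(command: np.ndarray, fs: int = FS,
                        carrier_hz: float = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband voice command onto an ultrasonic carrier.

    A microphone's non-linearity demodulates the envelope back into the
    audible band, which is how the assistant 'hears' the hidden command.
    """
    t = np.arange(len(command)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Classic AM: carrier plus command-shaped envelope
    return (1.0 + command) * carrier

# Stand-in for a recorded voice command: a 1 kHz tone, 10 ms long
t = np.arange(int(0.01 * FS)) / FS
command = 0.5 * np.sin(2 * np.pi * 1_000 * t)
signal = modulate_ultrasonic(command)

# All spectral energy now sits near the carrier, outside human hearing
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / FS)
peak_hz = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {peak_hz:.0f} Hz")
```

Running this shows the dominant spectral component sitting at the carrier frequency, well above the roughly 20kHz ceiling of human hearing, even though the modulating command itself is an ordinary audible signal.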
And since the attack works on almost all major voice recognition platforms, popular smartphones like the iPhone, Nexus and Samsung handsets were all vulnerable.
Cause for alarm
By hijacking the phone, attackers can use it as a spying tool, making outgoing video and phone calls to see the surroundings of the device. The hack also enables attackers to send out fake text messages, put up online posts, and even add fake events to a calendar.
By switching the phone to airplane mode, hackers can also deny the user network service, cutting them off from all wireless communications.
A total of 7 voice systems on 16 devices were tested against the method, and it worked on all of them, including Siri, Google Assistant, Samsung S Voice, Huawei HiVoice, Cortana and Alexa. Apart from smartphones, DolphinAttack worked on iPads, MacBooks, and even an Audi Q3 vehicle.
Even more alarming is the fact that the attack works even if the hacker does not have direct access to the targeted device, and even when the user has taken all the usual security precautions, The Hacker News reported.
Device manufacturers were advised to make key alterations to counter the risk, such as making their devices ignore voice commands above 20kHz.
“A microphone shall be enhanced and designed to suppress any acoustic signals whose frequencies are in the ultrasound range. For instance, the microphone of iPhone 6 Plus can resist to inaudible voice commands well,” the researchers were quoted as saying.
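The researchers’ recommended fix is a hardware change to the microphone itself, but the idea of suppressing everything above 20kHz can be sketched in software. The rough analogue below is an assumption of mine, not the paper’s implementation: a windowed-sinc FIR low-pass filter applied to a captured signal, assuming an ADC fast enough (192kHz here) to actually see the ultrasonic content.

```python
import numpy as np

FS = 192_000        # assumed capture rate, fast enough to see ultrasound
CUTOFF_HZ = 20_000  # suppress everything above the human-hearing ceiling
NUM_TAPS = 257

def lowpass_fir(cutoff_hz: float = CUTOFF_HZ, fs: int = FS,
                num_taps: int = NUM_TAPS) -> np.ndarray:
    """Windowed-sinc FIR low-pass filter coefficients (Hamming window)."""
    n = np.arange(num_taps) - (num_taps - 1) / 2
    h = np.sinc(2 * cutoff_hz / fs * n)   # ideal low-pass impulse response
    h *= np.hamming(num_taps)             # taper to reduce stopband ripple
    return h / h.sum()                    # normalize for unity gain at DC

taps = lowpass_fir()

# A 1 kHz speech-band tone plus a 25 kHz ultrasonic interloper
t = np.arange(int(0.02 * FS)) / FS
audible = np.sin(2 * np.pi * 1_000 * t)
ultrasonic = np.sin(2 * np.pi * 25_000 * t)
filtered = np.convolve(audible + ultrasonic, taps, mode="same")

# The audible tone survives; the ultrasonic component is strongly attenuated
spectrum = np.abs(np.fft.rfft(filtered))
freqs = np.fft.rfftfreq(len(filtered), 1 / FS)
lvl_1k = spectrum[np.argmin(np.abs(freqs - 1_000))]
lvl_25k = spectrum[np.argmin(np.abs(freqs - 25_000))]
print(f"1 kHz level:  {lvl_1k:.1f}")
print(f"25 kHz level: {lvl_25k:.1f}")
```

After filtering, the 1kHz component passes through essentially untouched while the 25kHz component is attenuated by orders of magnitude, which is the behaviour the researchers want baked into the microphone hardware.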
The researchers added users could also prevent such attacks by disabling their voice assistant apps.
Next month, the research team is expected to present its findings at the ACM Conference on Computer and Communications Security in Dallas, Texas.