Tuesday, 30 September 2014

VOICE HACKERS WILL SOON HAVE A FREE WAY INTO YOUR TECHNOLOGY

Voice-activated technology is so vulnerable to attack that users should immediately disable speech recognition on all their devices, a security researcher at AVG has warned.
Yuval Ben-Itzhak, chief technology officer at the anti-virus company, has carried out several experiments which revealed the new techniques hackers might use to gain control of voice-controlled devices.
He made the ominous prediction that a “thief outside the door” could take control of gadgets such as smart televisions or laptops from outside a target’s home, potentially burgling them without even smashing a window.
The vulnerability of technology which uses voice commands is likely to become an important issue in the coming years, as smartwatches and connected home devices grow in popularity and the technology becomes commonplace.
His warning presages a future where voice hackers use recorded or synthesized speech commands to bypass security mechanisms. But this scary reality is not as far away as it might seem, as security researchers have already managed to trick Siri into letting them bypass the lock screen on an iPhone and post Facebook messages, access call history, send text messages and fire off emails.
“Microphones should be disabled immediately and our current recommendation is that the user switch off features [involving voice commands],” he said in a phone interview with Forbes. “At the moment, leaving biometric technology as it is today is like leaving a computer without a password and just allowing anyone to walk by, click and take an action.
“We realized there is something very basic here that everyone seems to have forgotten: authentication. If you have a smart TV at home, for instance, it will respond to a synthesized voice as well as yours.”
His team has conducted several experiments which give a clue as to how hackers might hijack voice controlled devices. In the first, the researchers designed a game for Android which was able to recite a voice command which ordered the personal assistant Google GOOGL -0.1% Now to send an email from the same device.
“If you played Google Now a recording of a voice, you could easily make it send a message to all your contacts and say: I’m broke, I need money,” he claimed.
In a further experiment involving another biometric form of identification, he designed an application which responded to the accelerometer in a phone and dialed up a premium rate number when the phone was in motion. When it came to rest, the call would immediately cut off. This could potentially raise significant sums of money for a thief before the hack was even noticed.
These sound like relatively innocuous crimes, but are likely to just be the first of many targets for the voice hackers.
“There is already voice recognition software in a car,”  Ben-Itzhak added. “Just imagine what could happen if someone targeted that.”
AVG does not have specific plans to release software to guard against voice hacking. Indeed, Ben-Itzhak claimed that few vendors were tackling the problem, even though it represented something of an open goal to hackers.
“We haven’t seen this in the wild yet,” he said. “But it is something that is pretty easy to take advantage off.”
One device which is sure to rely on voice commands is the Apple AAPL +1.03% Watch, which is expected to use a modified version of Siri. In the coming years, we are expected to see voice recognition embedded in everything from light bulbs to refrigerators.
Just keep an ear out for the thief at the door.

0 comments:

Post a Comment