Using a technique code-named ‘Light Commands’, malicious commands can be injected into voice-controlled devices such as speakers, tablets, and phones. Once a voice assistant is breached, an attacker can exploit the same flaw to attack other connected systems.
If you use voice-assistant products from top technology companies such as Apple, Amazon, and Google, this news concerns you. Research teams from the University of Electro-Communications in Tokyo and the University of Michigan demonstrated that “attackers can remotely inject inaudible, invisible commands into voice assistants with the help of laser light.” The affected voice assistants are Google Assistant, Amazon Alexa, Facebook Portal, and Apple Siri.
Through this ‘Light Commands’ technique, malicious commands can be injected into voice-controlled devices such as speakers, tablets, and phones. The research team released a demonstration video earlier this week showing that commands could be delivered to targets over large distances, even into locked rooms through glass windows.
The research team claims that by exploiting this flaw in these popular devices, an attacker can send commands from a distance that are neither heard nor seen, and the devices execute them as their own. Once a voice assistant is breached, the attacker can then use the same flaw to attack other systems.
According to the research team, an attacker can also exploit this weakness to gain unauthorized control over “online purchases, smart home switches, smart garage doors, certain vehicles, and smart locks”.
How does it work?
The research team includes Takeshi Sugawara, Benjamin Cyr, Sara Rampazzi, Daniel Genkin, and Kevin Fu. The team detailed the weakness in a research paper, explaining that the microphones in these devices respond not only to sound but also directly to light. Smart voice assistants rely on the user’s voice to interact with authorized users. In the ‘Light Commands’ setup, a modulated laser is aimed at the microphone, and the voice assistant is effectively hijacked. Inaudible commands are then sent to Alexa, Siri, Portal, and Google devices.
Based on this principle, the research team succeeded in tricking the microphones into producing an electrical signal as if they were receiving audio. To achieve this, the intensity of the light beam was modulated to carry the desired audio signal, which the microphone then converted into the corresponding electrical signal.
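The modulation step described above can be sketched in a few lines of Python. This is an illustrative example only, not the researchers’ actual code: the function name, bias current, and modulation depth are hypothetical values chosen to show how an audio waveform in the range [-1, 1] could be mapped onto a laser driver’s current, so that the laser’s light intensity (and hence the microphone’s induced electrical signal) tracks the audio.

```python
import math

# Hypothetical laser-driver parameters (for illustration only).
SAMPLE_RATE = 44_100         # audio samples per second
BIAS_CURRENT_MA = 200.0      # assumed DC operating point of the laser driver
MODULATION_DEPTH_MA = 150.0  # assumed peak current swing carrying the audio

def audio_to_drive_current(audio_samples):
    """Map audio samples in [-1.0, 1.0] to laser drive currents (mA).

    A laser's optical power tracks its drive current, so the emitted
    light intensity carries the audio waveform as amplitude modulation.
    A light-sensitive microphone then reproduces that waveform as an
    electrical signal, as if it had heard the audio.
    """
    currents = []
    for s in audio_samples:
        s = max(-1.0, min(1.0, s))  # clip to the valid audio range
        currents.append(BIAS_CURRENT_MA + MODULATION_DEPTH_MA * s)
    return currents

# Example: the first few samples of a 1 kHz test tone as a stand-in
# for a recorded voice command.
tone = [math.sin(2 * math.pi * 1000 * n / SAMPLE_RATE) for n in range(64)]
drive = audio_to_drive_current(tone)
print(drive[0])  # at the zero crossing the output sits at the DC bias: 200.0
```

In a real setup this current sequence would feed a laser driver rather than a print statement; the sketch only captures the core idea that the light intensity is an amplitude-modulated copy of the audio.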
How much does it cost?
The setup uses products available on the open market, such as telephoto lenses, laser drivers, telescopes or binoculars, and other equipment. According to the researchers’ estimates, all the equipment required for ‘Light Commands’ can be acquired for less than $600.
The underlying weakness in such devices cannot be fully fixed until the microphones used in them are redesigned. However, the researchers have contacted popular manufacturers such as Google, Amazon, and Apple to work toward a possible solution to this problem.