TWIFT | Digital | Hackers can gain access and mess up with Alexa devices using lasers

Hackers can gain access to and mess with Alexa devices using lasers

A study funded by DARPA and the Japan Society for the Promotion of Science has unveiled something alarming: ordinary lasers can hack voice assistants. They can be used to inject malicious commands into a device. For example, a hacker can remotely start a victim’s car – all it takes is access to the victim’s Google account.

To prevent this from happening to you, we recommend keeping Alexa devices away from windows, where hackers could aim laser beams at them and take control. If attackers succeed in breaking into the system, they can send commands to the smart assistant and also collect information from the victim’s linked account.

Laser beams can inject signals into a tablet, phone or voice assistant over long distances, even through glass. Smart speakers running Google Assistant and Amazon Alexa are vulnerable to this kind of hacking. Essentially, all intruders need is a device with voice control enabled.

The recently published paper “Light Commands: Laser-Based Audio Injection Attacks on Voice-Controllable Systems” explains why devices with microphones are susceptible to lasers. The authors call this phenomenon “audio injection”. It is based on the photoacoustic effect: the microphone’s diaphragm responds to modulated light as if it were sound.
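The core trick can be sketched in a few lines of code. The idea is that an attacker amplitude-modulates a laser diode’s drive current with a voice-command waveform, so the light’s brightness varies in step with the “audio”. This is a toy illustration only – the bias and modulation-depth values below are made-up assumptions, not figures from the paper:

```python
import numpy as np

def laser_drive_signal(audio, bias_ma=200.0, depth_ma=150.0):
    """Sketch of the "audio injection" idea: amplitude-modulate a laser
    diode's drive current with a voice-command waveform. The microphone's
    diaphragm then responds to the varying light intensity as if it
    were sound.

    audio    -- command waveform, normalized to [-1.0, 1.0]
    bias_ma  -- DC bias current keeping the diode lasing (assumed value)
    depth_ma -- modulation depth around the bias (assumed value)
    """
    audio = np.clip(np.asarray(audio, dtype=float), -1.0, 1.0)
    return bias_ma + depth_ma * audio  # current in mA, always above zero

# Example: a 1 kHz test tone sampled at 16 kHz stands in for a spoken command.
t = np.arange(0, 0.01, 1 / 16000)
tone = np.sin(2 * np.pi * 1000 * t)
current = laser_drive_signal(tone)
print(current.min() > 0)  # the diode never switches off mid-command
```

Because the bias current exceeds the modulation depth, the laser stays on throughout; only its brightness wobbles, silently carrying the command.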

Researchers have found that a hijacked voice assistant can be made to perform both harmless commands, such as switching the device’s backlight on and off, and far more serious ones – paying for online purchases with the victim’s cards, opening doors fitted with smart locks, and even controlling vehicles linked to the victim’s Google account.

Two large organizations in Japan and the United States, the Japan Society for the Promotion of Science and the Defense Advanced Research Projects Agency (DARPA), along with other companies that funded the study, teamed up to identify security vulnerabilities in devices connected to the Internet of Things.

The researchers’ concern is that a great deal of attention is paid to developing voice systems as such, while the security side remains shrouded in uncertainty. For instance, there is no proper user authentication – and yet a user interacts with the device by voice alone.

Let’s see how voice assistants are exposed to hacker attacks:

  •      Microphones convert sound into electrical signals;
  •      Microphone diaphragms are also sensitive to light directed at them;
  •      Attackers use a high-intensity, modulated light beam that induces an electrical signal in the microphone;
  •      Using light beams, hackers send “voice” commands that are inaudible to humans;
  •      The device treats the signal as a voice and executes the command.
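The chain of steps above can be modeled as a tiny simulation. Below, the microphone’s electrical output is a weighted sum of acoustic pressure and light intensity; the gains are invented for illustration and do not come from the paper:

```python
import numpy as np

def mems_mic_output(pressure, light_intensity, acoustic_gain=1.0, photo_gain=0.8):
    """Toy model of the attack chain: a MEMS microphone's electrical
    output responds both to sound pressure and, via the photoacoustic
    effect, to modulated light hitting its diaphragm. Both gain values
    are made up for illustration.
    """
    return acoustic_gain * pressure + photo_gain * light_intensity

fs = 16000
t = np.arange(0, 0.02, 1 / fs)
command = np.sin(2 * np.pi * 440 * t)   # stand-in for a spoken command

silent_room = np.zeros_like(t)          # no audible sound at all
laser = command                         # laser intensity modulated with the command

mic = mems_mic_output(silent_room, laser)

# The mic "hears" the command even though the room is silent:
correlation = np.corrcoef(mic, command)[0, 1]
print(round(correlation, 3))  # -> 1.0 (perfectly correlated in this toy model)
```

The point of the sketch: the speech recognizer downstream of the microphone cannot tell whether the electrical signal originated as sound or as light.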

Researchers have discovered that a laser can reach a device from up to 110 meters away – roughly the length of a football field.

The fact of the matter is that almost every voice-controlled device is subject to such attacks. The authors of the paper tested and confirmed vulnerabilities in the following devices:

[Image: table of the tested devices]

Perhaps the most disturbing part of the story is the ease of this type of breach. According to the researchers, all you need is a simple laser pointer, a laser diode current driver (to supply and modulate the laser’s power), an audio amplifier, and a telephoto lens to keep the laser focused over long distances.

Although no cases of laser hacking have been reported yet (perhaps intruders have already tried, but no one traced it) and all the hacking so far has been carried out as an experiment, you still need to be careful and watch your devices. At least you’ve been warned now. And beware – intruders can come up with a new hacking idea at any time.

As of now, it is difficult to determine whether your device has been compromised. But if you notice a laser or light beam pointed at your device, hear it respond or see its lights change on their own, or observe any other random activity, turn it off immediately. And from then on, do not place it near a window in your apartment.

Here are some other ways to protect against laser attacks, according to the article:

  • Two or more layers of authentication help mitigate the attack. For example, you can set the device to ask a certain question before executing a command – one the hacker cannot know the answer to and therefore cannot get past.
  • The second method requires the manufacturer’s involvement and relies on sensor fusion – that is, capturing sound with several microphones. A laser delivers a signal to one sensor only, while the others receive nothing, so the device can treat the mismatched input as an error and ignore it.
  • Another method relies on reducing the amount of light reaching the microphone, for example with a physical barrier that blocks light rays. But since the microphone still needs an opening to let sound in, complete blockage is impossible, and an intruder with a more powerful laser may still break in. This method is not ideal, but it at least raises the bar and filters out attackers with weak lasers.
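The sensor-fusion idea from the list above can be sketched as a simple consistency check across microphone channels. The energy metric and the 0.5 threshold below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def reject_single_sensor_signal(mic_channels, tolerance=0.5):
    """Sketch of the sensor-fusion mitigation: a narrow laser beam can
    only illuminate one microphone, while real speech reaches every
    microphone in the array. If the channels' energies disagree too
    much, flag the input as a likely injection attempt.
    Returns True when the input should be rejected.
    """
    energies = np.array([np.mean(np.square(ch)) for ch in mic_channels])
    if energies.max() == 0:
        return True  # pure silence: nothing legitimate to act on
    # Accept only if every channel carries a comparable share of the energy.
    return energies.min() / energies.max() < tolerance

rng = np.random.default_rng(0)
speech = rng.standard_normal(1600)

# Real voice: all four mics pick it up, with small per-channel variation.
voice = [speech * g for g in (1.0, 0.9, 1.1, 0.95)]
# Laser attack: only one mic sees a signal, the rest record silence.
attack = [speech, np.zeros(1600), np.zeros(1600), np.zeros(1600)]

print(reject_single_sensor_signal(voice))   # -> False (looks like real speech)
print(reject_single_sensor_signal(attack))  # -> True  (flag as injection)
```

A real implementation would compare phase and spectral content rather than raw energy, but the principle is the same: light can fool one sensor far more easily than it can fool all of them at once.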
