Quick, draw the curtains!

Editor’s note: This article is from the WeChat public account “brain body” (ID: Unity007); the author is Xiaohao.

With the arrival of the intelligent age, human-computer interaction has become richer and more diverse. Traditional touch interaction still dominates, voice interaction is growing fast, and gesture interaction is on the way. Thanks to its simplicity, low learning cost, and increasingly mature technology, voice interaction has become one of the most promising forms of human-computer interaction for the future.

Today almost everyone interacts by voice to some degree: phone assistants, smart speakers, in-car assistants, smart earphones, even the air conditioner at home. How cool is that?


But voice interaction can sometimes be unsettling. Accidentally wake up a smart speaker or another device without noticing, and you may have gained a rather loyal “listener.”

But that is nothing. What if I told you that your smart speaker, or any other voice device, can be controlled with nothing more than a laser? Does that sound scarier?

This has happened.

The smart speaker that heard the “sound” of a laser

It went like this.

Sugawara, a Japanese cybersecurity researcher, demonstrated a surprising discovery of his own to Kevin Fu, a professor at the University of Michigan. When he pointed a high-powered laser at the microphone of an iPad, Fu, wearing headphones, heard sound coming from that microphone. When Sugawara changed the intensity of the laser, Fu heard different sounds.

To validate the finding further, they assembled a group of researchers at the University of Michigan six months later and studied in depth how lasers interfere with voice devices. They found that by modulating the laser’s intensity, it was easy to “trick” the microphone of a voice device into treating the incoming light as if it were a sound signal and responding accordingly. What kind of responses?
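To make the idea concrete, here is a minimal Python sketch of the concept described above: a voice-command waveform amplitude-modulating a laser’s intensity so that the microphone’s output follows the light instead of real sound. This is an illustration only; the sample rate, bias, and modulation depth are assumptions, not the researchers’ actual setup.

```python
import numpy as np

# Illustrative sketch only: conceptually, the attack amplitude-modulates a
# laser's intensity with the waveform of a spoken command. The numbers below
# are made up for demonstration.

SAMPLE_RATE = 16_000  # assumed audio sample rate in Hz


def laser_drive_signal(command_audio: np.ndarray,
                       bias: float = 0.5,
                       depth: float = 0.4) -> np.ndarray:
    """Map a voice-command waveform (range -1..1) onto a normalized
    laser intensity (range 0..1) via simple amplitude modulation."""
    audio = np.clip(command_audio, -1.0, 1.0)
    intensity = bias + depth * audio  # DC bias keeps the intensity positive
    return np.clip(intensity, 0.0, 1.0)


# Example: a pure tone standing in for a recorded spoken command.
t = np.linspace(0, 1.0, SAMPLE_RATE, endpoint=False)
fake_command = 0.8 * np.sin(2 * np.pi * 440 * t)
drive = laser_drive_signal(fake_command)
print(drive.min(), drive.max())  # stays within 0..1
```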


Using a laser, the researchers held “pleasant conversations” with 16 different devices, including mobile phones, Amazon’s Alexa, Google Home, and Facebook’s Portal. They even pulled off a few stunts, such as opening the door lock of an underground garage.

The farthest test distance for these devices was 110 meters, the nearest 5 meters. And that was only because the power of the researchers’ laser equipment was limited; with a higher-powered laser, the control distance would only grow.

This method of using lasers to attack voice devices is called “Light Commands.”

Think about it: your ex, or anyone else holding a grudge, stands at a distance, shines a laser through the window onto your voice assistant, and switches it on and off, or orders your other appliances to power-cycle endlessly: the TV abruptly shutting down and turning back on, the air conditioner flipping between cooling and heating without pause, the smart speaker skipping tracks like it has lost its mind… Wouldn’t you go crazy?

Perhaps an accidental laboratory finding is not enough to set such a terrifying plot in motion, but it does sound a wake-up call: amid the surging development of intelligent voice interaction, shouldn’t security be given a lock of its own?

At the dawn of the intelligent age, the “dark light” needs to be blocked

In fact, as one of humanity’s major discoveries, the laser has long been favored by industry and amateur science enthusiasts alike for its powerful known and yet-unknown capabilities. Hackers are no exception, and using lasers for mischief, especially security breaches, is nothing new. As early as 2013, German hackers used laser technology to break the fingerprint unlock of the iPhone 5S in just two days, which gave Apple quite a headache.

So, now that the risk is known, how should we deal with the silent threat lasers pose to voice interaction devices?

From the user’s perspective, a device already bought obviously cannot simply be returned. Here the researchers believe the best approach is to place these devices where a beam of light cannot easily reach them; exposed devices such as garage door locks should likewise be shielded as much as possible, taking precautions at the physical level.


On the other hand, pay attention to making your wake word distinctive. Many voice devices allow the wake word to be personalized, but few people ever change it from the device default. That is like setting your bank card PIN to “123456” or “000000”: if the card is then stolen and drained, you can hardly blame the bank.

But it is rather absurd to buy a smart speaker and then have to guard it against theft and tampering. Making users foot the bill is clearly not the best approach; the ultimate answer is for product manufacturers to come up with corresponding technical solutions.

So, what should vendors do?

First, work out as soon as possible how the laser is converted into an electrical signal. Paul Horowitz, a professor of physics and electrical engineering at Harvard University, has proposed two possibilities: the laser pulses may cause the air around the microphone diaphragm to expand and produce sound, or the light may strike the chip directly and be interpreted by it as an electrical signal. Neither has been fully confirmed. Once the underlying principle is understood, the problem can be traced back to its source and a fix found.

At the same time, on this specific issue, device makers need to raise the level of authentication their devices require. During their tests, the researchers found that devices requiring authentication before use, such as an iPad or iPhone that demands fingerprint or face verification, or voice assistants that respond only after recognizing the user’s voiceprint, are difficult to interfere with using a laser, though they can still potentially be cracked. Echo and the Google assistant, by contrast, lack voice authentication, so the attack success rate against them was close to 100%. For manufacturers, then, getting pre-use authentication right is the first step in blocking this “light.” Of course, if laser attacks are upgraded in the future, the arms race over identity verification will never stop.
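As a rough illustration of what “raising the level of authentication” could look like in software, here is a minimal Python sketch that gates sensitive commands behind a speaker-verification score, so a light-injected command without a matching voiceprint gets rejected. Every name, intent, and threshold in it is hypothetical; it does not describe any vendor’s real API.

```python
# Illustrative sketch only: gate sensitive voice commands on a voiceprint
# match. All identifiers and thresholds here are assumptions for demonstration.

from dataclasses import dataclass

SENSITIVE_INTENTS = {"unlock_door", "open_garage", "disarm_alarm", "make_purchase"}
VOICEPRINT_THRESHOLD = 0.85  # assumed minimum similarity to the enrolled user


@dataclass
class VoiceRequest:
    intent: str               # e.g. "open_garage"
    voiceprint_score: float   # similarity to the enrolled voiceprint, 0..1


def handle_request(req: VoiceRequest) -> str:
    """Reject sensitive intents unless the speaker matches the enrolled user."""
    if req.intent in SENSITIVE_INTENTS and req.voiceprint_score < VOICEPRINT_THRESHOLD:
        # A laser-injected "command" carries no genuine voiceprint match.
        return "rejected: additional authentication required"
    return f"executing: {req.intent}"


print(handle_request(VoiceRequest("open_garage", voiceprint_score=0.12)))
print(handle_request(VoiceRequest("play_music", voiceprint_score=0.12)))
```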

In the face of incidents like “Light Commands,” device manufacturers must also improve their ability to discover and patch vulnerabilities, so that they can spot problems and solve them before anyone else does. A technology or product usually starts out playing a positive role in social life; malicious attacks come later, and its double-edged nature follows. At that point, whether the technology stays true to its original purpose depends on whether the “invaders” or the “guardians” occupy the high ground first. After all, compared with hackers finding vulnerabilities first, it is clearly more reassuring for manufacturers to take the lead in self-examination and prompt fixes. “Light Commands” came out of an experimental accident, and it has drawn the attention of Google, Apple, Facebook, and other manufacturers, who have begun actively seeking countermeasures. But if criminals had been the first to discover and weaponize the technique, the damage might already have been done.


As we said at the beginning, voice interaction devices, with smart speakers as their flagbearers, have blossomed in the intelligent era and are profoundly shaping people’s lives in the historical torrent of the IoT revolution. At the same time, unexpected attacks and loopholes keep appearing. Only by blocking these beams of “dark light” can we enjoy the intelligent age without worry.