This article comes from the WeChat public account: AI Frontline (ID: ai-front). Author: Dongmei. Original title: “Tesla's Autopilot accused of false advertising; Musk: we named it after the aerospace industry, what's wrong with that?” Image from: Visual China.

Tesla accused of false advertising over autonomous driving

Recently, a German court banned Tesla from using the terms “Autopilot” and “Full Autopilot” on its website and in other advertising. Tesla CEO Elon Musk disputed the decision.

In a tweet, Musk said: “Tesla Autopilot is named after terminology used in the aerospace industry. How, then, would you explain Autobahn (the German highway)?”

Wettbewerbszentrale, the German agency responsible for policing unfair competition, brought the case before the court, arguing that these terms mislead the public about the capabilities of Tesla's advanced driver-assistance systems.

Tesla cars are equipped with the Autopilot advanced driver-assistance system, which provides adaptive cruise control and lane steering. Its most advanced version is called “Full Self-Driving” (FSD), which adds automatic parking and an active navigation system that guides the car on and off highway ramps and through lane changes. The system can now also recognize and respond to traffic lights.

However, Tesla cars are not self-driving cars, because the system requires an attentive human driver at all times.

Wettbewerbszentrale also pointed out that Tesla implied in a statement on its website that, by the end of this year, its cars would drive autonomously on city streets. This claim, it argued, obscures the fact that some of Tesla's autonomous-driving features are not legal in Germany.

The Munich court agreed with this statement.

Although Musk disagrees with the ruling, it is unclear whether Tesla will appeal. Tesla has not yet issued an official statement or responded to requests for comment.

Tesla Autopilot's record of crashes

On June 1 this year, according to media reports, a Tesla Model 3 traveling on a highway in Taiwan crashed straight into a large truck that had overturned on the road. Only the driver was in the vehicle at the time; fortunately, there were no casualties.

Road surveillance footage shows that the weather was clear and the driver's visibility was good. After the truck overturned, its driver stood beside the median strip waiting for rescue. Seeing the Tesla heading straight for the truck without slowing down, he waved his arms to warn the Tesla driver. Even so, the Tesla did not stop in time and hit the truck.

After the incident, local media investigated the cause of the accident. According to the police: “The driver said he had turned on the driver-assistance function while driving, with the cruising speed set to 110 km/h, and that he was somewhat distracted and not fully focused on the vehicle and the road.”

In fact, this is not the first time Tesla's autonomous driving has been involved in a crash.

As early as 2016, Tesla had two fatal accidents involving autonomous driving, which left it dogged by doubts about whether the safety of its self-driving features could be guaranteed. According to a New York Times report, on January 20, 2016, on a highway near Handan, China, a driver of a Tesla Model S was killed in an accident that CCTV later reported on. The car's dashcam footage showed that the vehicle, traveling at highway speed in the left lane, struck a road-sweeping truck while Autopilot was engaged.

Shortly thereafter, in May 2016, Tesla had another fatal accident, in Florida, USA. In that crash, Autopilot's radar and camera failed to recognize a white truck against a bright sky, and there was no indication that either the driver or the system applied the brakes before the car collided with the towed trailer.

In the Taiwan highway accident, the Model 3 involved was advertised as supporting L3 automated driving, a level that requires the driver to be ready to take over at any time. Tesla's official website likewise notes that Autopilot is a driver-assistance feature: the driver's hands must remain on the steering wheel and attention must stay on the road.

AI Frontline previously interviewed an autonomous-driving expert about possible causes of autopilot and cruise-control failures. He said:

Even the simplest cruise-control system must sense vehicle speed and regulate the engine's throttle and fuel injection, forming a negative feedback loop. If the speed sensor, throttle control, fuel injection, or engine controller malfunctions, the vehicle could lose control. But a car is not just a bare feedback mechanism: many engineering measures return the system to a safe state when something fails, the so-called fail-safe design. A single component failure therefore will not make cruise control run away. Even if the cruise system misbehaves, any throttle, brake, or manual input will exit the cruise state. The chance of cruise control running out of control is therefore very small.
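The expert's two points — speed regulation as a negative feedback loop, and driver input as a fail-safe override — can be sketched in a few lines. This is a purely illustrative toy (class and parameter names are my own, not any real automotive code): a proportional controller commands throttle from the speed error, and any brake or manual throttle input disengages cruise permanently until re-armed.

```python
# Toy sketch of cruise control as a negative-feedback loop with a
# fail-safe override. Illustrative only; real automotive controllers
# use redundant, certified hardware and far richer logic.

class CruiseControl:
    def __init__(self, target_kmh: float, kp: float = 0.05):
        self.target = target_kmh   # set speed, e.g. 110 km/h
        self.kp = kp               # proportional gain
        self.engaged = True

    def step(self, measured_kmh: float,
             brake_pressed: bool, manual_throttle: bool) -> float:
        """Return a throttle command in [0, 1]; 0.0 once disengaged."""
        # Fail-safe: any driver intervention overrides the automation.
        if brake_pressed or manual_throttle:
            self.engaged = False
        if not self.engaged:
            return 0.0
        # Negative feedback: throttle proportional to the speed error.
        error = self.target - measured_kmh
        return max(0.0, min(1.0, self.kp * error))

cc = CruiseControl(target_kmh=110)
print(cc.step(100, brake_pressed=False, manual_throttle=False))  # 0.5: below target, accelerate
print(cc.step(108, brake_pressed=True,  manual_throttle=False))  # 0.0: driver braked, cruise exits
print(cc.step(100, brake_pressed=False, manual_throttle=False))  # 0.0: stays disengaged
```

The key design choice is that disengagement is sticky: once the driver intervenes, the controller outputs zero until explicitly re-armed, mirroring the "any refueling, braking or manual operation exits cruise" behavior described above.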

Automated driving involves far more. Cruise control regulates only vehicle speed; autonomous driving must control speed, steering, braking, and more. Its sensors include at least lidar, cameras, and satellite positioning systems such as BeiDou and GPS; for passenger comfort and body stability there are also inertial accelerometers, three-axis gyroscopes, and other sensors. The processing in between is even more complex, involving computer vision, real-time operating systems, high-definition maps, precise positioning, and so on.

We all know that a complex system without a sufficiently fault-tolerant design is more prone to failure. For example, Professor Yoshi Kohno of the University of Washington in Seattle devised a method to deceive autonomous driving: with just a few stickers, a car's computer-vision system can be fooled into misreading a stop sign as a speed-limit sign, potentially causing an accident. From a computer-security perspective, the more sensors a system has, the more avenues there are to deceive the computer through its sensor inputs, leading to loss of control. Visual deception, laser interference, and tampering with sensor data could all cause autonomous driving to go out of control.
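One common fault-tolerance pattern implied by this discussion is cross-checking redundant sensors: if independent sensors disagree beyond a plausibility tolerance, a fail-safe system distrusts all of them and falls back to a safe state (such as requesting driver takeover) rather than acting on possibly spoofed or faulty input. A minimal sketch under that assumption (function and parameter names are hypothetical, not any vendor's API):

```python
# Toy plausibility check across redundant distance sensors.
# If any reading deviates too far from the consensus, signal a
# fail-safe fallback instead of trusting a possibly spoofed input.

from statistics import median

def fuse_or_fallback(readings: list, tolerance: float):
    """Return (fused_value, ok). ok=False means: fall back to a safe state."""
    m = median(readings)
    if any(abs(r - m) > tolerance for r in readings):
        return None, False   # sensors disagree -> trust none of them
    return m, True

# Camera, lidar, and radar agree on the distance to an obstacle (metres):
print(fuse_or_fallback([49.8, 50.1, 50.0], tolerance=1.0))   # (50.0, True)
# A spoofed camera reading breaks the consensus:
print(fuse_or_fallback([120.0, 50.1, 50.0], tolerance=1.0))  # (None, False)
```

This kind of majority/consensus check cannot stop an attacker who fools every sensor at once, but it illustrates the expert's point: redundancy only helps if the system is designed to notice and react safely when inputs conflict.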

Musk: Tesla is expected to achieve L5 autonomous driving by the end of this year

At the recently concluded World Artificial Intelligence Conference, Musk again expressed confidence that L5 autonomous driving will be achieved soon:

Tesla is very close to L5-level autonomous driving. I am confident we will complete the basic functionality of L5 this year. I don't think there is any fundamental challenge left in achieving L5 autonomy, but there are many details. The challenge is to solve all these small problems and then integrate the system to keep working through the long tail. You will find that it handles most scenarios, but strange situations crop up from time to time, and you must have a system that can train on and resolve those strange scenarios. So you need real-world scenes: nothing is more complex than reality, and any simulation is a subset of the real world's complexity. At present we are focused on the details of L5 autonomy, and we are very confident it can be achieved. We can use Tesla's current hardware; improving the software alone is enough to reach L5 autonomous driving.

Given that even L3-level autonomy has yet to win everyone's trust, L5's arrival on the road may well be slower than promised.
