
The challenges left by Uber's fatal self-driving car accident


An autonomous vehicle has caused a fatal traffic accident. On March 18 (local time), an Uber autonomous vehicle on a road test in Tempe, Arizona, hit a 49-year-old woman who was crossing the road. The woman was taken to a hospital but soon died. She was reportedly walking her bicycle across the road at the time.

Uber was conducting the test with a human safety driver on board as a backup, but the emergency stop apparently failed. After the accident, Uber suspended all autonomous driving tests on roads in Pittsburgh, San Francisco, Toronto, and Phoenix, and expressed its condolences to the victim's family on Twitter.

Of course, this is not the first accident involving an autonomous vehicle. In 2016, a driver using Tesla's Autopilot was killed. But it is the first time in the United States that a pedestrian has been killed by an autonomous vehicle.

Autonomous driving technology is approaching a stage where it can be trusted, but any new technology can produce unexpected situations. The accident is likely to amplify the voices arguing that autonomous vehicles are not yet safe.

Autonomous vehicles keep appearing on the road, and even if they cause fewer problems than human drivers, it is difficult for them to avoid traffic accidents entirely. In fact, an autonomous shuttle bus that began operating in Las Vegas last year was involved in an accident with a truck within an hour of the start of service on its first day.

The shuttle, which carries 15 passengers around downtown Las Vegas, is operated by French public transportation company Keolis together with the American Automobile Association and the City of Las Vegas. The vehicle has a top speed of 45 km/h but averages 25 km/h, and its onboard computer checks road conditions as it travels a predetermined route.

The accident occurred when the truck ahead backed into the shuttle; the shuttle had recognized the truck as an obstacle and stopped, but could not avoid being hit. Fortunately, no passengers were injured, and the minor collision reportedly ended with a fine for the truck driver.

Going back to 2016, Google, which had been developing autonomous vehicles for years, admitted fault for the first time. Google had been running full-scale road tests since 2015, but on February 14, 2016 (local time), its car collided with a bus during a test run. At the time of the accident, another vehicle was waiting to turn right at the intersection ahead. The autonomous vehicle tried to move from the through lane into the right-turn lane, spotted sandbags in its path, and stopped. When the light turned green, it edged back into the through lane to get around the sandbags, and collided with a bus that was approaching from behind in that lane.

This was the first time Google acknowledged fault on the part of one of its autonomous vehicles. Minor accidents had occurred before, but in most cases the transportation authorities' investigations found that the drivers of the other vehicles, not the autonomous cars, were responsible.

Footage of a July 2015 accident involving a Google car, for example, shows the traffic situation as the autonomous vehicle perceives it. In the video, the car renders its surroundings as polygons, confirms that the vehicle ahead has stopped at an intersection, and slows to a stop. The car behind it, however, fails to stop; the gap narrows until it rear-ends the autonomous vehicle, which can be seen rocking from the impact.

That accident was, of course, caused by human error. But it raised the question of how autonomous vehicles should cope with other, human drivers, even when their own behavior is sound.

Beyond the problem of responding to human drivers, accidents caused by the autonomous system itself are bound to become more frequent. On January 22 (local time), a Tesla Model S collided with a fire truck stopped on a freeway in Culver City, California. The Model S, which was running on Autopilot, ended up with half its hood wedged beneath the fire engine. Miraculously, no injuries were reported, but the accident analysis revealed that most semi-autonomous driving systems, including Tesla's, have trouble recognizing stationary vehicles. Tesla itself notes that the system does not recognize every object and may fail to brake or decelerate for a parked vehicle; in particular, when driving above 80 km/h, if the car ahead changes lanes to reveal a stationary vehicle, the problem can become more pronounced.

The same is true of Volvo's semi-autonomous Pilot Assist. Its adaptive cruise control calculates and maintains the distance to the vehicle ahead, but the brakes may not engage for trailers, slow-moving vehicles, or stationary vehicles and objects. Volvo also advises against using the function on slippery, rutted, snowy, or muddy road surfaces, or in heavy rain or snow. In short, the semi-autonomous driving functions available so far are assistance functions: if the system cannot detect the vehicle ahead, the driver must take over.
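
As a rough illustration of the distance-keeping logic described above, here is a minimal sketch assuming a simple time-gap rule; the gains and distances are invented for the example and are not Volvo's actual parameters.

```python
# Minimal sketch of adaptive-cruise-control distance keeping.
# All gains and distances are illustrative assumptions, not any
# manufacturer's actual values.

STANDSTILL_GAP_M = 5.0   # desired gap when stopped
TIME_GAP_S = 1.8         # desired following time gap
KP_GAP = 0.3             # gain on gap error
KP_SPEED = 0.5           # gain on relative speed

def acc_command(ego_speed_mps, gap_m, lead_speed_mps):
    """Return an acceleration command (m/s^2) that keeps a
    speed-dependent gap to the vehicle ahead."""
    desired_gap = STANDSTILL_GAP_M + TIME_GAP_S * ego_speed_mps
    gap_error = gap_m - desired_gap            # positive: too far back
    closing_speed = lead_speed_mps - ego_speed_mps
    accel = KP_GAP * gap_error + KP_SPEED * closing_speed
    return max(-5.0, min(2.0, accel))          # clamp to comfort limits

# Example: following at 25 m/s with a 40 m gap behind a car doing 20 m/s.
print(acc_command(25.0, 40.0, 20.0))  # negative: brake to restore the gap
```

The structure also shows the failure mode the article describes: the controller only acts on a lead target handed to it by the sensor stack, so a stationary vehicle that is never reported as the target is simply never braked for.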

These accidents show that autonomous vehicle sensors have unexpected blind spots. That such systems fail to recognize stopped vehicles is in fact well known, and hardware limitations are part of the reason. The semi-autonomous vehicles on the market sense their surroundings with radar and cameras, but while radar readily detects moving objects, it struggles to pick out stalled vehicles and other stationary obstacles on the road.

The underlying reason is that there is not enough computing power to classify every stationary object the radar returns. You might assume the system recognizes nearly everything the radar picks up while driving, but in practice it tracks only moving objects and vehicles, without knowing what is fixed to the road, such as signs, signals, and guardrails; a stopped car gets filtered out along with that clutter. Semi-autonomous driving operates (at least for now) under this constraint.
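
A toy illustration of the filtering just described: a sketch under the assumption that the tracker drops any radar return whose ground speed is near zero, with a threshold and data layout invented for the example.

```python
# Toy illustration of why a stopped car can vanish from radar tracking:
# returns whose absolute (ground) speed is near zero are treated as
# fixed clutter (signs, guardrails) and discarded. The threshold and
# data layout are assumptions made for this example.

EGO_SPEED_MPS = 27.0        # our own speed (~100 km/h)
STATIONARY_THRESHOLD = 1.0  # m/s; below this, a return is "clutter"

radar_returns = [
    {"id": "guardrail", "range_m": 30.0, "relative_speed_mps": -27.0},
    {"id": "moving_car", "range_m": 60.0, "relative_speed_mps": -5.0},
    {"id": "stalled_car", "range_m": 80.0, "relative_speed_mps": -27.0},
]

tracked = []
for ret in radar_returns:
    # Ground speed = our speed plus the target's speed relative to us.
    ground_speed = EGO_SPEED_MPS + ret["relative_speed_mps"]
    if abs(ground_speed) > STATIONARY_THRESHOLD:
        tracked.append(ret["id"])

# The stalled car is discarded together with the guardrail.
print(tracked)  # ['moving_car']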

One way to solve this problem is LiDAR, which measures the distance to faraway objects by firing many laser pulses and timing their reflections. LiDAR units are still too expensive to mount on ordinary vehicles, however, and their shock resistance and all-weather performance have not yet reached the level required for practical use.
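
The distance measurement itself is plain time-of-flight arithmetic, sketched below; the one-microsecond figure is just a worked example.

```python
# Time-of-flight ranging as used by LiDAR: a laser pulse travels to the
# object and back, so the distance is half the round trip.

SPEED_OF_LIGHT_MPS = 299_792_458

def lidar_range_m(round_trip_s):
    """Distance to the reflecting object, given pulse round-trip time."""
    return SPEED_OF_LIGHT_MPS * round_trip_s / 2

# A pulse that returns after 1 microsecond reflected off an object
# roughly 150 m away.
print(lidar_range_m(1e-6))  # ~149.9 m
```

This arithmetic is also why the spoofing attack described next works: a forged pulse that arrives early is indistinguishable from a genuine reflection off a much closer object.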

Security weaknesses in LiDAR sensors have also been pointed out. Using a low-power laser and a pulse generator, an attacker can feed a LiDAR sensor false return pulses, tricking an autonomous vehicle into stopping as if other vehicles or pedestrians were around it. Exploiting this vulnerability, an attacker could make a vehicle misread its surroundings and halt abruptly in the middle of the road, where another, human-driven vehicle might crash into it.

Even once LiDAR reaches practical use, its performance will clearly not be perfect on its own, so it will have to be combined with existing radar sensors and cameras. For all these reasons, Level 5, fully autonomous driving, still has a long way to go before it is realized.

In the US, a Chevrolet Bolt testing GM's autonomous driving capability also made contact with a man on a motorcycle. When the autonomous vehicle driving ahead of him began changing lanes, the rider tried to pass it, but the vehicle moved back into its original lane and the two collided.

In the United States, the California Department of Motor Vehicles (DMV) has announced that from April this year it will allow autonomous vehicles to be tested on public roads with no one in the driver's seat, the first time driverless testing has been permitted. Even with no one aboard, the test vehicle must be remotely monitored and conditions for remote operation are attached, including a requirement to contact the police or the responsible manager if an accident occurs.

It is hard to say what impact this accident will have. Experts note, however, that for autonomous vehicles to spread, they must be remotely controllable so that any situation can be handled, and this could become a new business. Companies such as Phantom Auto (https://phantom.auto/) already offer remote operation: if the autonomous driving system fails, or the vehicle is predicted to behave erratically, control is switched to a human teleoperator, something like air traffic control for autonomous vehicles. The US National Highway Traffic Safety Administration (NHTSA) is pushing to introduce driverless vehicles by 2019, and more than 50 companies, including GM and Google's Waymo, are said to have been permitted to carry out autonomous driving tests on public roads.

Autonomous vehicles on the road keep increasing, and safety technology is steadily being built up. As the technology evolves, these vehicles will surely become safer, but unless accidents fall to zero, the ethical dilemma remains. The classic case is the so-called trolley problem: a vehicle cannot be stopped or controlled, and if nothing is done, five people ahead will die. It can change lanes, but doing so will kill one person on the other side. Whom should it save, and whom should it sacrifice?

The simple arithmetic says save five and lose one, but that does not resolve the moral dilemma. When a human is driving, the outcome rests on the driver's morality and split-second judgment. An autonomous vehicle, by contrast, must have its response to such situations encoded as an algorithm in advance. The rules have to be set.
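
A deliberately crude sketch of what "setting the rules" means in practice, assuming a bare harm-minimization policy; the policy itself is a contestable ethical choice that someone must make explicit, which is precisely the point.

```python
# Deliberately crude sketch of an explicit collision-choice rule.
# The harm-minimization policy encoded here is an assumption; the point
# is only that an autonomous vehicle forces such a rule to be written
# down in advance rather than left to a driver's split-second judgment.

def choose_maneuver(casualties_if_stay: int, casualties_if_swerve: int) -> str:
    """Pick the action with fewer expected casualties; stay on ties."""
    if casualties_if_swerve < casualties_if_stay:
        return "swerve"
    return "stay"

# The trolley-problem numbers from the text: five people ahead, one in
# the other lane. The code answers instantly; the moral question remains.
print(choose_maneuver(5, 1))  # 'swerve'
```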

The Uber accident is a reminder of how much technical refinement autonomous vehicles still need, and at the same time of how necessary it is to prepare for the era they will bring.
