There’s growing evidence Tesla’s Autopilot handles lane dividers poorly


Image caption: Thanks in part to a functioning safety barrier, the damage from the September crash in Hayward was much less severe than last month's crash in Mountain View. (Credit: KGO-TV)

Last week, we reported on the death of Walter Huang, a Tesla customer whose Model X vehicle slammed into a concrete lane divider in Mountain View, California. Tesla has acknowledged that Autopilot was active at the time of the crash.

Now KGO-TV, the local ABC television station in the San Francisco area, has uncovered details of another crash involving a Tesla vehicle slamming into a highway divider with Autopilot engaged. This incident occurred in Hayward, across the San Francisco Bay from Silicon Valley.

Thankfully, this crash was different in one important respect: there was a working highway safety barrier in front of the concrete divider. As a result, the crash was much less severe and the driver was able to walk away.

The September crash isn't the only evidence that Tesla's Autopilot handles highway lane dividers poorly. At least two people have uploaded YouTube videos showing their Tesla vehicles steering toward concrete barriers; one driver grabbed the wheel to avoid a collision, while the other slammed on the brakes.

Tesla argues that this issue doesn't necessarily mean Autopilot is unsafe. "Autopilot is intended for use only with a fully attentive driver," a Tesla spokesperson told KGO-TV. The company maintains that Autopilot can't prevent every accident but that it makes accidents less likely.

There's some data to back this up. A 2017 study by the National Highway Traffic Safety Administration (NHTSA) found that crash rates dropped by 40 percent after the introduction of Autopilot. And Tesla says that Autopilot-equipped cars have averaged one fatality per 320 million miles driven, compared with one per 86 million miles for the average car, a nearly fourfold difference.

These figures don't necessarily settle the debate. The NHTSA figure doesn't break down crashes by severity; it's possible that Autopilot prevents relatively minor crashes but is less effective at preventing the most serious ones. And as some Ars commenters have pointed out, luxury cars generally have lower fatality rates than the average vehicle, so Tesla's low fatality rate may have more to do with its affluent customer base than with its Autopilot technology.

What we can say, at a minimum, is that there’s little evidence that Autopilot makes Tesla drivers less safe. And we can expect Tesla to steadily improve the car’s capabilities over time.

How Tesla and others combat driver distraction

The danger, of course, is that as Autopilot gets better, drivers could become increasingly complacent and pay less and less attention to the road. This could become a more serious concern as Tesla begins selling the mass-market Model 3 in volume, putting Tesla vehicles into the hands of ordinary drivers who are not Tesla superfans and might not fully realize the limitations of the Autopilot system.

Tesla tries to combat driver distraction by requiring drivers to keep their hands on the wheel most of the time. If the car detects that the driver has let go of the steering wheel, it issues a series of warnings and then gradually slows to a stop. But it seems those warnings were not aggressive enough to save Walter Huang's life.

Cadillac has taken a different approach with its Super Cruise technology. Super Cruise vehicles are equipped with driver-facing cameras that track where the driver’s eyes are focused. If the driver’s gaze strays from the road for too long, the vehicle will issue a series of warnings and eventually slow down the car and stop it.

Also, Super Cruise will only engage if you’re on a limited-access highway that Cadillac has mapped in advance—something it has done for more than 160,000 miles of highway in the US and Canada. If the car knows exactly where all the lanes are supposed to be, that may allow it to avoid the kind of lane-confusion problems that may have contributed to Walter Huang’s death.
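
To illustrate the escalation pattern both companies describe (detect inattention, warn, warn louder, then slow the car to a stop), here is a minimal sketch of a driver-attention monitor. It is only an illustration under assumed thresholds; the class name, timing values, and action labels are invented here and are not Tesla's or GM's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class AttentionMonitor:
    # Hypothetical thresholds, in seconds of continuous inattention
    # (hands off the wheel, or eyes off the road).
    warn_after_s: float = 10.0      # show a visual warning
    escalate_after_s: float = 25.0  # add an audible alert
    stop_after_s: float = 45.0      # begin a controlled slowdown
    inattentive_s: float = 0.0      # running count of inattention

    def update(self, dt: float, driver_attentive: bool) -> str:
        """Advance the timer by dt seconds and return the required action."""
        if driver_attentive:
            self.inattentive_s = 0.0
            return "none"
        self.inattentive_s += dt
        if self.inattentive_s >= self.stop_after_s:
            return "slow_to_stop"
        if self.inattentive_s >= self.escalate_after_s:
            return "audible_alert"
        if self.inattentive_s >= self.warn_after_s:
            return "visual_alert"
        return "none"

# Example: a driver ignores the road for a full minute.
monitor = AttentionMonitor()
last_action = "none"
for second in range(1, 61):
    action = monitor.update(dt=1.0, driver_attentive=False)
    if action != last_action:
        print(f"after {second}s of inattention: {action}")
        last_action = action
```

The key design difference between the two systems is not this escalation logic but the input signal: Tesla infers attention from torque on the steering wheel, while Super Cruise uses a driver-facing camera to track gaze.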

The long-run solution, of course, is to eliminate the driver distraction problem altogether by building cars that drive themselves so well that they never need to rely on human intervention for safety. Tesla says it’s working on this kind of full self-driving technology, but the project had some setbacks last year.

In contrast, Waymo believes it is very close to building fully autonomous vehicles reliable enough for driverless commercial use, at least in the Phoenix metropolitan area. The company plans to launch a driverless taxi service in Phoenix later this year, and it intends for the driving technology to be good enough that customers never need to take the wheel.


via Ars Technica https://arstechnica.com

April 5, 2018 at 11:02AM
