Video suggests huge problems with Uber’s driverless car program

There’s something very wrong with Uber’s driverless car program.

On Wednesday night, police released footage of Sunday night’s deadly car crash in Tempe, Arizona, where an Uber self-driving car crashed into 49-year-old Elaine Herzberg. The details it reveals are damning for Uber.

“The idea that she ‘just stepped out’ or ‘came out in a flash’ into the car path is clearly false,” said Tara Goddard, an urban planning professor at Texas A&M University, after seeing the video. “It seems like the system should have responded.”

The video shows that Herzberg crossed several lanes of traffic before reaching the lane where the Uber car was driving. You can debate whether a human driver should have been able to stop in time. But what’s clear is that the vehicle’s lidar and radar sensors—which don’t depend on ambient light and had an unobstructed view—should have spotted her in time to stop.

On top of that, the video shows that Uber’s “safety driver” was looking down at her lap for nearly five seconds just before the crash. This suggests that Uber was not doing a good job of supervising its safety drivers to make sure they actually do their jobs. The combination of these failures—and Herzberg’s decision to jaywalk in the first place—led to her death.

But zooming out from the specifics of Herzberg’s crash, the more fundamental point is this: conventional car crashes killed 37,461 people in the United States in 2016, which works out to 1.18 deaths per 100 million miles driven. Uber announced that it had driven 2 million miles by December 2017 and is probably up to around 3 million miles today. If you do the math, that means that Uber’s cars have killed people at roughly 25 times the rate of a typical human-driven car in the United States.
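To make that arithmetic concrete, here is a quick back-of-the-envelope calculation using the figures above. The ~3 million mile total is an estimate, and a single fatality is a tiny sample, so the exact multiple shouldn't be taken too literally:

```python
# Back-of-the-envelope comparison using the numbers cited above.
human_rate = 1.18                  # US deaths per 100 million vehicle miles, 2016

uber_deaths = 1
uber_miles = 3_000_000             # rough estimate of Uber's total self-driving miles to date
uber_rate = uber_deaths / uber_miles * 100_000_000

print(f"Uber: {uber_rate:.1f} deaths per 100M miles")             # ~33.3
print(f"Ratio vs. human drivers: {uber_rate / human_rate:.0f}x")  # ~28, i.e. "roughly 25 times"
```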

Of course, it’s possible that Uber just got exceptionally unlucky. But it seems more likely that, even with the safety driver, Uber’s self-driving cars are way more dangerous than a car driven by the average human driver.

This shouldn’t surprise us. Uber executives know they’re behind Waymo in developing a self-driving car, and they’ve been pulling out all the stops to catch up. Uber inherited a culture of rule-breaking and corner-cutting from its founder and former CEO Travis Kalanick. That combination made a tragedy like this almost inevitable.

Uber probably wasn’t at fault, legally speaking, in recent crashes

An Uber self-driving car in San Francisco in 2017. (Photo: Justin Sullivan/Getty Images)

Consider these recent crashes involving self-driving Uber cars:

In March 2017, an Uber self-driving car was struck on the left side as it went through an intersection in Tempe, Arizona. The Uber vehicle was in the right-most lane of a six-lane road, approaching an intersection. The other two lanes in its direction were backed up with traffic. The other car was traveling in the opposite direction and making a left turn. The driver of that vehicle said that cars stopped in the other lanes blocked her view, preventing her from seeing the Uber vehicle.

“Right as I got to the middle lane about to cross the third, I saw a car flying through the intersection, but I couldn’t brake fast enough to completely avoid the collision,” the driver of the non-Uber car said in the police report. Police cited the non-Uber driver for failing to yield the right of way. The Uber driver was not cited.

In February 2018, an Uber vehicle in Pittsburgh collided with another vehicle after the other car made a left turn in front of it. The Uber vehicle had its turn signal on, and the other driver thought this meant the Uber vehicle was going to turn at the intersection rather than go straight through. Uber says the car had its turn signal on because it was planning to change lanes.

Police did not determine who was at fault in the accident. But a Pennsylvania attorney told Ars that “generally speaking, before you take a left-hand turn, you’re required to ensure there’s not traffic coming from the other direction.”

Then in March 2018 came this Sunday’s deadly crash in Tempe. Authorities have not reached any final conclusions about the case, but experts have told Ars there’s good reason to believe Herzberg may have been at fault, legally speaking. She was jaywalking in the middle of the night in a poorly lit area outside of a marked crosswalk.

“I think that preliminary results coming out is that the automation of the car was not at fault because the pedestrian stepped into the road,” said Mohamed Abdel-Aty, a civil engineer and traffic safety expert at the University of Central Florida.

So in all three of these incidents, there’s a strong argument that the other people involved—not the Uber car—were legally at fault for the crashes.

That doesn’t mean Uber’s cars are driving well

Jessica McLemore took this picture of the damage to her car shortly after a crash with an Uber vehicle in Pittsburgh in February 2018. (Photo: Jessica McLemore)

“One of my big concerns about this incident is that people are going to conflate an on-the-spot binary assignment of fault with a broader evaluation of the performance of the automated driving system, the safety driver, and Uber’s testing program generally,” said Bryant Walker Smith, a law professor at the University of South Carolina.

“Human drivers recognize that they are going to deal with all kinds of behaviors that are not exactly lawful,” he added. “An obligation imposed under most if not all state vehicle codes are that drivers shall take due care to avoid a collision. You never get to say, well of course I hit them, they were in the road in my way.”

Indeed, it’s entirely possible to imagine a self-driving car system that always follows the letter of the law—and hence never does anything that would lead to a legal finding of fault—but is nevertheless far more dangerous than the average human driver. In fact, such a system might behave a lot like Uber’s cars do today.

For example, in that March 2017 collision in Tempe, the Uber driver reported that he was traveling 38 miles per hour at the time of the crash—just shy of the 40-mile-per-hour speed limit.

“As I entered the intersection, I saw the vehicle turning left,” he wrote. “There was no time to react as there was a blind spot created by the line of southbound traffic.”

The Uber car may have had a legal right to zip past two lanes of stopped cars at 38 miles per hour. But a prudent driver could have anticipated the possibility of a car in the blind spot—or, for that matter, a pedestrian trying to dart between the stopped cars in the next lane—and slowed down to 30 or even 20 miles per hour.

So, too, in Pittsburgh. There was no police report, so we don’t know how fast the Uber car was traveling or if it tried to stop. But a prudent driver approaching an intersection where an oncoming car has a left turn signal on will slow down a bit and be prepared to stop—just in case the other car decides to turn illegally.

As for this month’s crash in Tempe, there seems to be little doubt that there was a serious failure of the car’s technology. It may or may not have been feasible for the car to stop based on camera data alone. But lidar works just as well at night as it does in the daytime. Even if Uber’s software didn’t recognize Herzberg and her bicycle as a person, a car should always slow down when it sees an object that big moving into its lane.
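To illustrate the kind of defensive rule being described, here is a minimal, hypothetical sketch. The object fields, thresholds, and function names are invented for illustration and are not Uber's actual software or any real autonomy stack:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    """A hypothetical lidar/radar track; all fields are invented for illustration."""
    size_m: float        # rough longest dimension of the object
    closing_lane: bool   # is it moving toward the vehicle's planned path?
    distance_m: float    # distance ahead along the planned path

def defensive_target_speed(speed_limit_mps: float, tracks: list[TrackedObject]) -> float:
    """Slow down whenever a large object is moving into the lane ahead,
    even if the classifier can't tell what the object is."""
    target = speed_limit_mps
    for obj in tracks:
        if obj.size_m > 0.5 and obj.closing_lane and obj.distance_m < 60.0:
            # Cap speed so the car can comfortably stop in the remaining distance.
            # (The divisor is an arbitrary placeholder, not a real braking model.)
            target = min(target, max(0.0, obj.distance_m / 4.0))
    return target
```

The point of the sketch is simply that the fallback behavior does not require knowing *what* the object is, only that something large is entering the path.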

Moreover, if the car really couldn’t have stopped in time to avoid killing Herzberg, that seems like a sign that the car was driving too quickly. It’s not like she jumped out from behind some bushes.
