Experts say crash video shows Uber's failure to protect pedestrian

New video of the fatal crash involving a self-driving Uber vehicle in Tempe, Arizona, shows that the autonomous vehicle's technology failed to perform as expected, according to experts. The technology's backup -- the human test driver who was in the vehicle at the time of the crash -- also failed to react.

The test driver is responsible for ensuring the safe operation of the vehicle, which runs on unfinished software that is known to have limitations.

The Tempe police department released a short clip Wednesday of the March 18 crash, in which a Volvo XC90 operating in autonomous mode struck and killed a woman walking a bicycle across the road. Volvo supplies Uber with the vehicles, which Uber then retrofits with its autonomous driving technology.

The clip shows the woman, Elaine Herzberg, walking across multiple lanes of traffic, as the SUV drives forward without slowing. The test driver appears to be looking down or out the side window.

Experts told CNN that the video released by the police department -- which included both internal and external camera views -- appears to show the autonomous technology falling short of how it should have performed.

"This video screams to me there are serious problems with their system, and there are serious problems with their safety driver not being able to pay attention, which is to be expected," Missy Cummings, an engineering professor and director of the Humans and Autonomy Laboratory at Duke University. Cummings has testified on Capitol Hill about autonomous systems.

"[The pedestrian] wasn't jumping out of the bushes. She had been making clear progress across multiple lanes of traffic, which should have been in [Uber's] system purview to pick up."

According to Bryan Reimer, a research scientist at the MIT AgeLab and associate director of the New England University Transportation Center at MIT, it is possible the video represents what's called an "edge case," a rare situation the autonomous system was never trained to handle.

He also said the video could represent a failure of a vehicle sensor or its algorithms, but it's unknown at this time if that's what happened. A full investigation has not yet been completed.

Reimer also criticized police for publicly releasing a video that could prematurely influence consumer perception of a technology that may one day help save lives.

The Tempe police department has not yet responded to a request for comment.

Self-driving advocates have long pointed to automation as a way to address motor vehicle crashes, which are overwhelmingly caused by human error.

"It is really hard to understand how the system didn't react, when even a standard automatic emergency braking system could reasonably be expected to at least hit the brakes," Michael Ramsey, a research director at Gartner's CIO Research Group, told CNN.

Autonomous car companies hire human test drivers to sit behind the wheel of self-driving vehicles and take over when necessary -- for instance, if a car's sensors fail to recognize a bicyclist, pedestrian or other vehicle, or if the software system crashes.

Cummings said the video of the distracted Uber test driver is a reminder of how humans can develop too much trust in automated systems and may not be prepared to take control of them quickly when needed.

"The video is disturbing and heartbreaking to watch, and our thoughts continue to be with Elaine's loved ones. Our cars remain grounded, and we're assisting local, state and federal authorities in any way we can," an Uber spokesperson said in a statement.

The Uber test driver, Rafaela Vasquez, who has been cooperating with the investigation, has not responded to a CNN request for comment.

Tempe police chief Sylvia Moir told the San Francisco Chronicle Monday that the collision would have been difficult for any driver to avoid. The incident occurred at night and the pedestrian did not use a crosswalk while crossing the street.

"From what I know of autonomous systems, it was entirely avoidable," said Paul Godsmark, co-founder of the Canadian Automated Vehicles Centre of Excellence, a Canadian organization that advises governments as they prepare for autonomous vehicles.

"I'm very concerned that people are blaming the pedestrian," Godsmark said.

While the pedestrian is only visible in the video at the last second, the SUV's lidar and radar sensors should have detected and classified her in time to avoid a crash, according to Bryant Walker Smith, a University of South Carolina law professor who has written extensively about autonomous vehicles.

"I'm really looking forward to the [results of the] investigation and an explanation from Uber," Godsmark said. "Why didn't the [technology] see, and why didn't the car react?"