A federal government report provides new details about the circumstances surrounding the death of a pedestrian who was struck by a self-driving Uber in March.
According to the National Transportation Safety Board, Uber's self-driving car detected pedestrian Elaine Herzberg, 49, as she walked a bicycle across a road in Tempe, Arizona. But Uber had turned off the vehicle's automatic emergency braking, so the SUV did not attempt to brake.
The SUV also lacked a way to alert the human driver behind the wheel to manually brake.
The report also noted Herzberg had methamphetamine and marijuana in her system at the time of the crash.
Uber's self-driving Volvo SUV, which first observed Herzberg six seconds before the crash, did not initially know what she was. The software classified her as an unknown object, then as a vehicle, then as a bicycle, the report said.
At 1.3 seconds before impact, the self-driving system determined that emergency braking was needed. But Uber had disabled the feature to reduce the potential for unwanted braking, such as for a plastic bag in the road.
"The most shocking portion of the report is emergency braking maneuvers were not enabled," said Bryan Reimer, research scientist in the MIT AgeLab and the associate director of the New England University Transportation Center at MIT. "Is the driver expected to look at the outside world continually? It's impossible when you're providing tasks that interfere with that."
According to the report, Uber operators are responsible for monitoring diagnostic messages that appear on the vehicle's dashboard and for tagging notable events for later review. A short video clip showed the vehicle operator looking down shortly before the crash.
Some self-driving car companies use teams of two in their test vehicles. A person behind the wheel monitors the road, and someone in the passenger seat takes notes on a laptop.
The vehicle operator, Rafaela Vasquez, told the NTSB she was monitoring the self-driving system's interface. Vasquez had her personal and business phones in the car, but they were not in use, she told NTSB investigators in a post-crash interview.
The self-driving system was not designed to alert Vasquez when emergency braking might be needed. That was a mistake, experts said.
"There absolutely should have been an indication to the safety driver of the need for emergency braking," said Missy Cummings, an engineering professor and director of the Humans and Autonomy Laboratory at Duke University. "How else would the safety driver know there was a problem ahead, particularly in the dark?"
"This highlights a huge problem with an overtrust of the systems by the engineers and a lack of Uber's understanding for the need to include explicit vehicle-safety driver communication," she added.