Federal probe finds Tesla's Autopilot contributed to fatal crash


A federal investigation has found that Tesla's Autopilot is partly to blame for the fatal crash of a Model S last year.

Joshua Brown was killed in May 2016 when his Tesla crashed into a tractor-trailer in Florida while the semi-autonomous Autopilot system was active. Brown had not attempted to control the car in the two minutes before the crash, and Autopilot did not brake to avoid the collision.

Autopilot was not designed for the type of road Brown was driving on, and it is not meant to handle the entire task of driving; drivers are expected to watch the road and take control as needed.

The investigation by the National Transportation Safety Board criticized Tesla for allowing Autopilot to be activated on roads it can't handle, and for how the system determines whether drivers are engaged. Autopilot gauges engagement by measuring torque on the steering wheel, so a driver could keep a hand on the wheel without paying any attention to the road.

NTSB researchers noted that decades of research have shown humans are bad at monitoring automated systems. Drivers may be lulled into complacency and trust Autopilot too much.

"We've found limitations with Tesla's system, but we think we find limitations with other systems," NTSB researcher Robert Molloy said Tuesday at a board meeting. "We think it's an industry-wide problem."

Tesla said in a statement that it will evaluate the NTSB's findings.

"We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times," the company said.

The NTSB's findings conflict with a report from the National Highway Traffic Safety Administration, which concluded in January that the crash was not the result of any defect. That report also found a nearly 40% reduction in crashes after Tesla installed its Autosteer feature.

But some safety experts have doubts about that finding. It's not clear whether NHTSA accounted for a Tesla software update that added safety features, including emergency braking. NHTSA declined to comment.


"In no shape or form does the NHTSA document pass for a scientific assessment," Missy Cummings, a Duke professor and director of its Humans and Autonomy Laboratory, told CNN.

Quality Control Systems, an organization specializing in data research, filed a lawsuit this summer to get the data from the NHTSA study.

About 1.25 million people are killed on roads worldwide every year, and human error is overwhelmingly the cause. Fully autonomous vehicles are widely expected to save lives, but in the meantime it could be dangerous if automakers rely too heavily on partially autonomous systems.

"Until we get there, it's going to be painful," NTSB chairman Robert Sumwalt told reporters afterwards. "But ultimately the utopia will be when everything is automated."
