Who's responsible when an autonomous car crashes?

Fatal crash sparks Tesla Autopilot investigation

Tesla has a full plate, and it could soon get even fuller.

The electric car maker is currently attempting to purchase the solar energy company SolarCity, while also scaling up production to meet unprecedented demand for the Model 3, Tesla's first mass-market vehicle.

And now Tesla may have to deal with a lawsuit over its autopilot system after a driver was killed in a crash while using the self-driving technology on a Florida road in May.

Joshua Brown died when his autopilot system did not recognize a tractor-trailer turning in front of his Model S and his car plowed into it. The Ohio man's family is now being represented by a law firm with expertise in product defect litigation, which is conducting its own investigation. The federal government is also investigating Tesla's autopilot system.

"They're potentially vulnerable," W. Kip Viscusi, a Vanderbilt Law School professor who has written two books on product liability, said of Tesla. "A reasonable consumer might expect [autopilot] to work better, that you wouldn't be crashing into a semi that crossed the highway."

Related: Can we trust driverless cars?

Product liability attorneys have long expected lawsuits stemming from autonomous vehicles. Such lawsuits could trigger stricter safety regulations.

A Tesla owner bringing a lawsuit would likely allege a design defect -- that an alternate design would have made the car safer without impacting the vehicle's utility. There's also the potential for a class action lawsuit in which customers allege that defective autopilot software has damaged the resale value of their cars.

But currently it's unclear how many crashes have happened while autopilot was activated. A Tesla driver who crashed Friday on a Pennsylvania highway said autopilot was active, but Tesla has not confirmed that. Safety regulators are looking into that incident as well.

According to Tesla (TSLA), Brown's Model S did not automatically brake to prevent the crash because of the trailer's high ground clearance and the white side of the trailer against a brightly lit sky. The Model S tunes out what looks like an overhead road sign to avoid braking unnecessarily, Tesla CEO Elon Musk said on Twitter.

Tesla said in a blog post Wednesday that there is no evidence autopilot malfunctioned at the time of Brown's crash. It described the autopilot system as designed to keep a vehicle in its lane and adjust its speed to match surrounding traffic. Drivers are instructed to keep their hands on the wheel and maintain responsibility for the vehicle.

Related: Tesla driver killed in autopilot crash said the technology was 'great'

John C.P. Goldberg, a Harvard Law School professor with an interest in product liability, doesn't think a defect claim against Tesla would be a slam dunk.

"This is all new territory technologically and legally as well," Goldberg said. "There are well established rules of law but how they apply to the scenario and technology will have to be seen."

Tesla labels the technology as being in beta, meaning it is not finalized. It also reminds drivers to remain attentive and keep their hands on the steering wheel. Tesla has argued that given the rate of fatalities while using autopilot, it's actually safer than traditional driving. Some autonomous vehicle experts have cautioned that it's difficult to draw such conclusions from the available data.

Ultimately, Viscusi and Goldberg don't think lawsuits would be financially detrimental to Tesla. Worst case, Viscusi expects Tesla might be required to retrofit its vehicles and modify how its autonomous system works. Tesla, for example, could require drivers to keep their hands on the wheel while using autopilot, or devise a system to make sure drivers keep their eyes on the road.

These partially self-driving vehicles -- which take on some but not all driving responsibilities -- can be problematic because they may lull drivers into complacency.

Related: BMW promises fully driverless cars by 2021

While a system such as Tesla's handles many situations well, there are still cases like Brown's in which a driver must immediately take control of the vehicle to avoid a crash.

Because of this issue, some have argued that self-driving vehicles should not include a steering wheel or pedals, leaving humans totally out of the driving task. That's the approach favored by Google (GOOG). Its test drivers have watched the company's cars drive more than 1.7 million miles in autonomous mode, but it has yet to let the public use the technology.

Chris Urmson, who leads Google's program, has said he hopes to have the technology ready before his son's 16th birthday in 2019.
