Dr. Timothy Carone is a former astrophysicist and an associate teaching professor at the University of Notre Dame's Mendoza College of Business. Carone specializes in automation and artificial intelligence and is the author of "Future Automation -- Changes to Lives and to Businesses." The opinions expressed in this article belong to him.
We are in the midst of a technological revolution.
Over the past 40 years, we've witnessed rapid changes in technology and discovered unanticipated ways of using it.
Unfortunately, this revolution has had some gut-wrenching consequences that we were unprepared to accommodate. In some cases, it's clear that technology is moving far faster than our culture can adapt.
Uber's fatal self-driving car crash in Tempe, Arizona, is the latest example.
On Sunday night, a self-driving Uber SUV struck and killed 49-year-old Elaine Herzberg as she walked her bicycle across a street. The crash occurred while Rafael Vasquez, a 44-year-old Uber test driver, was behind the wheel, according to Tempe police.
It's believed to be the first fatality involving a fully autonomous car. Uber responded by removing self-driving cars from the roads.
As we continue testing and perfecting the technology behind self-driving cars, there will be more accidents like this.
Self-driving cars may seem like the logical next step in the evolution of personal transportation. They are conceptually simple, easy to understand and, in many ways, very desirable.
But they are not cars in the usual sense. These cars are complex systems. They are the most complex robots that humans have built.
Self-driving cars contain hundreds of computer processors. Together, these processors run more software than the Chevy Volt, the F-35 fighter jet and Facebook combined.
The software includes many kinds of artificial intelligence that detect an object near the car, identify what it is and decide what the car should do next.
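To make that detect-identify-decide loop concrete, here is a deliberately toy sketch in Python. It is purely illustrative: the class, labels, thresholds and actions are hypothetical and do not reflect Uber's or any manufacturer's actual system, which fuses lidar, radar and camera data through far more sophisticated models.

```python
from dataclasses import dataclass

# Purely illustrative: a toy "detect -> identify -> decide" loop.
# All names and thresholds here are made up for explanation only.

@dataclass
class Detection:
    label: str              # e.g. "pedestrian", "bicycle", "vehicle"
    distance_m: float       # distance from the car, in meters
    closing_speed_mps: float  # how fast the gap is shrinking, m/s

def decide(detections: list[Detection]) -> str:
    """Return a grossly simplified driving action for the most urgent detection."""
    for d in sorted(detections, key=lambda d: d.distance_m):
        if d.closing_speed_mps > 0:
            # Rough time-to-collision; a real planner weighs far more factors.
            ttc = d.distance_m / d.closing_speed_mps
            if d.label == "pedestrian" and ttc < 3.0:
                return "emergency_brake"
            if ttc < 5.0:
                return "slow_down"
    return "maintain_speed"

if __name__ == "__main__":
    scene = [Detection("pedestrian", distance_m=20.0, closing_speed_mps=10.0)]
    print(decide(scene))  # -> "emergency_brake"
```

Even in this cartoon version, the hard part is obvious: every judgment hinges on correctly detecting and classifying the object in the first place, which is exactly where real systems still fail.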
Complex systems like this take a decade or more to mature into dependable operation.
Google started testing its first self-driving car in 2009. The effort continues almost 10 years later and will continue for another 10 years, at least.
The same timeframe applies to the other self-driving car manufacturers. They have solved the basic problems with self-driving cars. Now, they're in the process of putting cars on the road.
There will inevitably be accidents during this transitional period.
We won't understand how they occurred, nor will we be able to interrogate a self-driving car to ask why it drove off a bridge with a family of five inside. We will have to deal with more deaths and destruction of property that, to us, appear unfair and arbitrary.
In the meantime, test drivers will need to be the ones to make the tough choices behind the wheel.
Driving a car can seem like a rote process, but it is not. We make complex decisions and value judgments continually when we are behind the wheel, and eventually, so will the car.
The danger of this transitional period is that companies at the forefront of this technology will be forced to slow their efforts because of unnecessary regulatory interference.
Complex systems, including self-driving cars, are never free of defects. But the researchers and innovators behind self-driving technology argue that autonomous technology will eventually make driving much safer because it will take unpredictable human behavior out of the equation.
Therefore, calls to pull all self-driving cars off the roads until they are proven safe demonstrate a real lack of understanding of how complex systems become operational. Had we taken that approach to aviation in the 20th century, progress on safety would have been so slow that we would not have the safe skies we enjoy today.
The federal government and its state and local partners need to understand that they cannot change how complex systems evolve over time.
Their role is to ensure that when accidents occur, there is a proper analysis of what happened and that the findings are available for anyone to use. This is the time to increase transparency into the maturation process, not to stop it.
According to the National Safety Council, a nonprofit that promotes health and safety in the US, 110 people will be killed by traditional cars today, tomorrow and the next day.
Today, we need to maintain the momentum behind self-driving cars so that in 10 years, that number is far lower, not unchanged because we feared the technology.