The Uber Crash: Failing To Learn From History

By Chuck Dinerstein, MD, MBA — Mar 26, 2018
The coverage of the Uber-caused fatality in Arizona continues to mislead us about our autonomous future. And since aviation's history of automation can tell us about the likely path forward, why aren't we listening?

The media and Twittersphere are full of articles and comments on the recent fatal crash involving an Uber vehicle and a pedestrian in Arizona. I will leave it to the National Highway Traffic Safety Administration and the National Transportation Safety Board to investigate the accident and publish their findings. But I feel compelled to point out a few salient points about autonomous technologies.

The Society of Automotive Engineers has published a classification of vehicle automation. Most cars currently on the road are at level 2, e.g., lane keeping and cruise control, to be used in conjunction with an active driver who supervises and monitors the environment. At the next level, the level of aircraft automation, the pilot or driver cedes “full control of all safety-critical functions under certain traffic or environmental conditions to rely on the vehicle to monitor changes in those conditions requiring transition back to driver control. The driver is expected to be available for occasional control, but with sufficient transition time.” (emphasis added).
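For readers who want that taxonomy at a glance, here is a minimal sketch of the six SAE levels as a Python data structure. The level numbers and one-line summaries paraphrase the published SAE J3016 scheme; the class and helper-function names are my own illustrative choices, not anything from the standard.

from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 levels of driving automation, summaries paraphrased."""
    NO_AUTOMATION = 0           # human does all of the driving
    DRIVER_ASSISTANCE = 1       # a single assist feature, e.g., cruise control
    PARTIAL_AUTOMATION = 2      # steering and speed automated; driver monitors
    CONDITIONAL_AUTOMATION = 3  # system monitors; driver takes over on request
    HIGH_AUTOMATION = 4         # no driver needed within a limited domain
    FULL_AUTOMATION = 5         # no driver needed anywhere

def human_is_safety_monitor(level: SAELevel) -> bool:
    # At level 2 and below, the human must actively supervise at all times.
    return level <= SAELevel.PARTIAL_AUTOMATION

The helper marks the dividing line the quote above describes: at level 2 and below the human is the monitor; at level 3 the vehicle monitors, and the human becomes the fallback who must be handed control with sufficient transition time.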

Flying at 35,000 feet, widely separated from other objects, provides ‘sufficient’ transition time in most circumstances. The loss of Air France 447 over the Atlantic, when the captain was woken from a rest break to take control of a system that was already alarming despite the presence of two co-pilots, highlights that situational awareness is an important component of safety. Take a look at the driver of that Arizona Uber in the moments before the crash. The driver is not merely distracted; the driver is not paying attention at all. And I would add that, in the world of traffic, we do not have anywhere near the same amount of time to respond as an aircraft pilot would.
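To put a rough number on that difference, consider a back-of-the-envelope calculation, sketched below in Python. The speed and reaction time are illustrative assumptions on my part, not findings from the investigation.

# Illustrative only: how far a car travels while a human reacts.
speed_mph = 40                           # assumed urban arterial speed
reaction_s = 2.0                         # roughly average driver reaction time
speed_ms = speed_mph * 1609.344 / 3600   # about 17.9 m/s
distance_m = speed_ms * reaction_s       # about 36 m before any driver input
print(f"Car travels {distance_m:.0f} m before the driver even reacts")

An airliner at cruise, miles from the nearest traffic, can absorb a handover measured in tens of seconds or minutes; a car at urban speed spends its entire two-second reaction budget simply covering the ground between driver and hazard.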

Level 3, the current level of automation of our aircraft, vehicles that have had the benefit of nearly 100 years of development, is one level lower than the complete autonomy to which cars aspire. If the truth be told, airplanes can fly entirely autonomously from takeoff to landing, but are you going to get on a plane that has no crew? So why would you get into a car with no driver, and why would some experts believe we can safely remove the steering wheel and brakes from these autonomous vehicles? Because they are either crazy or do not read history.

It seems clear that the safety driver was not situationally aware; human error was involved. [1] In fact, if you look at the history of crash investigations in aircraft, human error is the predominant cause. Once these mechanical systems are developed, they have few problems. But again, the history with aircraft is that when the systems fail, human error is culpable, and that includes cases where pilots turned the autopilot off and cases where they turned it on. I expand a bit on the history of autopilots here.

The discussion around driverless cars involves job losses for truckers and taxi drivers, and images of us jumping into a vehicle and mindlessly watching TV or reading a book as we are whisked to our destination. That will happen a day after we start flying around in pilotless planes.

Another concern, often ignored, is the training required to use automated systems. Pilots, again the individuals working with our most autonomous vehicles, require significant training in the use of these safety systems, far more than the training necessary to simply fly a plane. And that training is recertified annually, not with a written test but in simulators, to demonstrate the ability to take control rather than simply the knowledge of how to take control. So, given the possible need for safety drivers, who will train, regulate, and certify their abilities?

Can more autonomous cars reduce accidents? Probably. But our technologists do us no favors in creating narratives where no one is at the wheel. It creates false expectations that legislators, fed these narratives by ‘experts,’ use in guiding regulation. That is why the current legislation is more concerned with federal oversight, time limits on reporting of car failures, limitations on corporate responsibility for the autonomous vehicles used in testing, and amending “…federal vehicle safety regulations to clear away any unintended hurdles for driverless vehicles — i.e. regulations that require equipment or systems that are necessary for a human driver, such as a steering wheel, but are irrelevant for a self-driving car,” and, of course, the concern du jour, data privacy. The concern about the loss of jobs, a promise rather than a reality given our experience with planes, is the ‘fake news’ that leads Congress to exempt self-driving trucks from current regulations; after all, why complicate matters by tangling with the Teamsters?

More autonomous cars are coming, but we do ourselves no favors by ignoring our nearly 100 years of experience with autonomous aircraft in planning for how that future is likely to look. It is not that the Uber crash is unimportant or the reporting false; it is that the coverage misleads, failing to dig deep enough into what we have already learned about autonomous vehicles.

[1] Herzberg becomes visible in the car’s headlights as she pushes a bicycle across the road at least two seconds before the impact. “This is similar to the average reaction time for a driver. That means that, if the video correctly reflects visible conditions, an alert driver may have at least attempted to swerve or brake,” Smith said. From Bloomberg.


Dr. Charles Dinerstein, M.D., MBA, FACS is Director of Medicine at the American Council on Science and Health. He has over 25 years of experience as a vascular surgeon.
