On March 10, 2019, the pilots aboard Ethiopian Airlines Flight 302 were unable to correct a failure in one of the Boeing 737 Max 8’s automated systems, resulting in a crash and the deaths of all passengers and crew. A year earlier, almost to the day, another automated vehicle – not an aeroplane but an Uber self-driving car – struck and killed Elaine Herzberg in Tempe, Arizona.
The crash of Ethiopian Airlines Flight 302, as well as that of Lion Air Flight 610 in 2018, happened despite oversight from one of the most technologically capable regulators in the world. Air travel is remarkably safe in light of the potential risks.
The FAA cleared the 737 Max 8 and its flight control system to fly – and retained that clearance not only after the Lion Air crash but also for three days after the Ethiopian Airlines tragedy.
From aeroplanes to automobiles
Just as automation is increasing in aeroplanes, it is increasing in cars. Various companies are testing autonomous vehicles on roads across the country – and with far less oversight than the aviation industry. Local and federal rules are limited, often in the name of promoting innovation. Federal safety guidelines for autonomous vehicles require them to pass only the same performance tests as any other car, such as minimum fuel economy standards, seat belt configurations and how well they’ll protect occupants in a rollover crash.
There’s no reliability testing of their sensors, much less their algorithms. Some states do require companies to report “disengagements” – when the so-called “safety driver” resumes control over the automated system. But mostly the self-driving car companies are allowed to do what they want, so long as there is a person behind the wheel.
In the months before the March 2018 collision, Uber was under pressure to catch up with GM Cruise and Waymo. Uber’s cars had a sensitive object-recognition system, which at times would be deceived by a shadow on the road and brake to avoid an obstacle that wasn’t actually there. That resulted in a rough, stop-and-start ride. To smooth things out, Uber’s engineers disabled the car’s emergency braking system. The company appears to have assumed the single safety driver would always be able to stop the car in time if there was really a danger of hitting something.
That’s not what happened as Elaine Herzberg crossed the road. The Uber self-driving car that hit and killed her did see her with its sensors and cameras, but was unable to stop on its own. The safety driver appears to have been distracted by her phone – in violation of Uber’s policies, though it’s unclear how the company briefed its safety drivers about the change to the automated system.
Regulators are relying on safety self-assessment practices, whereby private companies vouch for their own products’ compliance with federal standards. The best assurances they – and members of the public – have for the safety and reliability of these vehicles are the guarantees of the companies who intend to sell them.
This is all the more unnerving because there are far more cars on the roads than there are planes in the air – 270 million cars registered in the U.S. alone, compared with 25,000 commercial aircraft worldwide. In addition, self-driving cars have to handle not just weather conditions but also close-range interactions with other cars, pedestrians, cyclists and e-scooters. Safety drivers don’t get nearly the amount of training that pilots do, either.
Arizona, where we’re based, is a popular place for public testing of autonomous vehicles, in part because of looser oversight than in other states. In the Phoenix area, however, there is growing public concern about safety. Some citizens are harassing autonomous vehicles in efforts to discourage them from driving through their neighborhoods. As one Arizona resident told The New York Times, the autonomous vehicle industry “said they need real-world examples, but I don’t want to be their real-world mistake.”
Connecting with the public, innovating responsibly
In the absence of federal safety standards for autonomous vehicles, states and local governments are left to protect the public – often without the expertise and resources to do so effectively. In our view, this doesn’t mean banning the technology, but rather insisting on corporate transparency and true regulatory oversight.
Engaging the public about what’s happening and who is – and isn’t – protecting their safety can help officials at all levels of government understand what their citizens expect, and push them to ensure that technological innovation is done responsibly.
Aeroplane and car passengers need to trust their vehicles and understand what risks are unavoidable – as well as what can be prevented. Relying on industry to self-regulate when lives and public trust are at stake is not a viable path to ensure that rapidly emerging innovations are developed and deployed responsibly. To the riders, customers and others sharing the road and the skies, there is only one bottom line – and it doesn’t have a dollar sign attached to it.
The Conversation Africa is an independent source of news and views from the academic and research community. Its aim is to promote better understanding of current affairs and complex issues, and allow for a better quality of public discourse and conversation. Go to: https://theconversation.com/africa
About the author
Adam Gabriele is a PhD student in sustainability at Arizona State University.
Thaddeus R. Miller is an assistant professor at the School for the Future of Innovation in Society and The Polytechnic School, and co-director of the Center for Smart Cities and Regions at Arizona State University.