Op-Ed Commentary: How Dangerous Is Testing Autonomous Vehicles in the Real World?


By John M. Simpson, AUTOMOTIVE TESTING TECHNOLOGY INTERNATIONAL

July 12, 2018


John M. Simpson, Consumer Watchdog’s privacy and technology project director, believes that vehicle developers are moving too quickly to deploy self-driving cars without adequate, enforceable safety standards.

Lured by the conviction that driverless cars will someday offer improved safety on our highways, developers like Waymo, GM Cruise and Uber are rushing to deploy self-driving cars without adequate, enforceable safety standards.

Moreover, they are using our public highways as their private laboratories, turning the rest of us into human guinea pigs. We’ve already seen one death in Arizona, where an Uber self-driving car under test struck and killed a woman crossing the street while pushing a bicycle.

In the USA, the National Highway Traffic Safety Administration (NHTSA) should have enacted safety standards. Instead, President Trump’s secretary of transportation, Elaine Chao, has abdicated responsibility, with her department issuing so-called guidance asking that companies voluntarily file ‘safety self-assessments’ that are supposed to explain how their autonomous cars deal with various safety issues. So far only two companies – Waymo and GM Cruise – have filed, and their reports read more like flashy marketing brochures.

California has the most demanding – though still inadequate – regulations of any of the states. Before companies can begin testing driverless cars, they must get a permit. Fifty-four companies have done so. If there is a crash, they must report it within 10 days. Most importantly, companies conducting testing must file annual disengagement reports that explain when their autonomous technology failed and humans had to save the day.

It’s from these disengagement reports filed in January 2018 that we have learned much about the actual state of self-driving technology.

Waymo, the new name of Google’s autonomous vehicle unit, reported that its technology disengaged 63 times, or once every 5,596 miles, because of deficiencies in the technology rather than ‘extraneous conditions’ such as weather, road construction or unexpected objects, as is often presumed. The most common reasons human test drivers had to take control of a car were deficiencies in hardware, software or perception, Waymo’s report said.

GM’s Cruise division, which claims it will be ready to deploy driverless cars on the road for public use in 2019, logged the second-most miles of the companies that were required to report on their testing. Its cars drove a total of 131,675 miles and had 105 disengagements, or one every 1,254 miles.

GM Cruise’s report revealed that its autonomous cars cannot correctly predict the behavior of human drivers: 44 of the 105 disengagements (about 42%) were cases where the technology failed while trying to respond to other drivers on the road.

All other companies that released specific data detailing the reasons for their disengagements, including Nissan and Drive.ai (a technology startup working with Lyft), confirmed Waymo’s and GM Cruise’s experiences. Nissan said it tested five vehicles, logged 5,007 miles and had 24 disengagements. Meanwhile, Drive.ai had 151 disengagements in the 6,572 miles the company logged.
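For readers who want to check the arithmetic, here is a minimal Python sketch (mine, not part of the original reporting) that recomputes the per-mile rates quoted above from the totals the companies filed with the California DMV; Waymo is omitted because this piece quotes only its rate, not its total mileage:

```python
# Recompute miles-per-disengagement from the totals cited above
# (2017 California DMV disengagement reports, as quoted in this piece).
reports = {
    # company: (miles logged, disengagements reported)
    "GM Cruise": (131_675, 105),
    "Nissan": (5_007, 24),
    "Drive.ai": (6_572, 151),
}

for company, (miles, disengagements) in reports.items():
    print(f"{company}: one disengagement every {miles / disengagements:,.0f} miles")

# Share of GM Cruise disengagements where the technology failed
# while responding to other (human) drivers on the road
print(f"GM Cruise, other-driver cases: {44 / 105:.0%}")
```

The spread in the output – one disengagement every 1,254 miles for GM Cruise versus one every 209 miles for Nissan and one every 44 miles for Drive.ai – is exactly why these raw filings tell us more than the companies’ marketing materials do.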

From the collision reports, we can see that many of the crashes occurred at relatively low speeds when the vehicle was rear-ended. Thus, we can conclude that the self-driving vehicle did not perform the way the human driver of the other vehicle expected. The interaction between human drivers and driverless cars remains one of the major challenges as self-driving vehicles are developed.

Unlike California, Arizona has been a Wild West with virtually no regulation at all. A National Transportation Safety Board (NTSB) investigation into the fatal Uber crash in Tempe in March found that the car’s sensors recognized the pedestrian six seconds before the crash, yet the vehicle took no action.

The ‘safety’ driver, who was the only occupant of the vehicle, was apparently distracted by other technical duties and did not intervene. Until a few weeks before the crash, Uber had required two people in the car to monitor testing. What’s more, the Volvo used in the tests comes with automatic emergency braking (AEB) as a standard feature, which Uber had disabled.

There is another danger surrounding autonomous technologies – the threat posed by companies that market driver assistance features in a way that leads drivers to believe the car is more capable of traveling without driver oversight than is actually the case. For example, there have been a number of crashes – including three fatalities – in which drivers relied on Tesla’s Autopilot to do more than it can.

Autonomous vehicles may someday offer greater safety, just as advanced driver assistance systems (ADAS) such as AEB already can when deployed properly. However, we must not succumb to the siren song of autonomous car developers who overpromise what the technology can do today.

The general deployment of autonomous vehicles on our streets and highways should be banned until enforceable safety standards are in place. Permits allowing the testing of autonomous vehicles can be approved, so long as there is a test driver in the car who can take over the vehicle and there is complete transparency about the test programs. Responsible regulation goes hand in hand with innovation.

Voluntary ‘standards’ in the US auto industry have repeatedly proved weak and insufficient. Safety must come before the automakers’ bottom lines.

John M Simpson can be contacted at [email protected].

Consumer Watchdog (https://consumerwatchdog.org) provides an effective voice for American consumers in an era when special interests dominate public discourse, government and politics. Non-partisan.
