Plenty of blame for pedestrian fatality involving experimental autonomous vehicle

The pedestrian, as well as the owner, the operator, and the would-be regulators of an experimental autonomous sport utility vehicle, all share blame for a fatal 2018 accident, according to the National Transportation Safety Board.

The National Transportation Safety Board reported that the automated vehicle, based on a Volvo XC90, was being tested by the Advanced Technologies Group of Uber Technologies Inc. on March 18, 2018, when it approached a pedestrian walking her bicycle across a road in Tempe, Arizona. The vehicle’s automated driving system (ADS) tracked the pedestrian but did not “classify” her as a risk.

The operator of the vehicle was not looking at the road, the NTSB stated. Video from an interior camera showed that “she was glancing away from the road for an extended period” and that “she was looking toward the bottom of the SUV’s center console, where she had placed her cell phone.”

The report, presented to the NTSB on Nov. 19, 2019, explained:

“The ADS detected the pedestrian 5.6 seconds before impact. Although the ADS continued to track the pedestrian until the crash, it never accurately classified her as a pedestrian or predicted her path. By the time the ADS determined that a collision was imminent, the situation exceeded the response specifications of the ADS braking system.”

The system relied upon the operator in the vehicle to intervene, but she was not watching the road. “The operator redirected her gaze to the road ahead about 1 second before impact. ADS data show that the operator began steering left 0.02 seconds before striking the pedestrian, at a speed of 39 m.p.h.”

NTSB attributed some of the blame for the accident to the victim. “The pedestrian’s unsafe behavior in crossing the street in front of the approaching vehicle at night and at a location without a crosswalk violated Arizona statutes and was possibly due to diminished perception and judgment resulting from drug use. Toxicological tests on the pedestrian’s blood were positive for drugs that can impair perception and judgment.”

Several contributing factors were attributed to the Uber Advanced Technologies Group, which “did not adequately manage the anticipated safety risk of its automated driving system’s functional limitations, including the system’s inability in this crash to correctly classify and predict the path of the pedestrian crossing the road midblock.”

Also, the board added, Uber “did not adequately recognize the risk of automation complacency and develop effective countermeasures to control the risk of vehicle operator disengagement, which contributed to the crash. Had the vehicle operator been attentive, she would likely have had sufficient time to detect and react to the crossing pedestrian to avoid the crash or mitigate the impact.”

The board reported that Uber improved its safety practices following the accident.

The NTSB called upon the state of Arizona and the National Highway Traffic Safety Administration to enact greater oversight of the testing of “a developmental automated driving system on public roads.”

State regulation of such testing is necessary, the NTSB said, “considering the lack of federal safety standards and assessment protocols for automated driving systems, as well as the National Highway Traffic Safety Administration’s inadequate safety self-assessment process.”