NTSB Faults Driver & Autopilot In Fire Truck Crash


Published on September 5th, 2019 | by Steve Hanley


On the morning of January 22, 2018, a 2014 Tesla Model S rammed into the back of a fire truck that was parked in the left-hand travel lane of a highway while responding to a prior accident. The Tesla was operating in Autopilot mode at the time of the collision.

Credit: Culver City Firefighters Local 1927 tweet

On September 4, the National Transportation Safety Board released its findings after an investigation of the crash. As reported by CNBC, a statement by the NTSB read as follows: “The probable cause of the Culver City, California, rear-end crash was the Tesla driver’s lack of response to the stationary fire truck in his travel lane, due to inattention and overreliance on the vehicle’s advanced driver assistance system; the Tesla’s Autopilot design, which permitted the driver to disengage from the driving task; and the driver’s use of the system in ways inconsistent with guidance and warnings from the manufacturer.”

From data recovered from the car after the collision, Tesla told the NTSB that Autopilot had been active for 13 minutes and 48 seconds before the crash and that the driver’s hands weren’t on the steering wheel for the majority of that time, according to a report by Transportation Topics.

A witness to the crash, which occurred on the 405 freeway in Culver City, California, reported seeing the Tesla speed into the fire truck without braking. “I could see the driver and I saw his head leaned far forward as he appeared to be looking down at a cell phone or other device he was holding in his left hand,” according to a written statement released by the NTSB. “The driver’s positioning struck me as odd and concerning because it was clear to me he was very focused on his phone and wasn’t watching the road ahead at all, even though he was quickly approaching the stopped fire engine.”

The NTSB found no indication that the Tesla driver had been texting or making a call at the time, but couldn’t determine whether the phone was being used for other purposes.

Tesla says the car was sending warnings to the driver prior to the crash, reminding him to place his hands on the wheel, but the driver claims he was holding the bottom of the wheel at the time. He says a large vehicle in front of him was blocking his view of the road ahead. That vehicle swerved suddenly to avoid the fire truck and the Tesla driver did not have enough time to react and take evasive action himself. He says he was looking forward at the time, which contradicts the written witness statement.

Tesla issued the following response to the NTSB report:

“Tesla owners have driven billions of miles with Autopilot engaged, and data from our quarterly Vehicle Safety Report indicates that drivers using Autopilot remain safer than those operating without assistance. While our driver-monitoring system for Autopilot repeatedly reminds drivers of their responsibility to remain attentive and prohibits the use of Autopilot when warnings are ignored, we’ve also introduced numerous updates to make our safeguards smarter, safer and more effective across every hardware platform we’ve deployed.

“Since this incident occurred, we have made updates to our system including adjusting the time intervals between hands-on warnings and the conditions under which they’re activated.”

Mike Ramsey, senior automotive research director for Gartner, a global consulting firm, tells CNBC, “When an investigative authority concludes the design of something you made has contributed to a serious accident, that is bad news for an automaker. Tesla has not always been super clear about Autopilot.

“They say in the fine print this was designed as a Level 2 system, and you’re supposed to keep your hands on the wheel. But then they will also talk about and demonstrate this system as if it’s a driverless car. This creates an environment where drivers wink and say we know it’s not supposed to be used this way, but we’ll just drive with our hands off the wheel.”

Ramsey warns that Tesla could face a recall of its Autopilot-equipped cars if vehicle safety authorities, including the National Highway Traffic Safety Administration, agree with the NTSB’s conclusions and decide that flawed Autopilot design can cause serious accidents. Such a recall would be a serious blow to Tesla’s plans to deploy a fleet of robotaxis in the near future.

One of the main reasons Tesla is such a strong advocate for semi-autonomous driving technology is Elon Musk’s belief that it can save lives. There are more than 40,000 highway fatalities on US roads every year, and many more than that around the world. The Tesla quarterly Vehicle Safety Report seems to bear out the company’s claim that driving a Tesla on Autopilot is safer than driving a conventional car.

Part of the problem is surely attributable to human failings. Despite all the warnings from our parents, we still break arms and legs falling out of trees or jumping off the garage roof using an umbrella for a parachute.

It only takes a few minutes watching America’s Funniest Home Videos to realize what a bunch of idiots we are. No matter what warnings Tesla may give, a certain percentage of drivers will ignore them, if for no other reason than that’s what people do. If we want a risk-free world, we should probably let machines take over completely and not allow people to drive cars at all.

If I could have a word with Elon Musk, I might encourage him to dial back his claims about what Autopilot can do a notch or two. Not that Elon, the ultimate risk taker, would give a moment’s thought to anything I might say. Still, despite all its protestations to the contrary, the company does seem to suggest that Autopilot has more functionality than it really does.

There is also a question of why cars operating in Autopilot mode seem to have difficulty identifying really large trucks in their path and reacting appropriately. We might like to entertain the idea that Teslas can drive themselves, but they can’t. Not yet, anyway. Yet we see videos of Tesla drivers asleep at the wheel or sitting in the back seat reading the newspaper all the time.

Perhaps the problem is that Elon Musk can’t admit to himself how human people are. They can buy small weights on the internet and drape them over the steering wheel to simulate a hand on the wheel so they can play Donkey Kong on their cell phone while the car drives itself. The fault, dear Elon, may not be in your machines but in the people who use your machines. Even you, with all the resources at your disposal, have not yet figured out how to overcome simple human foibles. 

Steve writes about the interface between technology and sustainability from his home in Rhode Island and anywhere else the Singularity may lead him. His motto is, “Life is not measured by how many breaths we take but by the number of moments that take our breath away!” You can follow him on Google+ and on Twitter.
