Published: Wed, September 13, 2017
Business | By Max Garcia

'System safeguards' lacking in Tesla crash on autopilot: NTSB

"A driver could have difficulties interpreting which roads it might be appropriate [to use Autopilot]", Ensar Becic, an NTSB human performance investigator, said during the hearing.

"I think it's important to clear up a possible misconception", Sumwalt said.

It was the first known fatal crash of a highway vehicle operating under automated control systems.

The crash has been investigated by at least three teams, including one from the NTSB, which issued a preliminary report in June.

Joshua Brown, a 40-year-old Ohio man, was killed near Williston, Florida, when his Model S collided with a truck while the car was operating in "Autopilot" mode.

NTSB staff said that the way Tesla and other carmakers gauge whether a driver is paying attention, by monitoring whether the steering wheel is being moved, does not accurately reflect whether the person is even looking at the road.

NHTSA said Brown did not apply the brakes, and his last action was to set the cruise control at 74 miles per hour (119 kph), above the 65-mph speed limit, less than two minutes before the crash.

The Tesla careened under the truck's trailer, traveled nearly 300 feet farther and snapped off a utility pole before spinning to a stop in a front yard about 50 feet away. A preliminary report released earlier this year by the NTSB found that Brown kept his hands on the wheel of his 2015 Model S 70D for only 25 seconds of a 37-minute period during which the vehicle was in Autopilot at 74 miles per hour. The board also noted that the truck driver, who did not make himself available to investigators, likely had time to see Brown's Tesla before the crash.

The board's final report identifies the probable cause of the accident as the truck driver's failure to yield, combined with the Tesla driver's overreliance on his car's automation, the system Tesla calls "Autopilot".

The National Highway Traffic Safety Administration joined the NTSB, the highway patrol and Tesla in investigating the crash. The NTSB report did find that a Vehicle-to-Vehicle communication system (V2V) could have alerted both vehicles to the potential danger, but as we have discussed ad nauseam, V2V is still absent from new cars even though the spec is nearly 20 years old.

Most of the headlines in the aftermath of the crash were accurate, but some confused the Tesla with a fully autonomous vehicle. The crash was the first known fatality to occur while Autopilot was activated.

The Model S's Autopilot is a Level 2 system on a self-driving scale of 0 to 5.

With a public already skeptical about fully autonomous cars, reaction to the initial mishaps may play a significant role in determining how quickly Americans get comfortable with the new cars.

The NTSB recommended that NHTSA require automakers to build in safeguards that prevent the misuse of semi-autonomous vehicle features.

Level 2 systems like Autopilot are not meant to replace a human driver; although they handle steering, acceleration and braking, the human driver remains responsible for situational awareness at all times.

"Nobody wants tragedy to touch their family, but expecting to identify all limitations of an emerging technology and expecting perfection is not feasible either", Brown's family said in a statement. "There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car".

"Our thoughts are with the entire Brown family", Tesla said in a statement Monday.
