23 June 2022

Fortune: “Elon Musk’s regulatory woes mount as U.S. moves closer to recalling Tesla’s self-driving software”

On Thursday, NHTSA said it had discovered that, in 16 separate instances where this occurred, Autopilot aborted vehicle control less than one second prior to the first impact, suggesting the driver was not prepared to assume full control over the vehicle.

CEO Elon Musk has often claimed that accidents cannot be the fault of the company, as data it extracted invariably showed Autopilot was not active at the moment of the collision.

While any indication that the system was designed to shut off when it sensed an imminent accident could damage Tesla’s image, legally the company would be a difficult target.

All of Tesla’s current autonomous features, including its vaunted Full Self-Driving tech, now in beta testing, are deemed assistance systems in which the driver, rather than the manufacturer, is liable at all times.

Christiaan Hetzner

It sure looks as if Tesla designed Autopilot to shut off immediately before an impact so the company can deny its software was at fault and evade legal liability…

Tesla Autopilot is among the advanced driver-assistance systems being scrutinized by the National Highway Traffic Safety Administration. Christopher Goodney/Bloomberg

There may be more reasonable explanations, such as Autopilot returning control to the driver in situations it is not yet equipped to handle, but ‘less than one second’ before a crash is nowhere near enough time for a driver to react properly and avoid a collision. Having a feature called ‘Full Self-Driving’ may understandably cause the person in the driver’s seat to become complacent and not pay full attention to the road, which would make it even harder for them to react when Autopilot abruptly cedes vehicle control. And this goes to the core of the NHTSA probe, which is investigating whether Tesla’s driver-assist systems increase safety risks by undermining the effectiveness of the driver’s supervision.

It’s worth mentioning that Tesla has employed other questionable business practices designed to conceal its vehicle failure rate, such as a nondisclosure clause in its customer repair agreements, which it was forced to revise.

Another data release from the US regulator this month shows that Tesla vehicles made up nearly 70% of the 392 crashes involving advanced driver-assistance systems reported over the past year. This time around, the agency required manufacturers to disclose crashes where the software was in use within 30 seconds of the impact, a clever way of preventing Tesla (and possibly other car companies) from disguising their failures with the method described above. It’s refreshing to see a governmental regulator taking its role seriously for once, and I’m curious what more it can uncover about Elon Musk’s overhyped promises around self-driving.
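To make the logic concrete, here is a minimal sketch of why the 30-second window closes the loophole. This is entirely hypothetical Python of my own (the data structure, field names, and sample numbers are invented, not NHTSA’s or Tesla’s actual code or data): a system that disengages one second before impact would pass an “active at the moment of impact” check, but is still caught by a windowed check.

```python
from dataclasses import dataclass

# Hypothetical crash record; all fields and values are illustrative only.
@dataclass
class CrashRecord:
    crash_time: float              # seconds: time of first impact
    disengage_time: float          # seconds: when driver assistance shut off

def active_at_impact(record: CrashRecord) -> bool:
    """Naive criterion: count the system as involved only if it was
    still engaged at the exact moment of impact."""
    return record.disengage_time >= record.crash_time

def active_within_window(record: CrashRecord, window: float = 30.0) -> bool:
    """Windowed criterion: count the system as involved if it was
    engaged at any point within `window` seconds before impact."""
    return record.disengage_time >= record.crash_time - window

# Autopilot aborts control 0.8 seconds before the crash.
crash = CrashRecord(crash_time=100.0, disengage_time=99.2)
print(active_at_impact(crash))      # False -> escapes the naive criterion
print(active_within_window(crash))  # True  -> reportable under a 30-second window
```

Under the naive rule, a last-second shutoff lets the manufacturer truthfully say the software “was not active at the moment of the collision”; the windowed rule makes that distinction irrelevant.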
