Tesla-gate: Bad Numbers, Important Story


So the Tesla autopilot has been involved in more than 700 accidents over the past few years. That’s bad, but how bad? That depends on the metric. How do accidents caused compare to accidents prevented? Are autopilots good or bad? The answer: We don’t know.

Still, the news media are up in arms about Tesla’s (allegedly bad) autopilot. Because the numbers don’t lie, do they? Well, they do, and we know it. And we forget. So what looks like a complete failure based on isolated (and correct) numbers may actually be OK, even good. What we do know is the obvious: humans are involved in every crash – more than 2 million of them (US domestic) per year – and driving is dangerous. There is always a chance that some moron (like me or you) is inattentive, sleepy, texting, intoxicated or speeding and causes an accident, possibly a fatal one. Or that some technical failure, act of God, whatever, does. And it’s not like the chances are very low either. In 2021, almost 50,000 people died in traffic accidents in the US. The estimated cost of all 2021 accidents – according to iihs.org – was USD 340 billion. These are well-known numbers, well-known risks that we obviously are willing to take.

These numbers form the background against which specific brands or technologies are evaluated. The Tesla case (‘Tesla-gate’) is about domestic (US) accidents in the four years from 2019 through 2022, a period during which 161,000 traffic fatalities were reported. 17 of them involved the Tesla autopilot in one way or another. That is a tragic, but very small, number. And while the autopilot has been found at least partly at fault in some cases, the jury is still out on most of them – simply because it is impossible to recreate the exact situation or to guess how it would have been handled without the autopilot. We don’t have the data, which is surprising given the amount of technology involved and the amount of data these technologies collect and use.
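
To make the scale concrete, here is a back-of-the-envelope sketch using only the figures quoted in this post. The ratios prove nothing on their own – without knowing how many miles were actually driven with the autopilot engaged there is no proper denominator – but they do show how small the raw numbers are against the overall background.

```python
# Back-of-the-envelope arithmetic using only figures quoted in this post.
# No official rates are computed here; exposure (autopilot miles driven) is unknown.

autopilot_fatalities = 17       # fatal crashes involving the autopilot, 2019-2022 (quoted above)
total_fatalities = 161_000      # all US traffic fatalities, 2019-2022 (quoted above)

autopilot_crashes = 700         # "more than 700 accidents" involving the autopilot (quoted above)
annual_us_crashes = 2_000_000   # "more than 2 million" US crashes per year (quoted above)

print(f"Fatalities involving the autopilot: "
      f"{autopilot_fatalities / total_fatalities:.4%} of all US traffic deaths, 2019-2022")
print(f"The 700 reported autopilot crashes equal "
      f"{autopilot_crashes / annual_us_crashes:.4%} of a single year of US crashes")
```

Roughly 0.01 percent of the period’s fatalities, and a few hundredths of a percent of one year’s crashes – which is exactly why the numbers, taken alone, tell us so little.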

In June 2022, the NHTSA – the government body overseeing US road and traffic safety – published a report that was expected to be very bad for Tesla and its autopilot. It was not. The report – and other reports from the same source – describes and acknowledges the difficulty of getting usable data, creating standards, testing, and even evaluating the data (crash reports) from ‘Advanced Driver Assistance Systems’ (aka autopilots, where the degree of ‘auto’ is variable), regardless of make. What seems so simple from the outside – collect all the data, recreate the situation, figure out what happened – isn’t. There may be lots of data, but only a fraction is available and even less is usable, as explained by the NHTSA under the headline “Access to Crash Data May Affect Crash Reporting” in its Summary Report: Standing General Order on Crash Reporting for Level 2 Advanced Driver Assistance Systems:

Crash data recording and telemetry capabilities may vary widely by manufacturer and driving automation system. Many Level 2 ADAS-equipped vehicles may be limited in their capabilities to record data related to driving automation system engagement and crash circumstances. The vehicle’s ability to remotely transmit this data to the manufacturer for notification purposes can also widely vary. Furthermore, Level 2 ADAS-equipped vehicles are generally privately owned; as a result, when a reportable crash does occur, manufacturers may not know of it unless contacted by the vehicle owner. These limitations are important to keep in mind when reviewing the Summary Incident Report Data.

Few if any standards, many players, diverging interests, politics and much more. Needless to say, it’s complicated. The data are incomplete and possibly erroneous for any number of reasons, including redactions (filtering) for privacy, double reporting (several, sometimes supplemental, reports from the same accident), data entry errors and oversights, and more. If you’re more than averagely interested, browse the mentioned NHTSA report – it’s enlightening as to the challenges involved. Check this link for an explanation of the levels of driver assistance/automation.

Does that mean Tesla is exonerated? Yes and no. The data are incomplete and inconclusive. Which means that this is not really about Tesla or any other manufacturer, but about our ability to handle the challenges of fast change, inflated expectations, naive users/customers/drivers, lagging regulations, manufacturers/operators holding back data – and shallow (misleading) news stories devoured by shallow readers (that’s you and me). And finally the big question that no one likes to ask because no one wants to answer: Are driving assistance tools a curse or a blessing? More specifically: at what level do (advanced) driving assistance systems lull drivers into complacency and thus become dangerous?

Which brings us back to where we started. It’s more or less the same question – ‘are autopilots good or bad’ – and the same immediate answer: We don’t know. Correction: We don’t know for sure, but we do have a hunch – based on common sense and a lot of experience: Any mechanism, tool, helper (or distractor) that reduces the driver’s attention to driving is dangerous and should be avoided.

So that’s where we’re still at. The ‘Tesla-gate’ accident numbers are real, but they’re just numbers – and probably not much different from other manufacturers’ numbers. They don’t have a story to tell because there is no context, no real metric to gauge them by. And it’s likely to stay that way for quite some time, because car manufacturers don’t want more regulation.

Is that bad? It is, but it’s also a familiar path of progress. And it doesn’t hurt us unless we want it to: We – you and I, the customers – are still literally in the driver’s seat. We decide which features we want to pay for and which we want to use. Dozing assistance should not be among them.

In fact, advanced driver assistance systems are likely to turn us into dangerous zombies. We don’t need a push down that road, do we?

What we do need are fully autonomous cars – sooner rather than later. Given where we were five years ago, we should have been there by now. Fully autonomous. But a big obstacle popped up: humans! (Autonomous Lifesavers).
