Teslas probably aren't safer when on Autopilot, so do they need driver monitoring?


For some time, Tesla has published numbers suggesting that driving is safer with Autopilot than without it, in that cars have fewer accidents per mile with Autopilot on than with it off. The problem is that Autopilot is mostly engaged on the highway, where driving is safer, so this would naturally be the case.

Some new data suggests that it's actually modestly less safe or at best a wash.

Driver monitoring would probably change that, but Tesla resists that. I discuss the issues in a new Forbes.com article at Teslas probably aren't safer when on Autopilot, so do they need driver monitoring?

Comments

Elon is a lot smarter than I am (and than the rest of the population), but it's been obvious for a long time now that he really blew the decision to stick with no Driver Monitoring System (DMS). His main argument was that all Teslas would soon be L4/L5 autonomous and the human driver would become superfluous, so why bother with a DMS? That was also a poor prediction. Another of his arguments was that no good DMS existed. I don't think this should have stopped him from making one. George Hotz says it wasn't difficult to work up the software for it in a week.

The driver-facing camera inside the Model 3 is not IR, so it probably wouldn't work well at night or with sunglasses.

Elon blew it. If he had done the right thing from the start, Tesla owners could have a better driving experience right now, driving hands-free (but eyes-on), and wouldn't have the steering wheel nag going on.

It is an interesting question. If hands are off it will take a little longer for them to return to grab the wheel. But probably not a lot. This has been tested but I don't have the data handy.

If somebody other than Tesla did it, I would expect the gaze nagging to be as annoying as the hand nagging. But Tesla (and only Tesla) might let the driver set the nag level.

As I noted, I want lane departure warning but Tesla's is really loud and really annoying when I do harmless departures on empty roads.

Doesn't this analysis assume that the distribution of accidents per mile is the same for on-freeway and off-freeway driving? What justification do you have for this assumption?

You said it yourself, the actual data isn't available. What's the point of this article if it relies on such a blunt estimate, especially since the paper wasn't even published yet?

What I am assuming is that accidents per mile are 3 times more frequent off-freeway. Fatal accidents are 3 times more frequent off-freeway, and if I were to hazard a guess, the higher speeds would make a higher fraction of highway accidents fatal, making the all-accident ratio even more than 3 times worse off-highway.

It would be nice to get precise data. However, I claim it is a reasonable assumption that freeways have fewer accidents per mile than regular roads by a good margin, in the range of 2 to 5 times. If you find a better figure I would be interested to see it and use it. Due to the faster speeds, this corresponds to a much more similar rate of accidents per hour of driving, which also makes some sense.
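As a rough illustration of the per-hour point (the speeds here are my own assumed figures, not data from the article):

```python
# If off-freeway driving has 3x the accidents per mile, but freeway
# speeds are roughly twice surface-street speeds, the gap in
# accidents per HOUR is much smaller.
freeway_mph = 65.0      # assumed typical freeway speed
surface_mph = 30.0      # assumed typical surface-street speed
per_mile_ratio = 3.0    # off-freeway vs freeway accidents per mile

# accidents/hour = accidents/mile * miles/hour
per_hour_ratio = per_mile_ratio * surface_mph / freeway_mph
print(round(per_hour_ratio, 2))  # ~1.38, much closer to parity
```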

I don't care enough about this to debate you on each of your sloppy assumptions, so let me ask a more general question: why did you write this article based on a paper that hasn't been published yet?

The data was presented at the AVS conference. I had been waiting for these numbers to analyse Tesla's numbers, so I wrote the article once I had access to the data in the MIT paper.

How come the name of this blog post is "Teslas probably aren't safer when on Autopilot, so do they need driver monitoring?" but the Forbes article is titled "Teslas Aren't Safer On Autopilot, So Researchers Calling For Driver Monitoring May Be Right"?
Considering all the assumptions, it's a bit of a leap to drop the "probably" - better clickbait, though.

What matters is which driver is at fault for the accidents. If the other driver is overwhelmingly at fault in accidents involving Teslas, then I would argue these assumptions are pretty much incorrect.

Wow - what a definitive title for almost entirely presumptive data and analysis (I lost track of the number of assumptions). The only thing this article did a great job of was convincing me that this author and editor are more interested in clickbait than in providing useful information on a pretty serious topic. Whatever puts food on the table, I guess.

Every statement in the title is true.

"Teslas probably aren't safer when on autopilot". You haven't provided any evidence regarding the probability of Teslas being safer on autopilot, despite you claiming otherwise.

Yes, I have. I have provided the math that shows it, with the very reasonable presumption that accidents are less common per mile on highways, as pretty much everybody accepts as true. If you don't accept that as true, what data do you have on that?

I don't think you should say that a product is less safe because people intentionally choose not to use it properly.

Driver monitoring systems? Only if you're allowed to turn them off. And the people who don't use them shouldn't have to pay for them.

I'm cool with that, though it would probably create more liability for you if you turned it off but the logs show the car wanted to alert you that you weren't paying attention and you had disabled the alert. If you live, anyway, which you do most of the time.

The question is, to what extent are people choosing to use it incorrectly and how many are falling into that pattern without conscious choice? In the case of the Mountain View Fatality, this driver knew the car had a problem at that off-ramp and complained about it. Yet, somehow he fell into not looking. It is speculated that he was playing a game, but we can't be 100% sure of that. (He was playing it earlier in the trip.)

And then there is the question of hitting other people. As I said, if you hit somebody because you turned off the monitoring, I suspect you're going to face a big lawsuit for deliberate negligence.

Are you using that as a legal term? I don’t think it applies. You would, of course, be guilty of negligence.

Sure, the question is what extent people are intentionally not using the system properly vs. the extent to which the system is causing the problem. That’s a very difficult thing to measure.

Tesla should be found liable for the Mountain View Fatality, in my opinion. But that was several years ago, and it's one incident. Most of the others would have happened just as easily on cruise control, and we don't blame cars for having cruise control when people don't pay attention to the road. (Does cruise control increase risk? Should cars that offer cruise control be forced to have driver monitoring?)

Turning it off should mean no logs of that sort are produced.

Negligence occurs when you don't meet the standard of care. You can fail to meet it with or without intent. If there is a system to warn you when you are not meeting the standard, and you turn it off and then don't meet it, I think there's a case for willful negligence.

Any case law for that?

I'm skeptical of that Mountain View crash. As I understand it, the car came to a split in the road and crashed into the divider at 70mph. That simply is not how AP works. AP would have picked a lane and stuck with it. I think the driver noticed at the last second AP was taking him the wrong way, tried to correct, and was either too late or lost control. Driver error with tragic results.

What happened was that autopilot decided the "gore" of the off-ramp was a lane, and it was the lane it should drive in -- straight into the crash barrier. This has been extensively worked out from the car's logs, it is not in question.

The car had even done this before. The driver complained about it. But this time, he didn't grab the wheel and correct it.

I had not heard those details before. Yikes.

Agree that it's hard to conclude the exact difference in safety. However, if Tesla AP is not clearly worse than human drivers, then why impose any additional burden? Especially since the system is improving almost every quarter. Otherwise, why not argue for OEMs to limit top speed to 85 or block cell signals, since misuse there adds risk as well?

While the new FSD rewrite may not deliver L5 autonomy, it almost certainly will cut down on the sensational incidents due to enhanced vision capability. And with that, I think the noise around AP monitoring will settle down somewhat.

Do you have real evidence that Tesla Autopilot is getting better (safer) every quarter? They can't seem to stop running into stationary objects in the roadway, or partially in the roadway. Last week another California State Trooper's SUV just got smashed into. Luckily nobody died or was seriously injured.

Well, their own stats show that, plus I use it and see the improvement. I agree that category of crashes is a problem, but it's very rare, and I believe it will be much closer to solved with the coming rewrite of their vision processing technology. But time will tell on that.

Generally, in the automotive safety community, the feeling is if you can take steps to easily avoid accidents and deaths, you should, unless the burden is high.

Driver monitoring is now a well established technology. There is free open source software available to do it. Teslas, especially those shipped in the last year, all have more than enough resources to do it. Competitors to Autopilot such as GM super cruise do it. Tesla has no technical excuse not to do it.

I would actually prefer my car let me take my hands off the wheel if I keep my eyes on the road. A number of others have expressed that view, so that's another reason for Tesla to do it.

It is certainly true that a number of Tesla accidents, and probably all or most of their fatalities would have been prevented by this tech. This also seems like something Tesla would want, as would those affected by the accidents.

So why would they not do it? They could do it and let you tune it or turn it off, for example.

Why do you think they aren't doing it?

I think because it's a lot tougher than what you just said to make a driver monitoring system that is safe, effective, and not burdensome. You can't just slap open source software into the current vehicles and have such a system. It would be expensive, it would be flawed, it would be annoying, there's no evidence it would reduce crashes, and perhaps most importantly, it'd divert energy from the real effort: eliminating the need for driver monitoring in the first place.

Despite all of the verbiage in the article, the fact of the matter is that most non-Tesla drivers, probably including this author, do not understand that the Tesla Owner's Manual instructs the driver to keep their hands on the wheel and be prepared to take over at any time, which means the driver is at ALL times 100% responsible for the vehicle's actions. The fact that some people do not read the Owner's Manual, or read it but do not properly understand it, is not a failing of Tesla.

See my reply to the next comment below this one. It was meant to follow this one.

I have driven my Tesla for many miles in autopilot, and am familiar with what the manual says, and nothing in the article is at odds with that.

Can you help me with the calculations the article is making to show autopilot is less safe than driving without it? You assume the following: accidents are 3 times less likely on highways, autopilot is used 94% on highway, and non-autopilot is used 40% on highways. Then we get these numbers:

"Of the 2.1M miles between accidents in manual mode, 880,000 would be on freeway and 1.2M off of it. For the 3.07M miles in autopilot, 2.9M would be on freeway and just 192,000 off of it. So the manual record is roughly one accident per 1.5M miles off-freeway and per 4.5M miles on-freeway. But the Autopilot record ballparks to 1.1M miles between accidents off freeway and 3.5M on-freeway."

No clue how they got to manual drivers getting one accident per 4.5M miles on freeway and autopilot getting one accident per 3.5M miles on freeway. It's not clear in the article.

If I take their stats above and apply it to the mileage breakdown from:
https://lexfridman.com/tesla-autopilot-miles-and-vehicles/
Tesla cars have driven 22,500M miles total (3,324M on autopilot and 19,176M off autopilot). Using the Q4 2019 safety stats:

Autopilot estimated highway miles = 3,324M miles X 94% highway = 3124M highway miles
Autopilot estimated accidents = 3,324M miles / 3.07M miles per accident = 1017 accidents
Autopilot highway accidents = 1017 / 4 = 254.25 accidents (3 times less likely per article)
Autopilot highway safety stat = 3124M highway miles / 254.25 = 12.3M miles per accident

Manual estimated highway miles = 19,176M miles x 40% highway = 7,670M highway miles
Manual estimated accidents = 19,176M miles / 2.1M miles per accident = 9131 accidents
Manual highway accidents = 9131 / 4 = 2282.75 accidents (3 times less likely per article)
Manual highway safety stat = 7,670M highway miles / 2282.75 = 3.4M miles per accident

What you see is that if you drive 2.9M miles on freeway at 1 accident per 3.5M miles and 192K miles off-freeway at 1 accident per 1.1M miles, you will have one accident per 3.07M miles, as Tesla reports. A similar calculation yields the manual-mode numbers.
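To make the back-calculation concrete, here is a minimal Python sketch. The 3:1 per-mile ratio is the article's assumption; I use 42% as the manual freeway fraction, since that matches the 880,000-of-2.1M split quoted above, and the helper function is my own construction.

```python
def split_rate(miles_per_accident, freeway_fraction, off_to_on_ratio=3.0):
    """Back out separate on- and off-freeway miles-between-accidents
    from one overall figure, assuming off-freeway accidents are
    off_to_on_ratio times more frequent per mile.

    Solves f/(r*x) + (1-f)/x = 1/M for x, where x is off-freeway
    miles per accident and r*x is on-freeway miles per accident."""
    f = freeway_fraction
    off = miles_per_accident * (f / off_to_on_ratio + (1.0 - f))
    return off_to_on_ratio * off, off  # (on-freeway, off-freeway)

# Figures in millions of miles, from Tesla's numbers as used in the
# article: manual 2.1M between accidents (~42% on freeway), and
# Autopilot 3.07M between accidents (94% on freeway).
manual_on, manual_off = split_rate(2.1, 0.42)   # ~4.5M and ~1.5M
ap_on, ap_off = split_rate(3.07, 0.94)          # ~3.4M and ~1.1M
```

Dividing each mode's freeway miles by its freeway rate and its off-freeway miles by its off-freeway rate recovers exactly one accident per 2.1M (manual) and 3.07M (Autopilot) miles, which is the consistency check described above.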

If the driver must be ready at all times to intervene, then what is the point of autopilot? Especially since this requires that he realize autopilot is going to screw up soon enough that he has time to intervene.

Well, the best way to figure that out is to drive with it. People actually do like it, and most use it safely. They pay a fair bit of money for it, in fact. It actually does make driving more relaxing, a bit like driving on a completely straight road with adaptive cruise control. You still do the driving in your head, but your body doesn't have to carry it out, and that turns out to be nice.

As long as hundreds of people are not dying...

I don't really see why it's anyone's business.

We don't fault trucks and SUVs for being much more deadly than cars. We don't fault motorcycles for being death machines. We don't freak out about cars lacking advanced accident avoidance tech. These things are all much more deadly.

Should cars that offer cruise control be forced to have driver monitoring?

I think eventually, if we don't have completely self-driving vehicles first, all cars will be required to have driver monitoring. I don't see any reason to single Tesla out, though, and I think we'll probably have vehicles that are completely self-driving before driver monitoring becomes a proven, cost-effective technology (on the level of airbags).

At some point self-driving technologies will be like airbags: mandatory on all new vehicles. And that'll save far more lives than driver monitoring.

You can't take your eyes off cruise control for more than a short time before you leave your lane. There has been no risk of losing attention, beyond what happens to all drivers.

The self-driving tech would of course save more lives. That doesn't mean the driver monitor would not also save some.

I'm not sure what you mean by that second paragraph. If you have self-driving technologies that don't require driver attention, you think adding driver monitoring would save lives?

I think it'd cause more deaths, because fewer people would choose to adopt the self-driving technology. The only reason Tesla's FSD package is arguably worth the price is the promise that eventually you won't have to pay attention.

You can take your eyes mostly off the road for a very long time with cruise control if you glance up every once in a while. I'm not sure how that's relevant, though, because this is almost always a conscious decision.

You say there "has been no risk of losing attention, beyond what happens to all drivers." Has that been studied? Having used cruise control, and experienced an increased tendency to lose attention, which I have to actively counteract, I highly doubt that.

You say "beyond what happens to all drivers." So why not apply it to all drivers? If driver monitoring systems are cheap, safe, effective, and not burdensome, why not put them in every vehicle?

Driver monitoring makes sense if:

  • Your technology works well enough that drivers will be tempted, or in particular may unconsciously drift into not paying attention to the road
  • You can do it well

Cruise control is not enough of a self-driving system that you are likely to zone out. Tesla autopilot is good enough that it increases the chances you will either zone out or deliberately ignore the road.

You can't take your eyes off the road for a minute with cruise control. You will quickly drift out of your lane. If your wheels are perfectly aligned and the road is perfectly straight, you might go for slightly longer, but not much.

Note that Tesla has driver monitoring in the wheel torque. It's not a question of whether Teslas have driver monitoring, it's a question about how effective it is at the real goal of reminding you to keep eyes on the road.

Driver monitoring makes sense if it's cheap and it saves lives. If that's true, every car should have it, not just Teslas.

Again, 1.2 million crashes a year, the vast majority of which are not in Teslas.

You can play a game on your phone for a long time while on cruise control, or even while not on cruise control. Yes, you'll glance up from time to time, but you're still greatly increasing the risk of a crash.

Driver inattentiveness doesn't consist solely of closing your eyes for minutes at a time (and there's no evidence that using autopilot unconsciously causes people to do that anyway).

If Autopilot is used as it should be, under the conditions you sign off on before using it, then all would be good! If some idiots choose to abuse the rules, then they face the consequences. I know which car I would rather carry precious cargo, grandchildren, in, and it is not made by ANY of the legacy manufacturers; it is made by Tesla. Maybe researchers should put their efforts into making other cars as safe as a Tesla!

If you really don't want the driver monitoring, you should be able to turn it off.

I suspect there are people getting into accidents not because they deliberately ignore the road, but because their attention wanders.

1.2 million crashes a year are caused by driver inattention. The only reason that number isn't higher is because the vast majority of the time when you aren't paying attention, you don't crash. Most people fail to pay proper attention to the road multiple times every day. If someone or something suddenly appeared in front of them, and they didn't have an adequate collision avoidance system, they'd crash.

Unless you want driver monitoring in every car, it's not enough to say that people driving Teslas get into crashes because their attention wanders. I think you'd have to show that something in the Tesla causes their attention to wander, and that the danger of this exceeds the benefit of the collision avoidance system that usually works.

You say "Teslas probably aren't safer when they're on autopilot," but you haven't shown that. But more importantly, you haven't shown that Teslas probably aren't safer when they're on autopilot and are used properly.

Whether I'd turn off the driver monitoring or leave it on would depend on how good it was. I suspect it wouldn't be very good, in that it would be overly nagging (just like the steering wheel nag). But more likely, it'd be impossible to turn it off, just like the steering wheel nag.

I'd definitely turn off the steering wheel nag, except probably when driving late at night.

In addition to turning off driver monitoring, you should be allowed to not pay for its development. But neither would happen. Tesla would be sued into oblivion if they spent the money to build a driver monitoring system that was safe and effective and then let people turn it off. If a safety feature is cheap, safe, and effective (like airbags), manufacturers are required to provide them. https://en.m.wikipedia.org/wiki/Calculus_of_negligence Tesla's best argument for why they don't provide driver monitoring is that it's too expensive compared to the effectiveness (it's probably relatively safe, though some arguments could be made on that front). And that's not just Tesla's argument. If driver monitoring is proven to be cheap, safe, and effective, all car manufacturers would be forced to implement it and take steps to make it difficult for people to turn it off.
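For reference, the linked "calculus of negligence" is the Learned Hand formula: omitting a precaution looks negligent when its burden B is less than the probability P of the loss times the loss's magnitude L. A toy sketch (the dollar figures here are invented purely for illustration):

```python
def hand_formula_suggests_liability(burden, probability, loss):
    """Learned Hand formula: omitting a precaution looks negligent
    when the burden of taking it is less than the expected loss it
    would prevent (B < P * L)."""
    return burden < probability * loss

# Invented numbers: a $200-per-car monitoring system that averts a
# 0.1% chance of a $1,000,000 loss would satisfy B < P * L.
print(hand_formula_suggests_liability(200, 0.001, 1_000_000))  # True
```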

I have shown that, by Tesla's own numbers, Teslas driven across all roads have accidents slightly more frequently when they have Autopilot on than when they have it off.

Some assumptions are needed but they are highly reasonable assumptions.

I expect many people would set driver monitoring to a threshold so that it gives false positives rarely.

Tesla allows you to set the threshold for forward collision warning, for example. It tends to give a moderate number of false positives. I would prefer fewer, but I don't reduce its level. I would like it to be better at lane departure warning, though. It gives too many false positives there, but that is a much harder problem. Figuring out whether I intended to drift over the line (which I do) requires a full understanding of the nature of the road, and of me, to know which sorts of roads I would drive a racing line on.

What have you shown? The article has a bunch of guesses, most of them unfounded. You even admit at the very beginning of the article that you have no clue what the numbers even are measuring. How can you say what the expected ratio between the two numbers is if you have no clue what the numbers are measuring?

(I can't tell what your first sentence is trying to say, but I don't see how you've shown anything other than the fact that Tesla's numbers are meaningless.)

Hoping that Tesla FSD is about to foom!

https://electrek.co/2020/08/14/elon-musk-tesla-full-self-driving-quantum-leap-new-rewrite/

Remember when I said that their current flaws have no bearing on what the final product is going to look like, because everything is going to be rewritten?

I don't know if this rewrite is going to be good enough to take them all the way to government-approved driverless. Probably not. But I do believe them when they say it's going to be leaps and bounds ahead.

Tesla's greatest resource is not their codebase. It's the billions of real world test miles that people drive for them for free, and the knowledge of their software engineers that hasn't been encapsulated into code.

The stock market sure does like it.

I hope you have a post about this.
