Tesla Model X fatality in Silicon Valley had Autopilot turned on


Last week, buried in the news of the Uber fatality, a Tesla Model X was involved in a fatal crash, plowing into the ramp divider on the flyover carpool exit from Highway 101 to Highway 85 in the heart of Silicon Valley. It is literally just a few hundred feet from Microsoft and Google buildings, close to many other SV companies, and just a few miles from Tesla HQ. I take this ramp frequently, as does almost everybody else in the valley. The driver was an Apple programmer, on his way to work. (Update: Possibly to an Apple facility in Sunnyvale, and thus not using the ramp but driving next to it.)

It was just revealed today that Autopilot was turned on.

This was only revealed now because the concrete divider was missing its "crumple barrier" and the Tesla was almost completely destroyed and had a battery fire. They were lucky to get the data.

While they took place in the same week, this is pretty different from the Uber incident. First of all, Tesla's autopilot technology is a very different animal from the full robocar technology tested by Uber, Waymo and others. It's a driver assist technology that requires that the (consumer) driver stay alert all the time. It is, really, a glorified cruise control with lanekeeping ability. There are all sorts of things it doesn't handle, and that Tesla warns customers it doesn't handle. The Uber was a prototype full robocar, designed to handle the situation it failed on, though still a prototype and needing a safety driver.

Even the simplest accidents are never simple, so let's consider the circumstances of this one. First the basic ones:

  1. The Tesla system is known, like most ADAS tools, to not detect big stopped objects in your lane. To radar, by the way, cross traffic like the truck in the Florida fatality counts as stopped. (See the sketch just after this list for why.)
  2. This is an area Tesla employees and cars know super well. Tesla's first release pointed out that Teslas have driven through this lane 20,000 times so far this year with autopilot on. A lot of people in the valley have Teslas.
  3. As a "left exit" it is of course different from the right exits we are used to. This exit is reserved for carpools (and electric cars like Teslas) during rush hour, but this accident was at 9:30am, after rush hour. While there is a long run-up to it, with warning signs, people still find it a bit surprising and stressful when the leftmost of the 2 carpool lanes becomes a forced exit onto 85.
  4. The lane markers are reasonably well painted, but unusual. However, because Teslas drive this stretch so much, the markings should not have been unusual to Tesla's system.
  5. I originally presumed the driver was taking the flyover off-ramp, but other factors suggest he may have been just using the next lane in, which continues on. The car definitely ended up on the right (non-flyover) side of the barrier.
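
To illustrate the first item, here is a toy sketch, in Python, of why this class of radar logic throws away big stopped things, and why crossing traffic looks stopped to it. This is my own illustration with made-up numbers, not Tesla's code:

```python
# Minimal sketch (not any vendor's actual code): ACC-style radar logic commonly
# rejects returns whose speed over the ground is near zero, because the world is
# full of stationary clutter (signs, bridges, parked cars, barriers). A crossing
# truck moves sideways, so its speed *along the radar beam* is also near zero,
# and it gets rejected the same way.

EGO_SPEED = 30.0          # m/s, our own speed (hypothetical value)
STATIONARY_MARGIN = 2.0   # m/s, tolerance for calling a return "stopped"

def keep_radar_target(closing_speed):
    """closing_speed: how fast the return is approaching us, in m/s."""
    ground_speed = EGO_SPEED - closing_speed  # the object's own speed along the beam
    return abs(ground_speed) > STATIONARY_MARGIN

print(keep_radar_target(5.0))    # slower lead car   -> True, tracked
print(keep_radar_target(30.0))   # stopped barrier   -> False, ignored
print(keep_radar_target(29.5))   # crossing truck    -> False, ignored
```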

But these complications make it more unusual:

  1. While the barrier is a fixed object, it's not on the road. The Tesla left its lane and drove into the barrier.
  2. The driver had complained that his autopilot was having trouble at this off-ramp, so much so that his family says he went to the Tesla dealer to complain about it. That's a lot of trouble. His family said he reported it was doing something odd 7 out of 10 times he went by.
  3. At the same time, hundreds of thousands of other Teslas drive this without trouble. If this were happening to other Teslas, with all this publicity, we should have heard reports by now.
  4. If he really was having problems at this exit all the time, it is unclear why he kept using autopilot there, or did not watch it carefully. It seems really unwise.
  5. This would appear to then be a problem with his particular model X, which is a bit hard to explain, though perhaps not entirely impossible.
  6. In particular, Tesla logs say that he ignored several warnings earlier in the ride, one of them an audible alert, telling him to put his hands on the wheel. However, he did put his hands on the wheel 6 seconds before the crash, but removed them, which is fairly normal autopilot operation.
  7. After hitting the barrier, his car ended up on the highway side, not the ramp side, but more severely damaged than any Tesla has ever been, according to Tesla, suggesting a near head-on impact into the start of the divider.

This video made by another Tesla driver offers a disturbing explanation. As an off-ramp splits off from a highway, the lane markers form a V. Eventually they get far enough apart that this can look like a lane. Somehow the Tesla may be seeing that lane and trying to drive in it. It should not -- there is no reason to leave its existing lane, and the lane border is a solid line, not a dashed one, so you should not be crossing it -- but this is a reasonable explanation.

Note in this example the left line is much stronger than the right line, which may be what fooled the Tesla. It really is just a fancy cruise control with lanekeeping, and people need to remember that.

One theory: there is a spot where the lane marker on the left side of the continuing lane has worn away. I show it here in Street View, seen from the exit lane. We might imagine the Tesla lanefinder seeing this and deciding that suddenly the "lane" between the two diverging lines is now its path, heading right for the barrier. Yikes.
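
To make that theory concrete, here is a toy lane-follower of my own (emphatically not Tesla's algorithm): it centers itself between the strongest line it can find on each side. When the continuing lane's left marker fades at the worn spot and the bright gore-side line gets picked up as the "left boundary" instead, the computed lane center walks left toward the barrier:

```python
# A toy illustration of the theory above, with invented positions and confidences.
# Offsets are lateral metres from the car: negative = left, positive = right.

def lane_center(candidates, min_confidence=0.3):
    """candidates: list of (lateral_offset_m, confidence) for detected line segments.
    Returns the midpoint of the best left and best right candidate, or None."""
    left = [c for c in candidates if c[0] < 0 and c[1] >= min_confidence]
    right = [c for c in candidates if c[0] >= 0 and c[1] >= min_confidence]
    if not left or not right:
        return None  # a real system should hold its line or hand back control here
    best_left = max(left, key=lambda c: c[1])
    best_right = max(right, key=lambda c: c[1])
    return (best_left[0] + best_right[0]) / 2.0

# Normal stretch: left marker at -1.8 m, right marker at +1.8 m.
print(lane_center([(-1.8, 0.9), (1.8, 0.8)]))               # 0.0 -> stay centered

# Near the worn spot: the true left marker drops below threshold (0.1), while the
# bright gore-side line, now 4.5 m to the left, is still strong.
print(lane_center([(-4.5, 0.9), (-1.8, 0.1), (1.8, 0.8)]))  # ~ -1.35 -> drift left toward the gore
```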

A commenter has pointed out that another video now shows somebody duplicating it in the same fashion in the exact lane at the 101/85 split. It becomes unclear how it is so easy to duplicate when Tesla claims 200,000 Teslas have gone through this spot on autopilot this year.

Analysis

Tesla is probably right that autopilot makes people safer on average, even factoring in those who misuse it and treat it like a robocar. Still, there are ways Tesla could make fewer people misuse the autopilot. I suggested one in 2016, namely that the car test you from time to time while it is safe to do so, by almost drifting into the next lane (when it's empty, of course) and waiting for you to grab the wheel and correct it. Fail to correct it too many times and you lose Autopilot, first for a day, then a month, then forever.
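
Here is a rough sketch of that idea, with made-up reaction thresholds and penalty durations; a real version would need far more care about when a staged drift is actually safe:

```python
# Sketch of the escalating attention-test policy suggested above. All numbers
# are hypothetical. The car occasionally fakes a gentle drift (only when the
# adjacent lane is confirmed empty) and times how long the driver takes to
# correct it; repeated failures escalate the Autopilot lockout.

import time

PENALTIES_SECONDS = [24 * 3600, 30 * 24 * 3600, float("inf")]  # a day, a month, forever
MAX_CORRECTION_SECONDS = 3.0  # hypothetical: how fast an alert driver should react

class AttentionTester:
    def __init__(self):
        self.failures = 0
        self.locked_until = 0.0

    def autopilot_allowed(self, now=None):
        now = time.time() if now is None else now
        return now >= self.locked_until

    def record_test(self, correction_time_s, now=None):
        """Call after a staged drift: correction_time_s is how long the driver
        took to grab the wheel and correct (float('inf') if they never did)."""
        now = time.time() if now is None else now
        if correction_time_s <= MAX_CORRECTION_SECONDS:
            return  # passed; nothing happens
        penalty = PENALTIES_SECONDS[min(self.failures, len(PENALTIES_SECONDS) - 1)]
        self.failures += 1
        self.locked_until = max(self.locked_until, now + penalty)

tester = AttentionTester()
tester.record_test(1.2)            # attentive driver: no penalty
tester.record_test(float("inf"))   # ignored the drift: lose Autopilot for a day
print(tester.autopilot_allowed())  # False until the penalty expires
```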

Another option has arisen, used by GM SuperCruise and other tools, which is driver gaze monitoring. This week I wrote about the new tools open sourced by MIT professor Lex Fridman which can do that.
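
A minimal sketch of how gaze monitoring could gate the system, assuming some vision model already reports whether the driver's eyes are on the road (my own toy thresholds, not GM's or MIT's code):

```python
# Toy escalation logic for gaze-based driver monitoring. The thresholds are
# invented for illustration only.

OFF_ROAD_WARN_S = 2.0       # hypothetical: visual nag after 2 s of eyes off road
OFF_ROAD_ALARM_S = 4.0      # hypothetical: audible alarm after 4 s
OFF_ROAD_DISENGAGE_S = 8.0  # hypothetical: controlled disengagement after 8 s

def attention_action(seconds_eyes_off_road):
    if seconds_eyes_off_road >= OFF_ROAD_DISENGAGE_S:
        return "slow down and disengage"
    if seconds_eyes_off_road >= OFF_ROAD_ALARM_S:
        return "audible alarm"
    if seconds_eyes_off_road >= OFF_ROAD_WARN_S:
        return "visual warning"
    return "ok"

print(attention_action(1.0))   # ok
print(attention_action(5.0))   # audible alarm
```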

Tesla has resisted such measures. They would make autopilot more annoying. It is possible that if made more annoying, people would use it less, and that would deprive us of the safety benefit it offers. I think that this could be tuned so that the net safety effect is still positive, but it requires research.

I do find it strange that a skilled software developer, who was so bothered by the performance of his autopilot at that off-ramp that he went to the dealership to complain, still kept driving it with the autopilot on, and even drove it without watching the road, doing whatever it is that people like to do in that situation. (Typical things include using one's phone.)

But maybe not too strange. While people always tell me robocars will never happen because people won't trust them, the reverse is actually true. This happened just 5 days after the Uber fatality and the reliability of robocars was on the minds of everybody, and I presume on his mind. Even though Tesla autopilot is a glorified cruise control, he was treating it like a robocar, it seems, even with that warning.

I am baffled to think about how this could happen regularly to just one particular model X. What defect could make one model do this, since they all have similar sensors and software? Could he have had a special test update?

It has been suggested that the diverging lane lines of the off-ramp could at some point look like the lines of a new lane. That would not explain why the car would seek to drive that lane, or how it would make that mistake, since off-ramps are hardly an unusual thing.

I sure hope news of this sort stops coming, as soon as it can!

Comments

So we had a Tesla on autopilot in Florida in August 2016 fatally crash into a semi truck without any attempt to slow down; we had a Tesla on autopilot in southern California on Jan. 22, 2018 crash, without injury, into a fire truck without any apparent attempt to slow down; http://abc7.com/traffic/2-federal-agencies-investigate-tesla-crash-in-culver-city/2985342/
we had a Tesla on autopilot in northern California on March 23, 2018 fatally crash into a concrete barrier without any apparent attempt to slow down; and an Uber in autonomous mode in Arizona on March 18, 2018 fatally strike a pedestrian without any apparent attempt to slow down.

There seems to be room for improvement in obstacle detection. I don't think you would want a toddler walking into a roadway in front of cars that can't detect semi trucks, fire trucks, concrete barriers, and adult pedestrians walking bicycles.

And Tesla says, in all its documents, that it does not see this sort of thing, but a subset of people keep refusing to believe them.

Unless you argue that autopilots are an inherently wrong thing, they will have lists of things they do not detect, and are not expected to.

do they see small things but not big things?
do they see pedestrians but not concrete?
do they see toddlers but not trucks?
do they see a mother duck with ducklings?

I think the Tesla manual has some specific examples, but it's an autopilot, not a robocar. So it has an arbitrarily long list of things it doesn't see. But high on the list are stationary objects and cross traffic. Also varies between the MobilEye based old autopilot and the newer one.

If I go to get a drivers license and tell the DMV that high on the list of objects I can't see are stationary objects and cross traffic, I doubt they want me anywhere near the driver's seat. Is there some special exception for pedestrian cross traffic?
If a deer is frozen in the headlights, do you crash?
A moose? a dog?

It's a cruise control with lane keeping. Of course there is tons of stuff it doesn't see. The problem is it got too good, started detecting too much, and fooled some people into forgetting what it is. Your question about the licence test is wrong. It's like telling the tester "with my eyes closed I can only feel the road and hear other cars". Don't close your eyes then!

A basic adaptive cruise control, when they first came out, saw only moving cars in certain speed ranges. They could not see stopped cars. Could not and still can't see all sorts of things on city streets. Should they have been banned?

A basic adaptive cruise control is different from a Tesla autopilot, because the driver isn't tempted to take his hands off the wheel or stop paying attention to the road. So it doesn't really matter if such a cruise control can detect pedestrians and other obstacles. But the Tesla autopilot is in a no-man's land where the driver is tempted to give up control, but the autopilot isn't really fully capable of taking it.

This is the paradox. The better you make it, the more tempting it is to abuse. Tesla says, and NTSB agrees, that at present they are still a win -- people are overall safer, even including the people like this guy who misuse it. As in, he didn't look at the road and died, but more people who would have died because they didn't have autopilot looking over their shoulder lived.

However, it does seem that there is a path to something even better.

"Unless you argue that autopilots are an inherently wrong thing"

I wouldn't go that far, but it seems to me that a limited-use robocar (e.g. one which works in autonomous mode only on limited access highways and in stop-and-go traffic) would be a lot more useful than an autopilot which doesn't allow you to take your hands off the wheel and your eyes off the road.

That is a controversial product. People like to call it a "level 3" car -- I think the levels are a stupid idea, so I call it a "standby" car -- which can't handle all roads, so it hands off to you, with warning, when it is entering something it can't handle. Some feel this is a dangerous idea. Waymo decided it was, and decided not to build it. Even so, Audi is selling it and a few others will soon.

As long as there's a safe place to pull over if the driver doesn't take control, I don't see how it's particularly dangerous. However, part of my point is that it's *less* dangerous than a car which requires you to monitor it constantly even though it does the right thing the vast majority of the time. I really don't see the point of that. It's *harder* to hover over the steering wheel getting ready to take control than to just have a hand on the wheel and follow the path of the road. Following a lane is easy, if you've gotta keep your eyes on the road anyway. Adaptive cruise control is nice, though, for the same reasons cruise control is nice (maintaining a constant speed isn't always easy) and then some (one issue with normal cruise control is constantly resetting it due to traffic).

For stop-and-go traffic, adaptive cruise control with some sort of stop-and-go feature is probably enough. But for long trips it'd be nice to have a car which could handle the interstates while I read a book or something.

For the rest, I do look forward to the future of completely driverless vehicles. I think we're several years away from a consumer-ready product (to own or even just to rent), though. Hopefully I'll be pleasantly surprised.

Florida crash was in May, not August.

What was the speed before the accident happened? Was automatic braking used? Was update 2018.10.4 installed? Tesla has no record of the driver complaining about autopilot at this location.

Also: a non-injury Tesla crash into a concrete construction divider, March 2017.
http://bgr.com/2017/03/02/tesla-crash-video-texas/

In that case, the Tesla crashes into a construction divider which is suddenly intruding into where the lane used to be, and a car in front of you is hiding that until it suddenly turns to reveal the hazard. We know the Tesla doesn't handle that.

This one is a bit odd, because there are clearly marked lanes that have been there for a long time and are well tested by Tesla, and somehow, rather than staying in its lane and missing the barrier entirely, the car appears to have just randomly departed its lane to drive into it.

I know very little about Tesla's system. I assume it does not use detailed maps that would have identified the ramp end as a no-go area? Also, the line dividers separating the two lanes are so long they actually start to look like another lane. The design of the ramp end looks totally unforgiving if the crumple barrier was missing (being repaired?). Many places would put temporary water-filled barriers or a large sign well out in front of the concrete end until such a safety feature was replaced. Not sure if they had any warning system in place?

One thing both Tesla and Uber have in common as companies: both seem to have a culture of entrepreneurial risk taking. It can make them both successful and fast moving but can also be a curse when it comes to safety.

"[Tesla's technology is] a driver assist technology that requires that the (consumer) driver stay alert all the time."

As was Uber's, even if Uber tried to pretend otherwise.

Yes, Uber's and everybody else's still need monitoring, except Waymo's. Uber wasn't pretending it didn't need monitoring; it did pay somebody to monitor it. She just failed. Now we're debating whether she failed by her own error, or because her work situation was doomed to fail.

I still drive a dumb car, so maybe this is already a thing, but what I'd like is a live display of the current planned route overlaid on a forward-camera view. Heads-up would be nicer, but I could live with the off axis LCD screen in the center console. The problem with an auto-pilot (or a human driver, for that matter) is you don't know if they are planning to take corrective action or not until they take it - and it might be too late for a plan-B if they don't. With a human driver, I can say "cyclist, right" and get back a quick "got him" and feel that things are mostly right with the world. It would similarly be nice if my smart cruise control had a 5-second path projected so I had some inkling as to whether it was going to barrel into a concrete barrier or not.
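
Roughly what I have in mind, as a toy sketch with made-up camera numbers: take the planned positions for the next five seconds and project them onto the forward camera image so they can be drawn over the video.

```python
# Hypothetical pinhole-camera projection of a planned path onto a forward
# camera image. Camera parameters and the path are all invented for illustration.

FOCAL_PX = 1000.0          # assumed focal length in pixels
IMG_W, IMG_H = 1280, 720   # assumed camera resolution
CAM_HEIGHT = 1.4           # assumed camera height above the road, in metres

def project_path(planned_points):
    """planned_points: list of (forward_m, left_m) positions on the road plane.
    Returns (x, y) pixel coordinates for points that land inside the image."""
    pixels = []
    for forward, left in planned_points:
        if forward <= 0.1:
            continue  # behind or at the camera
        x = IMG_W / 2 - FOCAL_PX * left / forward        # leftward offsets move x left
        y = IMG_H / 2 + FOCAL_PX * CAM_HEIGHT / forward  # the road plane sits below the camera
        if 0 <= x < IMG_W and 0 <= y < IMG_H:
            pixels.append((round(x), round(y)))
    return pixels

# Planned path: dead straight ahead for 5 seconds at ~30 m/s.
straight = [(t * 30.0, 0.0) for t in (1, 2, 3, 4, 5)]
print(project_path(straight))
```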

"Tesla is probably right that autopilot makes people safer on average, even factoring in those who misuse it, and treat it like a robocar."

Are there any statistics on that subject?

Tesla cites them often, and the NTSB cited some in its report on the first fatality in Florida.

Essentially, Tesla looks at the average fatality and accident rates for ordinary non-autopilot driving, and says that the rate for Tesla driving with autopilot (including the crashes while using autopilot) is lower. They claim this is because drivers using autopilot properly are two times safer than those who drive with no autopilot; while the smaller population of people using autopilot improperly have a higher rate, they are a small enough group that the overall numbers still come out ahead.
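
To see the shape of the argument, here is some purely illustrative arithmetic with invented numbers (not Tesla's actual figures):

```python
# Illustrative only: suppose the ordinary fleet has 1.0 crash per million miles,
# 90% of autopilot users use it properly at half that rate, and 10% misuse it
# at three times that rate.
baseline = 1.0                        # crashes per million miles, non-autopilot
proper_share, proper_rate = 0.9, 0.5
misuse_share, misuse_rate = 0.1, 3.0

blended = proper_share * proper_rate + misuse_share * misuse_rate
print(blended)             # 0.75 -> still below the 1.0 baseline
print(blended < baseline)  # True: the misusers raise the average but don't erase the win
```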

The driver did indeed work for Apple but his office was located in Sunnyvale. The correct route to that location would be to stay southbound on 101. I haven't seen anything that definitely identified his destination but it is just as likely (if not more so) that he wanted to stay on 101.

Does not change a lot. In neither lane should the car decide to leave its lane and drive into the barrier. Left exits are unusual, but hardly unheard of, and as noted, being just a few miles from Tesla HQ, Tesla developers probably drive this left exit quite a lot. (Along with the one on 237 on the way to the Tesla factory.)

There are two carpool lanes that end at this split; during carpool hours the leftmost lane (#1) seems to be preferred. I certainly prefer it even when I am going to stay on 101 south, because of the speed differential between the cars in the carpool lane. The accident occurred outside of carpool hours, so I'm not sure if this was a factor, but it could explain why the driver might choose the #1 lane even though he should be familiar with the area and know of the split.

I used to drive over that ramp a lot when I worked in Palo Alto, and even though I knew about the split I often found it surprising. The way the lines and pavement didn't match was not helpful. It's pretty clear that many drivers are surprised as well. Google Street View cameras seem to regularly catch drivers driving in the gore point. I found it interesting that Tesla mentioned the driver had set the following distance to "1" (the minimum). Wild speculation follows: I wonder if the Model X was following a car that entered the gore point, and that lead car realized what was going wrong and corrected itself quickly, but the sudden correction made the Model X stop tracking that car. Now the Model X needed to find the lane on its own, and it incorrectly "thought" it was inside a lane (a white line on either side of it). Now it was doomed; the radar is blind to the stationary object that was in front of it. It just keeps driving in its "lane" until it runs into the wall.

I don't know quite how Tesla balances following a vehicle vs. following lane lines. In the past we've seen some crashes where a Tesla was following a vehicle, the vehicle then veered right to follow newly painted lane lines, and the Tesla neither followed the vehicle nor the new lines and crashed into the barrier or some other obstacle. So I am not sure it follows vehicles, or at least it did not in the past, other than to get what speed to travel at.

The other issue is that this sort of situation is common at every off ramp. They all look like this except that this one goes to the left. And people are always making last minute quick changes at off ramps, too.

The story we get is that he went to Tesla and complained his car was veering at this ramp a lot, not just in some special situation. I still don't know why he kept autopilot on there. And he adjusted the wheel 6 seconds before, so I don't get why he wasn't watching. You are more likely to keep watching if taking the off-ramp than if just continuing on.

I don't know if Tesla recovered all the data so we may not learn more details. At this point, however, Tesla is presumably having their cars drive this area repeatedly recording what their systems are deciding to see if anything odd is showing up, so even if they don't recover the data they can hopefully reproduce the problem.

Tesla is not very map based but it does have maps of off ramp locations from what I understand.

This video gives a lot of support to your theory. While Tesla still can fall back on the idea that it is not a complete system and does not handle everything, this is a serious flaw. It's the kind of flaw you partly solve with maps -- Tesla does do some mapping -- in that you tweak the lane finding code in those regions to not be fooled in this way.

In this case, the left line is much brighter than the right one, which even a human might miss for a while. The stripes on the non-lane, however, should be scaring it away; such stripes are not present at the 101 off-ramp.

In the long run, this is also why all real robocars don't just drive with lanefinding; they use detailed maps to always know where they are, and not knowing where they are is considered an intervention requirement.
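
As a toy sketch of what that means in practice (invented thresholds, nobody's real system):

```python
# A mapped robocar continuously checks how well perception matches its map and
# how tight its position estimate is. If either degrades, that by itself is
# treated as a reason to request an intervention rather than trusting live
# lane-finding. Thresholds here are hypothetical.

MIN_MAP_MATCH = 0.7         # hypothetical confidence threshold
MAX_POSITION_ERROR_M = 0.5  # hypothetical localization tolerance

def needs_intervention(map_match_score, position_error_m):
    """True when the car can no longer prove it knows where it is."""
    return map_match_score < MIN_MAP_MATCH or position_error_m > MAX_POSITION_ERROR_M

print(needs_intervention(0.95, 0.1))  # well localized -> keep driving
print(needs_intervention(0.40, 0.2))  # map disagrees with perception -> intervene
```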

Someone reproduced the AP issue at the same spot, and it aligns with your theory:

https://www.youtube.com/watch?v=VVJSjeHDvfY
