Tesla Model X fatality in Silicon Valley had Autopilot turned on
Last week, buried in the news of the Uber fatality, a Tesla Model X crashed fatally, plowing into the ramp divider on the flyover carpool exit from Highway 101 to Highway 85 in the heart of Silicon Valley. It happened literally just a few hundred feet from Microsoft and Google buildings, close to many other SV companies, and just a few miles from Tesla HQ. I take this ramp frequently, as does almost everybody else in the valley. The driver was an Apple programmer on his way to work. (Update: possibly to an Apple facility in Sunnyvale, and thus not taking the ramp but driving in the lane next to it.)
It was revealed today that Autopilot was turned on.

This was only revealed now because the concrete divider was missing its "crumple barrier" (crash attenuator), so the Tesla was almost completely destroyed and suffered a battery fire. They were lucky to recover the data at all.
While they took place in the same week, this is pretty different from the Uber incident. First of all, Tesla's Autopilot technology is a very different animal from the full robocar technology tested by Uber, Waymo and others. It's a driver-assist technology that requires the (consumer) driver to stay alert at all times. It is, really, a glorified cruise control with lanekeeping ability. There are all sorts of things it doesn't handle, and that Tesla warns customers it doesn't handle. The Uber vehicle was a prototype full robocar, designed to handle the very situation it failed on, though still a prototype that needed a safety driver.
Even the simplest accidents are never simple, so let's consider the circumstances of this one. First, the basic ones:
- The Tesla system is known, like most ADAS tools, not to detect big stopped objects in your lane. To radar, by the way, cross traffic like the truck in the Florida fatality also counts as stopped. (A sketch of why radar-based systems filter these out follows this list.)
- This is an area Tesla employees and cars know very well. Tesla's first release pointed out that Teslas have driven through this lane about 20,000 times so far this year with Autopilot on. A lot of people in the valley have Teslas.
- As a "left exit" it is of course different from the right exits we are used to. This exit is reserved to carpools (and electric cars like Teslas) during rush hour, but this accident was at 9:30am, after rush hour. While there is a long run up to it, with warning signs, people still find it a bit surprising and stressful when the leftmost of the 2 carpool lanes becomes a forced exit onto 85.
- The lane markers are reasonably well painted, but unusual. However, because Teslas drive this stretch so much, the markings should not have been unfamiliar to Tesla's system.
- I originally presumed the driver was taking the flyover off-ramp, but other factors suggest he may have just been using the next lane over, which continues on. The car definitely ended up on the right (non-flyover) side of the barrier.
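Why would a system not brake for a big stationary object? The usual reason is a deliberate radar filter. Automotive radar measures closing speed very accurately, but the radar in these cars has poor angular resolution, so a stopped car in your lane is hard to distinguish from an overhead sign or roadside clutter. To avoid constant phantom braking, such systems commonly discard returns whose own ground speed is near zero. Here is a minimal sketch of that idea; the function name and thresholds are my own inventions for illustration, not from any vendor:

```python
# Illustrative only: a naive "moving targets only" radar filter of the
# kind commonly used in adaptive cruise control. Names and thresholds
# are invented for this sketch.

EGO_SPEED_MPS = 30.0  # our own speed along the road, ~108 km/h

def target_is_tracked(closing_speed_mps, min_ground_speed_mps=2.0):
    """Keep a radar return only if the target is itself moving along our
    direction of travel. closing_speed is positive when the gap shrinks."""
    target_ground_speed = EGO_SPEED_MPS - closing_speed_mps
    return abs(target_ground_speed) >= min_ground_speed_mps

print(target_is_tracked(10.0))  # True: slower car ahead, moving at 20 m/s
print(target_is_tracked(30.0))  # False: concrete barrier, ground speed 0
# A crossing truck also has ~zero speed along our axis of travel, so it
# reads as "stationary" and gets dropped too -- the Florida failure mode.
```

This is why such systems brake reliably for a slowing car ahead but can sail right into a stopped one.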
But these complications make the incident more unusual:
- While the barrier is a fixed object, it's not on the road. The Tesla left its lane and drove into the barrier.
- The driver had complained that his Autopilot was having trouble at this off-ramp, so much trouble that his family says he went to the Tesla dealer to complain about it. His family said he reported it doing something odd on 7 out of 10 trips past this spot.
- At the same time, tens of thousands of other Tesla trips pass this spot without trouble. If this were happening to other Teslas, with all this publicity, we should have heard reports by now.
- If he really was having problems at this exit all the time, it is unclear why he kept using Autopilot there, or why he did not watch it carefully. It seems deeply unwise.
- This would then appear to be a problem with his particular Model X, which is hard to explain, though perhaps not impossible.
- In particular, Tesla's logs say that he ignored several warnings earlier in the ride, one of them an audible alert, telling him to put his hands on the wheel. He did have his hands on the wheel 6 seconds before the crash, but then removed them, which is fairly normal Autopilot operation.
- After hitting the barrier, his car ended up on the highway side, not the ramp side, and was more severely damaged than any Tesla has ever been, according to Tesla, suggesting a near head-on impact with the start of the divider.
This video made by another Tesla driver offers a disturbing explanation. As an off-ramp splits off from a highway, the lane markers form a V. Eventually they get far enough apart that the space between them can look like a lane. Somehow the Tesla may be seeing that phantom lane and trying to drive in it. It should not -- there is no reason to leave its existing lane, and the lane border is a solid line, not a dashed one, so it should not be crossed -- but this is a plausible explanation.
Note that in this example the left line is much stronger than the right line, which may be what fooled the Tesla. It really is just a fancy cruise control with lanekeeping, and people have to stop forgetting that.
One theory: there is a spot where the lane marker on the left side of the continuing lane has worn away. I show it here in Street View, seen from the exit lane. We might imagine the Tesla's lane-finder seeing this and deciding that the "lane" between the two real lanes is now its path, heading right for the barrier. Yikes.
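To make the failure mode concrete, here is a minimal sketch of a naive lane-centering rule that trusts whichever painted lines it sees most strongly. This is not Tesla's code (which is not public); the function, offsets and confidence numbers are invented for illustration:

```python
# Illustrative only: a naive lane-centering rule. Each detected line is
# (lateral offset in meters from the car's centerline, confidence 0..1).

LANE_WIDTH_M = 3.7

def pick_lane_center(left_line, right_line, min_confidence=0.5):
    usable = [l for l in (left_line, right_line)
              if l is not None and l[1] >= min_confidence]
    if len(usable) == 2:
        return (usable[0][0] + usable[1][0]) / 2.0  # midpoint of both lines
    if len(usable) == 1:
        # Only one trusted line: assume a standard-width lane beside it.
        offset = usable[0][0]
        return offset + LANE_WIDTH_M / 2 if offset < 0 else offset - LANE_WIDTH_M / 2
    return None  # no trusted lines: should alert the driver

# Normal case: lines at -1.85 m and +1.85 m -> steer to 0.0, stay centered.
print(pick_lane_center((-1.85, 0.9), (1.85, 0.9)))

# Gore area: the strong diverging line is now at -0.5 m, and the real
# right line is worn (confidence 0.3, below threshold). The rule discards
# the faint line and "centers" at +1.35 m -- drifting toward the barrier.
print(pick_lane_center((-0.5, 0.9), (1.85, 0.3)))
```

The point is not that Tesla's system is this simple, but that any controller which falls back on the stronger line when the weaker one fades can be steered by paint quality, which is exactly what the worn-marker theory suggests.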
A commenter has pointed out that another video now shows somebody duplicating the failure in the same fashion in the exact lane at the 101/85 split. It is unclear how it can be so easy to duplicate when Tesla claims Teslas have gone through this spot on Autopilot 20,000 times this year.
Tesla is probably right that Autopilot makes people safer on average, even factoring in those who misuse it and treat it like a robocar. Still, there are ways Tesla could make fewer people misuse Autopilot. I suggested one in 2016: have the car test you from time to time while it is safe to do so, by drifting slightly toward the next lane -- when it's empty, of course -- and waiting for you to grab the wheel and correct it. Fail to correct it too many times and you lose Autopilot, first for a day, then a month, then forever. (A sketch of that escalation appears below.)
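Here is a minimal sketch of what that escalation could look like. This is my proposal, not anything Tesla has implemented; the class name and penalty schedule are invented:

```python
# Illustrative only: escalating lockouts after failed attention tests.
from datetime import datetime, timedelta

PENALTIES = [timedelta(days=1), timedelta(days=30)]  # after that: permanent

class AutopilotAttentionPolicy:
    def __init__(self):
        self.failed_tests = 0
        self.locked_forever = False
        self.locked_until = None

    def record_drift_test(self, driver_corrected: bool, now: datetime):
        """Call after each deliberate drift test (performed only when the
        adjacent lane is verifiably empty)."""
        if driver_corrected:
            return  # driver is paying attention; no penalty
        if self.failed_tests < len(PENALTIES):
            self.locked_until = now + PENALTIES[self.failed_tests]
        else:
            self.locked_forever = True  # third strike: Autopilot gone
        self.failed_tests += 1

    def autopilot_allowed(self, now: datetime) -> bool:
        if self.locked_forever:
            return False
        return self.locked_until is None or now >= self.locked_until
```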
Another option has arisen, used by GM Super Cruise and other tools: driver gaze monitoring. This week I wrote about the new tools open-sourced by MIT researcher Lex Fridman which can do that.
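At its core, gaze monitoring reduces to a timer on how long the driver's eyes have been off the road, with escalating responses. A minimal sketch of that logic follows; the thresholds are made up, since the real parameters of Super Cruise and similar systems are not public:

```python
# Illustrative only: escalate as eyes-off-road time grows. Thresholds are
# invented; real systems tune them and factor in more context than speed.

def gaze_alert_level(eyes_off_road_s: float, speed_mps: float) -> str:
    # Allow less distraction time at higher speed: at 30 m/s the car
    # covers 60 m in 2 seconds.
    budget_s = max(1.0, 60.0 / max(speed_mps, 1.0))
    if eyes_off_road_s < budget_s:
        return "ok"
    if eyes_off_road_s < 2 * budget_s:
        return "visual_warning"      # e.g. a light bar on the wheel
    if eyes_off_road_s < 3 * budget_s:
        return "audible_warning"
    return "disengage_and_slow"      # hand back control, slow the car

print(gaze_alert_level(1.0, 30.0))  # 'ok'
print(gaze_alert_level(5.0, 30.0))  # 'audible_warning'
```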
Tesla has resisted such measures because they would make Autopilot more annoying. It is possible that if it were made more annoying, people would use it less, and that would deprive us of the safety benefit it offers. I think this could be tuned so that the net safety effect is still positive, but that requires research.
I do find it strange that a skilled software developer, who was so bothered by the performance of his Autopilot at that off-ramp that he went to the dealership to complain, still kept driving there with Autopilot on, and even drove without watching the road, doing whatever it is people like to do in that situation. (Typically, using one's phone.)
But maybe it is not too strange. While people always tell me robocars will never happen because people won't trust them, the reverse is actually true: people trust them too readily. This happened just 5 days after the Uber fatality, when the reliability of robocars was on everybody's mind, and I presume on his. Even though Tesla Autopilot is a glorified cruise control, he seems to have been treating it like a robocar, even with that fresh warning.
I am baffled as to how this could happen regularly to just one particular Model X. What defect could make one car do this, when they all have similar sensors and software? Could he have had a special test update?
It has been suggested that the diverging lane lines of the off-ramp could at some point look like the lines of a new lane. That alone would not explain why the car would seek to drive in that lane, or how it would make that mistake, since off-ramps are hardly an unusual thing.
I sure hope news of this sort stops coming, and soon!