An 8-Car Pileup Started By A Tesla In Autopilot Opens Up Many Complex Issues
(This originally appeared on Forbes.com. I am not always linking stories from there as they now have their own comment section though it is not well used as yet.)
On Thanksgiving (Nov 24) an 8-car pileup occurred on the San Francisco to Oakland Bay Bridge. Nobody was seriously injured, but interest was high because it was all started by a driver in a 2021 Tesla Model S. The driver, a 76-year-old San Francisco lawyer, told police he was using Tesla’s “Full Self Driving” mode and that it malfunctioned, changing lanes and hitting the brakes hard in front of a line of cars. While the car would actually have been in “Autopilot” (a different system), this crash opens up some surprisingly interesting questions about how driver supervision of “pilot” style driver-assist systems should work, and who is at fault.
Tesla has two different systems which can take control of the car. The original one, “Autopilot,” is included with the car and drives on highways while drivers keep their hands on the wheel and eyes on the road, ready to intervene if the system does anything wrong. This is known as advanced “driver assist,” or ADAS. Owners can also buy an upgrade called “Navigate on Autopilot” which adds the ability to do automatic lane changes, among a few other features.
For even more money, some Tesla owners have pre-bought Tesla’s eventual “full self driving” system, a product which is not yet ready, though Tesla lets customers try out the early prototype in what they (very incorrectly) refer to as a “beta” test. While the system, when eventually delivered, promises right in the name to be an actual self-driving system, the prototype is not that, and needs very diligent monitoring with regular intervention by drivers. It works on city streets, and does not operate on highways. If you turn it on and the car enters a highway, it switches to the older Autopilot system automatically, which might have confused the driver into thinking that the FSD system was engaged, though the switch is quite obvious on the display screen.
Both systems require driver supervision and can’t be bought or turned on without a lot of warnings about that, though as we know, many people — even lawyers like this driver — click to agree to the warnings and then get lax. To reduce that, Tesla requires drivers to apply regular force to the wheel, and recently started watching their heads with a camera to assure they keep their eyes on the road. Sadly, there have been a number of crashes with Autopilot where drivers didn’t pay attention to the road, some with fatal results. Some drivers deliberately don’t pay attention, and a few even try to defeat the system that nags you if you don’t touch the wheel. With the FSD early prototype, drivers actually pay more attention, because it makes mistakes so often that you would crash almost every time you went out if you did not, just as you would with a basic cruise control if you turned it on and didn’t watch.
According to the driver, he had activated FSD and the car was moving on the lower deck of the Bay Bridge leaving SF. It sought the left lane, something it does if the driver has asked it to drive well above the speed limit — drivers can configure this behavior. If it finds the lane it is in is too slow, it will automatically change lanes to the left. If it’s too fast, it will move right. It also moves to take upcoming exits. Drivers are notified of lane changes and are told to check the road and blindspots when it signals one, and to abort the lane change if it’s unsafe. The system, however, is fairly good at checking the blindspots — or at least is often good enough — and so some drivers get lax at this.
The Tesla selected the 2nd from left lane of the 5-lane bridge before entering a tunnel on Yerba Buena Island after crossing the first span out of SF. The driver claims he (or the car) was driving 55mph, slightly above the 50mph limit but hardly unusual — in fact it’s slow for the left-hand lanes.
Then the car did something quite odd. It signaled a lane change to the left, into the fastest lane. According to the driver, it did this, but then suddenly braked for no apparent reason. (The police report, which includes examination of the video, says the car braked before changing lanes, which appears correct.) The car behind, which was cut off, slammed on its brakes and apparently hit the Tesla, though that driver claims they did not. The same goes for the next car, but the cars behind them, going faster, did hit, and pushed the cars ahead forward until everybody got hit; in one case car #6 was pushed up on top of car #7. All of that (detailed below) is just the mechanics of a multi-car pileup. What’s interesting is how it began.
Phantom Braking
Tesla owners have for some time been complaining about Tesla cars in Autopilot (and FSD) suddenly braking hard when there’s nothing there. Some feel this problem has gotten worse since Tesla stopped using radar and relied only on cameras; others feel the opposite. Nobody likes it, and as we see here, when this happens, it can be dangerous.
In all autonomous systems, sensors scan the area in front of the car, and there are two main sorts of mistakes they can make. The worst is the “false negative,” where there is something there but the system doesn’t see it. It’s the worst because you may hit something. To prevent that, you make the system very sensitive so it rarely misses things in front of it.
If it gets too sensitive, the reverse happens: the “false positive” or phantom. You brake for the ghost. You don’t hit anything, which is better, but it’s very jarring, and if people are following too closely behind you, they might rear-end you. It would be their fault under the law, but you still want very much to avoid that. It’s just that you want even more to avoid running into somebody.
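To make the tradeoff concrete, here is a minimal sketch, assuming a perception stack that reports a single confidence score for an obstacle ahead (a big simplification; real systems use much richer signals). The function names and numbers are invented for illustration, not anyone's production code.

```python
# Illustrative sketch (not Tesla's code) of the sensitivity tradeoff: the
# perception stack produces a confidence score for "obstacle ahead", and
# where you set the braking threshold trades false negatives (missed real
# obstacles) against false positives (phantoms).

def should_brake(obstacle_confidence: float, threshold: float) -> bool:
    """Brake if the detector's confidence exceeds the chosen threshold."""
    return obstacle_confidence >= threshold

# A low threshold (very sensitive) almost never misses a real obstacle but
# brakes for more ghosts; a high threshold does the reverse.
for threshold in (0.2, 0.5, 0.9):
    ghost = should_brake(0.35, threshold)  # weak, likely-phantom return
    real = should_brake(0.95, threshold)   # strong, likely-real return
    print(f"threshold={threshold}: weak return -> {ghost}, strong return -> {real}")
```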
Tesla’s phantom braking problem hasn’t caused many reported accidents, but that changes if the braking happens during or right after a lane change. While normally, in a rear-end collision, the car behind is considered always at fault, one exception occurs when somebody cuts in front of you too closely and you can’t brake in time. That’s the fault of the forward car. Indeed, the police cited the driver of the Tesla for an “unsafe lane change.” The cars in the back that did hit other cars were cited for following too closely/excessive speed, and thus going too fast with stopped cars ahead.
The Tesla in this case did something quite unusual, if the statement of its driver is to be credited.
It attempted an automatic lane change when its own lane was not particularly slow. The car in front was around 3 seconds ahead. Normally such a change happens only when you’ve come up on slower traffic in your lane. It attempted the change when there were cars behind and to the left of it, which it is usually fairly conservative about — but it does do it, as humans do all the time, though possibly unwisely. Based on the police report, it attempted the lane change even though it was already braking for some ghost that wasn’t there. After changing, it braked even harder. Teslas do not tend to take evasive action and swerve around obstacles; they just brake if they see one — or think they see one. As a 2021 Model S, this car would still have ultrasonic blind spot sensors. As such, something odd was going on here. It’s an unusual enough situation that I haven’t seen much in the way of reports on phantom braking before or during a lane change. It seems to have happened, though it’s also possible the driver misreported and manually commanded the lane change. If the driver manually disengaged Autopilot and, in spite of the tones it makes, failed to notice, that could potentially explain it. Tesla used to provide details learned from the logs in these cars, but they have made no statement on what they know, though they certainly will have looked into it. Police may request this information.
Some analysis of the situation shows the turn signals on the car activating for only a short time. When “Navigate on Autopilot” initiates a lane change on its own, it blinks the turn signal for longer first, according to reports. This makes some speculate that the driver triggered this lane change himself, either manually or by flicking the turn signal.
It should be noted that as a lawyer, the driver of the Tesla could face serious consequences to his career if he lies in a statement to police, so it seems likely that he believes his statement that Autopilot was on and the car stopped for no reason to be true. (He might be mistaken, though, as he was about it being FSD.)
Because Tesla Autopilot is a driver-assist system, and it’s fairly explicitly explained that drivers must maintain control and watch at all times, it is unlikely it will be deemed “defective,” though there is still controversy over this question. In most other crashes we’ve seen, the driver’s dereliction of duty has been clear. What makes this one different is that the phantom braking problem may make it much more difficult to properly supervise.
How do you supervise this situation?
In general, people who are paying attention do a pretty decent job supervising these ADAS pilot systems, which come from Tesla, Ford, GM and many other vendors. More data is being collected about how well people supervise, how many people are negligent in supervising, and what to do about the latter group.
Phantom braking, though, is an unusual and special situation. If a car is steering the wrong way, it’s easy to grab the wheel and steer it correctly. If it’s heading too fast toward something, hitting the brakes is the natural reaction. People do this OK. If the car brakes for no reason, the way to correct that is to hit the accelerator. (You could call it “hitting the gas,” but these are electric cars.) We don’t tend to have that reflex. That’s more something you have to think about, and thinking takes longer. We also don’t want to train a reflex to hit the accelerator, because that can lead to very bad situations if you do it wrong.
Lane changing is also complex. People are OK at supervising a lane change, but we can’t look forward, backwards and sideways at the same time, the way robots can. It turns out that blindspot detection is pretty reliable (especially with ultrasonic sensors, which Tesla started removing in 2022, but not from this car), so you can actually trust the robot to help with that, asking the human only to make a situational judgment about whether this is a good time for a lane change, leaving the tactical steering and so on to the car.
But nobody really supervises a lane change anticipating that the car will suddenly have to brake, whether for something real or for a ghost. Sometimes that happens, but if it seems likely you don’t make the lane change in the first place. And nobody can predict a sensor ghost.
It could be that lane changes just should not be done when there are cars not too far behind, purely for the rare risk of a sensor ghost. And it is definitely the case that a car should not change lanes while sensing an obstacle that might be a sensor ghost.
It’s possible Tesla should at least disable automatic lane changes when there are cars nearby in the target lane unless it can assure no phantom braking will take place after the lane change. Drivers also must learn how to respond to phantom braking by quickly using the accelerator — just a tap is enough to tell the car that braking is not desired here. Presumably Tesla is already working on their phantom braking problem, and if it is deemed to be caused by the loss of radar, they may face a difficult and very expensive decision about re-installing radar in the large number of cars sold without it.
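As a sketch of what that kind of gating might look like (purely illustrative; the names and thresholds are hypothetical, not Tesla's software):

```python
# A hypothetical gate for automatic lane changes, along the lines suggested
# above. All names and thresholds are invented for illustration.

from dataclasses import dataclass

@dataclass
class LaneChangeConditions:
    rear_gap_s: float            # time gap to nearest car behind in the target lane, seconds
    currently_braking: bool      # already decelerating for a (possibly phantom) obstacle?
    in_phantom_prone_zone: bool  # e.g. a tunnel flagged on the map

def allow_automatic_lane_change(c: LaneChangeConditions,
                                min_rear_gap_s: float = 3.0) -> bool:
    """Refuse an automatic lane change when phantom braking right after the
    change could cause a rear-end collision."""
    if c.currently_braking:
        return False  # never change lanes while braking, outside an evasive maneuver
    if c.in_phantom_prone_zone:
        return False  # tunnels etc.: ask the driver to confirm instead
    return c.rear_gap_s >= min_rear_gap_s

# A car 1.5 seconds behind in the target lane, inside a tunnel: refuse.
print(allow_automatic_lane_change(
    LaneChangeConditions(rear_gap_s=1.5, currently_braking=False,
                         in_phantom_prone_zone=True)))  # False
```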
The driver here, though, should have noticed it was braking while doing a lane change without anything in front of it, and aborted the lane change and then pressed the accelerator to continue in the 2nd from left lane. For not doing so, he is at fault.
While Tesla stopped doing it for a year, today they updated the misleading accident statistics they publish every quarter. Those stats give the impression that driving on Autopilot is much safer than regular driving. In reality, the two are similar. Tesla knows the actual difference but declines to disclose it, which has always raised eyebrows. Still, as noted, they are similar, meaning that Autopilot driving is not notably more or less dangerous than regular driving. Events like this crash are sufficiently rare that the overall safety record remains comparable to driving manually.
Tunnels
As it turns out, tunnels are a tricky area for sensors. Radar in particular bounces off of everything. That means you pretty much have to ignore any radar return from a stationary object, because those come from all directions. This is one of the reasons Tesla decided to stop using radar, though more advanced imaging radars can help here — and there is a rumor that Tesla plans to use an imaging radar in the future.
Even vision systems can be confused by tunnels. The “light at the end of the tunnel” can blow out the contrast for cameras. That might be what took place here. If that’s true, tunnels might be a good place to avoid lane changes, or to demand confirmation from the driver before doing them.
Being smarter about phantom braking
The law creates a paradox when it comes to sensor blindness (false negatives) and sensor ghosts. If you brake for a ghost in a normal situation (i.e. not just after changing lanes) and you get rear-ended, it’s the fault of the driver behind you, pretty much 100%. If you hit something you didn’t see, it’s your fault. The latter is worse, but both are bad.
Robots are looking in all directions at once. So, unlike people, when they brake, they also know who is behind them and how fast they are going. They can track how well the car to the rear is braking, and they can know if and when it’s going to hit them. They are very good at physics.
A robot does have the option, in that situation, to not brake quite as hard, to make sure the car behind won’t hit. At least, it has that option if it knows that, by not braking quite so hard, it still won’t hit what it’s braking for. It could be argued the right strategy is to do that: try to avoid both collisions at once, if possible. On the other hand, people are much more afraid of the collision that will be their fault, so they don’t try that; they just brake as hard as they can, rather than braking so they will end up one inch away from the obstacle.
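As a rough illustration of the physics involved, here is a simplified sketch assuming constant deceleration, a stationary obstacle and no reaction lag; the numbers and function names are mine, not anyone's production code.

```python
# Brake only as hard as needed to stop short of the forward obstacle, rather
# than at maximum force, so the car behind gets the best chance of stopping too.

MAX_BRAKE = 8.0  # m/s^2, roughly a full panic stop on dry pavement

def min_decel_to_stop(speed_mps: float, gap_m: float) -> float:
    """Minimum constant deceleration to stop just short of an obstacle
    gap_m ahead: from v^2 = 2*a*d, a = v^2 / (2*d)."""
    return speed_mps ** 2 / (2.0 * gap_m)

def choose_braking(speed_mps: float, gap_m: float, margin: float = 1.2) -> float:
    """Pick the gentlest braking (plus a safety margin) that still avoids the
    forward obstacle; fall back to maximum braking if that is not enough."""
    needed = min_decel_to_stop(speed_mps, gap_m) * margin
    return min(needed, MAX_BRAKE)

# 25 m/s (about 55 mph) with 60 m of clear road needs about 5.2 m/s^2
# (6.2 with the margin), well short of a full panic stop.
print(round(choose_braking(25.0, 60.0), 1))
```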
It could make sense for the law to reward this “avoid them both” strategy. For now it does not. Of course, in many cases you can’t avoid hitting what’s in front; you only control how hard you hit it, so you brake at maximum force. (The right strategy, if you are sure of the other lanes, is to brake at maximum force until the last moment and then swerve. People are wary of that because they may not have a perfect picture of what they might swerve into, and again, the law is much harsher if you swerve into something that wasn’t even in your lane — you took a deliberate action which caused harm.)
This gets even more interesting if you estimate that your forward obstacle might be a phantom. You see, systems will sometimes have an idea about that. It doesn’t help too much, because you’re not sure it’s a phantom, so you must always brake. But it might help you decide whether to brake so hard that you make somebody behind you rear-end you because they were following too close. Their fault, but bad for you and bad for them.
But it could be a good strategy, if you are facing an obstacle you think is 90% likely to be a sensor ghost, to work at avoiding being rear-ended as long as you can still miss the ghost.
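One way to picture that decision is as an expected-cost comparison. This is a toy sketch; the probabilities and cost weights are invented purely to show the shape of the tradeoff.

```python
# Toy expected-cost comparison for the "90% likely a ghost" case above.
# All numbers are invented for illustration.

def expected_costs(p_real: float,
                   p_forward_hit_if_gentle: float,
                   p_rear_ended_if_hard: float,
                   cost_forward: float = 10.0,
                   cost_rear: float = 3.0) -> dict:
    """Compare braking at maximum force (risking a rear-end collision)
    against braking just hard enough (risking the forward hit only if the
    obstacle turns out to be real)."""
    brake_hard = p_rear_ended_if_hard * cost_rear
    brake_gentle = p_real * p_forward_hit_if_gentle * cost_forward
    return {"brake_hard": brake_hard, "brake_gentle": brake_gentle}

# Obstacle 90% likely to be a ghost (p_real = 0.1): even if gentler braking
# has a 20% chance of failing when the obstacle is real, the expected cost
# can favor it over guaranteeing a hard stop with a tailgater behind.
print(expected_costs(p_real=0.1, p_forward_hit_if_gentle=0.2,
                     p_rear_ended_if_hard=0.5))
```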
Most crashes that have happened with Waymo, Cruise and other robocars have involved the cars being rear-ended while stopped. That is very obviously the fault of the car doing the hitting, but these cars could, if the way in front of them were clear, deliberately move out of the way, because they, unlike people, are always looking behind them. Robocars could be made that are very difficult to rear-end. You could only rear-end them if they had something else in front of them that you would also have rear-ended, or almost rear-ended. Rear radar would be ideal here, as it provides an instant read on distance and velocity. Not all cars bother with a rear radar, but a rear LIDAR can do the job fairly well too. We’ll leave it to Tesla to figure out how well cameras can do it.
Possible recommendations for both ADAS and robocars:
- ADAS pilot systems should do a training run on an empty road to assure drivers are ready to respond to phantom braking.
- Until phantom braking is minimized, automatic lane changes should be curtailed, especially in tunnels or other environments where it is more likely.
- A vehicle should not do an automatic lane change while braking, except as a deliberate emergency evasive action.
- If a vehicle coming from behind is on a trajectory to rear-end them even with fairly hard braking, vehicles should avoid braking harder than is necessary to avoid a forward collision, particularly if the perception confidence score indicates the forward obstacle is likely to be a false positive.
- LIDAR and imaging radar should be used to minimize perception ghosts.
- Areas prone to phantom braking should be mapped, and this map used in evaluating possible perception ghosts.
- HDR cameras should be used to avoid perception problems in tunnels (many cars already do this).
The rest of the crash:
For reference, here are the details of the 8-car crash as reported by the California Highway Patrol, with a few additional notes.
- V1*, the Tesla, was presumably in Navigate on Autopilot. The driver activated FSD, but it switches to Autopilot on the interstate. The driver says it was in the left lane, moved right before entering the tunnel, and moved left again just before the crash. He claims he was doing 55. If true, he must have programmed an aggressive speed into Autopilot to make it want to drive in the left lane so much.
- V2 claims he was doing 50mph and stopped in time and didn't hit the Tesla, until V3 pushed him into it. (That's not what I see in the video but it's low res.)
- V3 claims she didn't hit V2 until he was hit by V4. She says she was doing 50.
- V4 * claims he was doing 65, was still braking, and was trying to veer when hit from behind.
- V5 * admits she hit V4 (and also hit V3), and then she was hit. She also claims she was doing 35 in the left lane (?)
- V6 said he stopped 10 feet behind V5 but then was hit and pushed hard — his car was pushed on top of V7. He claims he was doing 50.
- V7 * claims he braked but he hit V6, the car that ended up on top of him. He claims he was doing 70.
- V8 * also says he was doing 70mph and a truck in front of him pulled to the right and he hit V7.
(*) indicates the police cited a vehicle code violation, either an unsafe lane change or excessive speed.