Tesla Autopilot slams into truck; this one is strange


So, yesterday in Taiwan a Tesla, reportedly on Autopilot, smashed in broad daylight, on a mostly empty road, into a truck lying on its side, almost hitting the truck's driver, who was standing in front of it. While Tesla Autopilot is just driver assist, and not meant to catch everything, it's not great to miss a giant truck and a human. I explore why in my new Forbes site article, found at Tesla in Taiwan crashes


Seems to be missing link to https://www.forbes.com/sites/bradtempleton/2020/06/02/tesla-in-taiwan-crashes-directly-into-overturned-truck-ignores-pedestrian-with-autopilot-on/ ?

No link, but let me guess: If they had lidar, this never would have happened.

I wonder, if Tesla had lidar, would they be ready for level 4 on highways? If not, does it really matter if they have it?

If lidar means the difference between a Tesla being level 2 and it being level 4 (even if it's just on highways), I'd say it's worth the billions of dollars it would cost to add it. If not, probably not.

A few thousand dollars extra for a car that lets me sleep while on a long road trip. That's worth it. A few thousand dollars so that driving on the highway without looking at the road goes from really really really really dangerous to really really really dangerous? No thanks.

Hopefully one day Tesla will get to the point where all they have to do is add lidar and they'll have an autonomous car. I don't think they're there yet, though.

No one disputes that a crutch can help you go from bad to kinda okay. But kinda okay isn't good enough when it comes to autonomous vehicles.

Tesla's Autopilot does have radar maps for exactly that reason.

I have not seen Tesla talk a lot about their radar maps. Can you point me at that? They have said they want to avoid detailed vision maps, and I presumed they didn't want radar maps of that sort.

This truck should have presented a prominent radar target, and the only reason to miss it would be that you weren't sure it wasn't something stationary by the roadside. If a map could have avoided that error, why did it fail?

Maybe they haven't gotten around to mapping all of Taiwan yet.
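The stationary-target issue raised above is worth spelling out. A common explanation (an assumption here, not something Tesla has confirmed) is that automotive radar pipelines discard returns from objects that aren't moving over the ground, because that same filter is what keeps bridges, overhead signs, and parked cars from triggering phantom braking. A minimal sketch of that filtering logic, with made-up numbers:

```python
# Illustrative sketch (assumed logic, not Tesla's actual code) of why a
# radar pipeline tuned against phantom braking can miss a stopped truck:
# returns whose over-ground speed is ~zero get discarded along with
# bridges, signs, and roadside clutter.

def over_ground_speed_mps(ego_speed_mps: float, closing_speed_mps: float) -> float:
    """Speed of the target over the road, derived from ego speed and the
    Doppler closing speed the radar actually measures."""
    return ego_speed_mps - closing_speed_mps

def keep_radar_return(ego_speed_mps: float, closing_speed_mps: float,
                      stationary_threshold_mps: float = 1.0) -> bool:
    """Keep only returns from targets that are moving over the ground."""
    speed = over_ground_speed_mps(ego_speed_mps, closing_speed_mps)
    return abs(speed) > stationary_threshold_mps

# Car ahead doing 25 m/s while we do 30 m/s: closing at 5 m/s -> kept.
print(keep_radar_return(30.0, 5.0))    # True
# Overturned truck: closing at exactly our own 30 m/s -> filtered out.
print(keep_radar_return(30.0, 30.0))   # False
```

The point of the sketch is that the filter can't tell a truck in the lane from a sign beside it using Doppler alone, which is exactly where a radar map (or fusion with vision) would have to break the tie.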

Tesla has yet to provide a validated software release with a well-documented ADAS capability. Versions roll out with no indication of regression testing, and may confuse consumers about the vehicle's capabilities at any given point in time.
If we had infrastructure-based sensors providing V2X (or C-V2X) updates, then this type of crash should never happen. Even V2V would be helpful, as the truck could broadcast an "I'm broken down" BSM (basic safety message), since one assumes that a vehicle system would understand that being on its side was abnormal.
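To make the V2V idea concrete, here is a rough sketch of what such a broadcast could carry. The field names and coordinates are hypothetical, loosely modeled on the SAE J2735 Basic Safety Message; this is not a real V2X stack:

```python
# Hypothetical sketch of a V2V "I'm broken down" broadcast. Field names
# and values are illustrative, loosely modeled on the SAE J2735 Basic
# Safety Message; a real stack would use ASN.1 encoding, not JSON.
from dataclasses import dataclass, asdict
import json

@dataclass
class BreakdownMessage:
    vehicle_id: str
    latitude: float
    longitude: float
    heading_deg: float
    speed_mps: float
    event: str            # e.g. "DISABLED_VEHICLE"

    def encode(self) -> bytes:
        return json.dumps(asdict(self)).encode()

# An overturned truck's onboard unit (detecting the roll via its IMU)
# could broadcast this repeatedly, so approaching vehicles slow down
# long before their own sensors have to resolve the obstacle.
msg = BreakdownMessage(vehicle_id="TRK-001", latitude=24.91, longitude=121.12,
                       heading_deg=90.0, speed_mps=0.0, event="DISABLED_VEHICLE")
payload = msg.encode()
```

The appeal is that the message removes all perception ambiguity: the receiving car doesn't have to classify a white roof against a bright sky, it just gets told "disabled vehicle at this position."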

Radar would not see the plastic roof.
The frame beneath would be detected,
but the contents of the truck would scatter and absorb the radar.
If the roof were shiny, distorted, or black (light-absorbing), no optics-based solution would work.

Ultrasound would help some, at least for short-range detection. (Ultrasound: reflection of transmitted sound.)
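The short-range caveat matters here. Ultrasonic ranging is just time-of-flight arithmetic, and automotive ultrasonic sensors are typically only good for a few meters, far too short to matter at highway speed:

```python
# Ultrasonic ranging: emit a pulse, time the echo, halve the round trip.
# Speed of sound is ~343 m/s at 20 C. Automotive ultrasonic sensors top
# out at roughly 5-8 m, which is why they only help at parking speeds.

SPEED_OF_SOUND_MPS = 343.0

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to the reflector from a measured round-trip echo time."""
    return SPEED_OF_SOUND_MPS * round_trip_s / 2.0

# A 29 ms round trip is about 5 m -- near the practical sensor limit.
print(round(echo_distance_m(0.029), 2))  # 4.97
```

At 30 m/s, a 5 m detection range gives about a sixth of a second of warning, so ultrasound can soften a low-speed impact but cannot prevent a highway-speed one.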

Cameras would need their own light source, like modulated headlights, so the reflected light could be differentiated.

Ultimately, traffic lanes need more visible marking.
