Tesla Autopilot alleged failure makes you wonder about how they train it
Another Tesla crash, allegedly on Autopilot, teaches us something about how well (or how poorly) Tesla is doing with its claimed ability to use its fleet of cars to quickly learn to identify unusual obstacles and situations. Here, a Tesla on Autopilot crashes into a tow truck sticking out into the right lane, injuring the Tesla driver. The driver says it was on Autopilot but that he was distracted for a few seconds. The driver is at fault, but why did the Tesla, whose Autopilot is supposedly just months away from turning into "feature complete full self driving," miss this sort of thing? This has happened before, and Tesla has great tools for learning from things that have happened before. New Forbes article in comment #1.
Plus, I look into new details about an old Tesla accident, where it seems the driver was completely abusing Autopilot, treating it like self-driving and just wiggling the wheel every few minutes to keep Autopilot engaged. But he missed a wiggle, and it was off when he probably thought he had kept it on.
Read some analysis in the Forbes article: Alleged Tesla Autopilot failure raises questions on how they train it
Comments
BEN
Wed, 2019-08-14 00:07
Tesla Autopilot Failure, NOT!
The video, which was not mentioned in this article, showed the Tesla's brakes engaging just before hitting the tow truck. This likely reduced the impact; it could have been a lot worse. Also, the Tesla likely would have gone right if not for the vehicle to its right, which it hit after hitting the tow truck. The problem for the Tesla, as I see it, was the vehicle behind the tow truck blocking the line of sight to a degree, until the Tesla came up upon the truck. For not paying attention, dad was lucky he was in a Tesla that partially responded and protected his children. The subsequent fire was not a great thing, but there was enough time for all to escape before it took hold.
brad
Wed, 2019-08-14 12:22
AEB
The Tesla has an independent collision warning and collision braking system that is always on, not just in autopilot. This is probably what engaged the brakes.
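To give a rough sense of how much even last-moment braking matters, here is a back-of-envelope sketch. Every input is an assumption, since we don't know the car's actual speed or when the brakes fired:

```python
# Back-of-envelope: how much does last-moment automatic braking help?
# All numbers below are assumed, for illustration only.
v0_kmh = 100.0    # assumed initial speed, km/h
decel = 8.0       # assumed hard-braking deceleration, m/s^2
t_braking = 0.5   # assumed seconds of braking before impact

v0 = v0_kmh / 3.6                   # convert to m/s
v_impact = v0 - decel * t_braking   # speed at the moment of impact

# Crash energy scales with the square of speed, so even a small
# speed reduction helps more than it looks.
energy_ratio = (v_impact / v0) ** 2
print(f"impact at {v_impact * 3.6:.0f} km/h, "
      f"{energy_ratio:.0%} of the unbraked crash energy")
# -> impact at 86 km/h, 73% of the unbraked crash energy
```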
However, that truck had been there a while, and all the other cars were navigating easily around it -- traffic was not even slowed very much. To be a full self-drive product, Tesla's system needs to be able to match that ability: to perceive the problem well before it happens (by noticing that other vehicles are braking and moving right for something, among other cues) and be ready. Humans in this situation will slow, and they will move right, crowding the car in the lane to the right, which will make that car move right too.
It's not an autopilot failure. It's a failure of "autopilot, the thing that in its next release is supposed to have some level of full self driving."
FKA
Wed, 2019-08-14 04:39
ADAS is not unmonitored self-driving
This crash is a good example of why it's a mistake to judge ADAS as though it's unmonitored self-driving. The types of evasive maneuvers you take in the latter situation are not equivalent to the types of responses you take in the former situation.
There's not enough information that I can find to determine what, if anything, the Tesla did wrong here. Yes, it could have probably squeezed in-between the truck and the car in the lane to the right of it, but I'm not sure that would be the right thing to do for an ADAS, as opposed to an unmonitored self-driving system. The more likely mistake, if it was in fact in autopilot, was not slowing down or changing lanes far before reaching the truck, but without the forward-facing camera view it's hard to say if the view was obstructed. Also, autopilot is, probably correctly, tweaked not to slow down too much in potentially hazardous situations (as opposed to imminent crashes), because the assumption is that the driver is a better judge in that situation. Except for evasive maneuvers in the case of an imminent crash, standard autopilot doesn't make automatic lane changes at all. A vehicle in unattended self-driving mode would have full control over speed and lane decisions.
brad
Wed, 2019-08-14 12:24
This is true
The behaviour I describe in the comment above, that humans do, of attempting to move right and claim more space on the right, is not something an ADAS product would do. However, slowing in any situation with a significant risk of high speed impact, or at least issuing an audible warning, is the right thing for an ADAS tool.
FKA
Wed, 2019-08-14 18:16
ADAS
I'm not sure what situations you would slow in. It'd have to be limited to situations where the car could be sure that the driver either wasn't paying attention or had made a mistake, or else it'd be really annoying.
Ditto for an audible warning. It's arguably even worse to sound an audible warning in too many unnecessary situations, as that will have a "boy who cried wolf" effect.
brad
Thu, 2019-08-15 00:47
Boy who cried wolf
My Tesla's forward collision warning goes off frequently for semi-false alarms, and you can tune that in the car's settings.
However, naturally you want to catch as many actual impacts as you can. That's why you try to make the systems better, using the techniques Tesla has promoted that it uses.
The main excuse here is that Russian tow trucks don't look like ones in the USA. But that is only an excuse to the extent that you are ready to have it hit any new brand of tow truck.
FKA
Sat, 2019-08-17 13:31
Excuse?
Who is using that excuse?
Without the forward camera view I'm not sure how we can even say that a human would have seen and recognized the tow truck before the Tesla did.
Which mode it was in would also be relevant, and we don't know that either, though we know definitively that it wasn't in fully autonomous mode.
Why does it matter that it was a tow truck?
The actual impact was caught. It just was caught after it was already too late to avoid a crash.
To catch it before it's too late, you have to balance that against warning people about things they already see. I guess another question is what setting this car was tuned to for that, although the first question is whether a warning went off at all, and when.
brad
Sun, 2019-08-18 11:08
Human would have seen
There is a fair bit of traffic, though not enough to be stop and go. That means every other human being who passed these stalled vehicles recognized and handled it.
Tesla FCW mostly warns me about things I already see.
It matters that it was a Russian tow truck, in that there is the suggestion that it doesn't look much like the tow trucks that Tesla will have used in training its models in the USA, Asia, and Europe.
FKA
Tue, 2019-08-20 06:27
But when would a human have seen?
We don't see in the video how many other cars handled the situation, nor how. A last-minute swerve into the lane to the right, maybe, which the Tesla could have done, and maybe could have done without causing a crash with one of the cars in that lane. But I'm not sure you'd want to program an ADAS to make evasive maneuvers like that when there is traffic in the lane to the right. ADAS is not autonomous driving. There's no requirement to handle these sorts of situations, and imperfectly trying to do so would quite possibly create more liability than doing nothing. Braking hard and staying straight is a fairly safe response from a liability standpoint. Liability-wise, it's likely the tow truck driver would be found 100% liable for this crash (at least under US laws). If the Tesla had swerved and caused a crash, maybe not.
Why does it matter whether or not the type of tow truck was used in training models? Does Tesla's vision system not make a 3D map of obstacles in the road regardless of whether or not its model has been trained on them? Training on types of vehicles is useful to predict behavior, but shouldn't be necessary just to avoid crashing into stationary objects.
They certainly have enough data to make a 3D map of the limited area they will be travelling through. Maybe not enough processing power with the old hardware? Or maybe this is just not something they do at all?
The aspect I'd be more interested in is what could have been done, long before the Tesla hits its brakes, to recognize the situation. In particular, when you see cars ahead of you swerving, you should slow down (and, if safe, probably change lanes). Were there warning signs like that, which the Tesla could have noticed? If so, I would expect an autonomous car to react to that. Maybe not an ADAS, though, because of the false positives.
brad
Tue, 2019-08-20 12:18
Swerve, or hit?
Well, as is obviously the case in this video, you want to swerve as much as you can. You are, ideally, aware of the vehicles to the right. In this case, the Tesla hits the tow truck and then slams hard into the vehicle on its right. Much better to gently push that vehicle from the side, if the other choice is what actually happened here. Neither is a great choice. The best (and human) approach is to notice, by watching the motions of other cars around the obstacle, that something is up, and to slow and exercise more caution around it.
Vision systems try to build a 3D map, but they aren't perfect, particularly on stopped objects. That's part of the debate between computer vision and LIDAR, which is inherently 3D, while vision is inferred 3D. Motion parallax techniques are ancient (and often just called machine vision, because they're a lot simpler than computer vision), but they have limits.
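Here is a toy numeric sketch (all numbers invented) of why inferred 3D is weakest exactly where this crash happened: a stopped object almost dead ahead. Under forward motion, image flow is roughly proportional to a point's angular offset from the focus of expansion, so an obstacle near the heading line barely moves in the image:

```python
import math

# Toy motion-parallax model under pure forward ego-motion.
# All numbers are invented for illustration.
def flow_px(depth_m, offset_deg, speed_ms, dt_s, focal_px=1000.0):
    """Approximate image motion (pixels per frame) of a stationary point."""
    lateral = depth_m * math.tan(math.radians(offset_deg))
    d_after = depth_m - speed_ms * dt_s   # depth after one time step
    return abs(focal_px * lateral / d_after - focal_px * lateral / depth_m)

# Stalled truck 60 m out, ego speed 30 m/s, 10 Hz frames:
print(flow_px(60, 0.5, 30, 0.1))   # ~0.5 px: near the heading line,
                                   # sub-pixel flow, depth is ill-conditioned
print(flow_px(60, 10.0, 30, 0.1))  # ~9 px: well off-axis, easy to range
```

A lidar doesn't have this problem; it measures the range to the stopped truck directly.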
This is a tough situation. Cars parked like this are risky even for human drivers. But human drivers handle them the vast majority of the time. Perhaps Teslas do too, but I don't believe they are able to follow the human strategy here. This one didn't even build a good 3D map and slow down for the first car, let alone the truck. The classifier should have seen that first car.
FKA
Thu, 2019-08-22 15:59
ADAS vs. human vs. robocar
I don't think you're going to convince me that an ADAS should swerve in this situation. An ADAS isn't a driver. The human driver is the driver, and is the one who should make the decision to swerve or not swerve, if there's a significant risk that swerving will cause a crash.
Without seeing more video, I can't say whether or not a human driver would have been likely to avoid this crash. (Other than the fact that in this case a human was driving and that human driver didn't avoid the crash.) Tow trucks parked like this are extremely risky. I bet crashes like this one occur quite frequently.
It's not clear when the Tesla would have seen even the first car. The view of both vehicles was no doubt blocked by cars ahead of the Tesla until relatively soon before the crash.
Perhaps the lack of a good 3D map played a role in the crash. It's hard to say, and we'll likely never know. One thing we can say is that the 3D mapping ability of the Tesla will increase dramatically in cars that have the new hardware. It's not clear if the Tesla in this crash had that new hardware, but even if it did, I don't believe the software to take advantage of the new hardware has been fully deployed, yet.
brad
Fri, 2019-08-23 17:29
ADAS and humans
The stalled vehicle and tow truck didn't just get there. The stalled vehicle has to have been there for probably tens of minutes. The tow truck didn't just arrive -- if it had, it would have created a traffic jam as it slowed down and pulled onto the shoulder. So tons of human drivers passed this just fine.
What should an ADAS tool do here? Well, first of all, as soon as it detects an incursion into the lane, it should sound an alert. Then, if the driver does not react, and is on a collision course for the incursion, it should start to nudge as far to the right as it safely can.
More controversially, it can be argued that as the impact approaches, it should, if it does not detect any torque force on the wheel, nudge to the right even if it is not safe to do so, i.e. even if it will sideswipe (gently) a vehicle in the other lane. Normally we all know why you would not want to do that. However, once the physics make it clear that the present course will hit the truck and then bounce the car hard to the right, hitting the car to the right even harder, as happened in this accident, then hitting that car gently is the best strategy even for the car to the right.
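For concreteness, here is the escalation I'm describing written out as decision logic. It's purely illustrative (no shipping ADAS is claimed to work this way), and the thresholds are invented:

```python
# Sketch of the proposed ADAS escalation: warn, then nudge within safe
# limits, then accept gentle contact only when a hard impact is otherwise
# certain. Thresholds and names are invented for illustration.
def adas_response(ttc_s, driver_torque_on_wheel, right_gap_m,
                  min_safe_gap_m=1.0):
    if ttc_s > 3.0:
        return "monitor"
    if ttc_s > 1.5:
        return "audible_warning"
    if driver_torque_on_wheel:
        return "audible_warning"   # driver is acting; don't fight them
    if right_gap_m >= min_safe_gap_m:
        return "nudge_right_safely"
    # Impact now unavoidable on the present course: a gentle sideswipe
    # beats a hard partial-overlap hit plus a harder secondary impact.
    return "nudge_right_accepting_contact"

print(adas_response(ttc_s=1.0, driver_torque_on_wheel=False, right_gap_m=0.3))
# -> nudge_right_accepting_contact
```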
FKA
Fri, 2019-08-23 19:22
priorities
Presumably several vehicles passed this just fine. Whether or not the drivers of those other vehicles had a better view than the Tesla, we'll never know. Whether or not those other vehicles had a bigger opening to swerve, we'll probably never know. In any case, I don't think this is a particularly surprising crash. The tow truck was in an awful place. The tow truck driver was 100% at fault for this crash.
Your suggestions of what an ADAS tool could do are good ones. I don't think there's any car manufacturer that does this, but it would probably be something good for them to add. (I would worry about false positives, though.) If Tesla had unlimited funds to spend designing the perfect ADAS, they should do all of this.
Of course, they don't have unlimited funds, and avoiding a crash where both a tow truck driver and the driver of the Tesla do stupid things is probably not their top priority. I think when you're designing an ADAS, the first priority should be to not do any harm. That means that if there's any doubt whether or not suddenly swerving out of your lane is safe, you shouldn't do it.
A crash caused by an ADAS swerving out of the lane when it shouldn't would be much much worse than a crash caused by an ADAS not-swerving out of the lane when it should.
brad
Sat, 2019-08-24 17:35
Never know?
Well, if the authorities thought it a worthwhile question, we would of course know because they have surveillance video for the whole thing, not just the crash. I don't know if anybody asked for it.
Fault of the tow truck? Not really -- this is a common situation when a car stalls in that sort of location. The tow trucks have to get in there and tow them. The safe way to do it would be to stop traffic first, it's true, and put out a detour sign, etc. Lots of flashing lights. I guess they don't do that in Russia. You could blame the road designers for not having enough shoulder, or the stalled car, but not the tow truck; it's just making the best of a job that needs to be done under their system.
The job of any self-driving car is to deal with the hazards that happen on real roads. It can't say, "well, that shouldn't happen." If it happens, you have to deal with it.
Actually, sideswipe crashes are much less of a problem than partial side impacts with stationary trucks.
FKA
Sat, 2019-08-24 20:06
Any self-driving car?
This wasn't a self-driving car, let alone a self-driving car built to handle situations that happen commonly in Russia (*). It was a car that, maybe, had adaptive cruise control and lane keeping enabled. Swerving out of the way of vehicles partly blocking the lane isn't part of the description of Autopilot, as far as I'm aware. Not in Russia or anywhere. There is, I believe, a collision avoidance system in the Tesla that is separate from Autopilot. I'm not sure what parts of the world it's enabled in, though. And I doubt it's set up to swerve out of a lane into a lane where there is already a vehicle, at least without some input from the driver that this is the answer to the trolley problem that the driver chooses to take.
It's true that Tesla is working on a self-driving car. But this car probably didn't even have the hardware required for that, and definitely was not in self-driving mode.
(*) Is it even true that it's common in Russia for tow trucks to block part of a lane without first blocking the entire lane (cones? flares? another vehicle?)? I'm sorry, but that's an incredibly dangerous thing to do. I doubt it's common in Russia, though maybe I'm underestimating how underdeveloped Russia is in terms of rules of the road. To the extent it is common, I bet crashes are common as well. You don't block part of a lane on a motorway. When you do that, crashes like this happen. As you noted yourself, if the entire lane was blocked, it would have caused a traffic jam. And that means this crash wouldn't have happened.
brad
Sun, 2019-08-25 12:32
How good should it be
I agree that a Tesla shipped prior to March does not have the hardware. In fact, I don't think the current Tesla has the hardware either; the newer ones have a much better processor, and while Tesla formerly said the old hardware was enough, it now admits it was not.
And, presuming this Tesla is put on wifi and gets software updates, it's running the version 9 autopilot, which is not the FSD version. That's all clear.
The question at hand is: does the car today show signs of being only a few months of development away from being ready, even if given new hardware? Accidents like this strongly suggest not, and that's worthy of comment.
It is not surprising that Tesla Autopilot or another similar product might miss reacting to that truck. It is surprising that a company whose product is at that level says that very soon it will ship something called full self driving.
FKA
Mon, 2019-08-26 07:37
Three things
1) There's no significant evidence that Tesla is only a few months away from releasing an autonomous vehicle.
2) This accident is neither evidence for nor against that. If Tesla is only a few months away from releasing an autonomous vehicle, the vehicle involved in this accident was not running the software that will be used for that autonomous vehicle. I think this is the key point, and I'm not sure why you disagree with it: You don't produce an autonomous vehicle by simply perfecting ADAS. It doesn't work that way. It can't work that way. Any software that runs a Tesla as an autonomous vehicle will be very different from Autopilot. It is possible that Tesla is only a few months away from releasing that software, though there is little evidence for it (and the main evidence of it is self-serving claims).
3) There's nothing at all surprising about it. See 1 & 2. Like many claims that Musk has made, it's not true; and even if it was true, an autonomous vehicle is a different product from the one involved in the crash.
--
I do think this accident shows that there's a big difference between autopilot today and an autonomous vehicle. But then, I think the YouTube videos by "Tesla Driver" already show that. The driver in those videos has to take over numerous times in every episode.
brad
Mon, 2019-08-26 11:12
A few months away
I agree with you that it is unlikely they will pull it off, but the big "evidence" is Elon Musk declaring that he is certain that feature-complete FSD will be released this year. And there's more: there are customers out driving it in beta, and Tesla has been showing it off.
Again, I know that current version Autopilot isn't the FSD release. But then, current Waymo isn't their real deployment release either. You judge their quality and capabilities by how close they appear to be. FSD software won't be as different from Autopilot as you say, that's the point of Tesla's incremental approach.
In particular, my main point is that Tesla knows that stalled vehicles in lanes are a problem for it. This accident shows they are still having problems with this known issue -- presuming the car was up to date. It is not sufficient to say, "They don't know Russian tow trucks." They are claiming a general drive-anywhere vehicle, and new styles of tow trucks show up everywhere from time to time.
FKA
Tue, 2019-08-27 06:40
When was the last time Musk said it was going to be this year?
Musk has declared lots of things. I don't put much significance in his declarations.
Waymo isn't testing their real deployment release? That's strange. I assume Tesla is testing their real deployment release of FSD (*), just not in cars being driven in Russia by non-employees driving cars without the hardware required for FSD.
FSD software will be more than just Autopilot plus bug fixes. There are lots of things that make that clear. The need for different hardware, for instance, means that they must have two different branches of code. There's just no way features that take advantage of the new hardware could ship as mere bug fixes to cars running the old hardware. They have already said that the new hardware lets them use more cameras. Obviously, using more cameras would mean that they would see more things. It's quite reasonable to guess that the additional active cameras might help them see trucks parked on the side of the road, or might help them see exactly how much room there is to swerve to the right. Furthermore, the plan is to sell Autopilot-only and to sell FSD. They can't do that if FSD is just a bug-fixed version of Autopilot. You can't charge more for a bug-free version of your vehicle; they'd get sued into oblivion for every crash caused because the driver didn't pay for the bug fixes. (Not to mention they have said that safety-related fixes will always be free. Of course they say that; the law requires them to do that.)
They're not claiming that the vehicle involved in the crash was a general drive-anywhere vehicle. Autopilot is not a general, drive-anywhere technology. It isn't even allowed to switch lanes without human confirmation in most areas of the world. It's unclear if EU regulations would even allow the car to swerve out of its lane without human confirmation in any situations, let alone one where there is a vehicle in that other lane. And yes, I realize the car wasn't in the EU. But AIUI it was in a jurisdiction where it wasn't allowed to be sold at all. I'm not sure what model it was, but it may very well have been an EU model.
(*) I also suspect that FSD is going to be just the misleading marketing name of something that isn't true autonomy, just like Autopilot was. But it may very well avoid crashes like the one that happened in Russia.
--
You know how software development works. Let me ask this: Do you agree that there's a branch of code that only runs on the new hardware? Do you agree that the vehicle in the crash was not running that branch of code? Do you agree that it's possible that this issue was fixed in the FSD branch, but that the fix wasn't backported? (Consider that maybe the fix couldn't be backported because it requires too much processing power, which would be taken away from other critical things if it ran on the old hardware. That's quite possible, no?)
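To illustrate the backport point (this is purely hypothetical; nothing here describes Tesla's actual codebase), think of perception features gated by a per-hardware compute budget:

```python
# Hypothetical illustration: features gated by hardware compute budget.
# All names and numbers are invented; this is not Tesla's code.
COMPUTE_BUDGET = {"hw2.5": 0.12, "hw3": 1.0}   # normalized budget, assumed

FEATURE_COST = {                                # assumed fractions of budget
    "basic_lane_keep": 0.05,
    "dense_depth_from_motion": 0.40,
    "all_camera_fusion": 0.50,
}

def enabled_features(hardware):
    budget = COMPUTE_BUDGET[hardware]
    return [f for f, cost in FEATURE_COST.items() if cost <= budget]

print(enabled_features("hw2.5"))  # ['basic_lane_keep']
print(enabled_features("hw3"))    # all three features fit
```

On this model, a fix that depends on, say, dense depth from motion simply doesn't fit in the old hardware's budget, so it can't be backported without taking compute away from something critical.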
brad
Tue, 2019-08-27 12:10
The last time
It's on the Tesla web page, so they say it every day.
I don't imagine that FSD is just a minor tweak to autopilot. Or even a major tweak to autopilot. It will be (whenever it actually comes) a much more complex product. However, it will very likely contain autopilot's tools as core components. The classifiers. The planner. The prediction engine. The world segmentation and mapping. The radar interpretation. Tesla has said they are taking an incremental approach. That means they are not discarding Autopilot and doing a full rewrite. Instead it means that the components of Autopilot will be in FSD. Yes, the networks will be much larger thanks to the new hardware they can depend on. However, the new processor is the only new hardware they have spoken of.
The larger networks will be generally better, but work still continues, and should, on Autopilot for the 2.5 processor. That includes crucial work, like handling things like this tow truck.
You are correct that FSD is indeed going to be just a marketing name. This is an open secret, not really even a secret, and the rest of the people in the industry are quite annoyed that Tesla is going to muddy the waters and push them to call their products things like "real full full self driving" to distinguish them from what is a city autopilot.
Of course the vehicle in the crash does not have the most advanced code. While they develop their "FSD" they are also actively developing Autopilot. This should involve both backporting what they can and a focus on solutions to the top tickets on the problem list. And hitting stopped vehicles should be super duper high on that list.
FKA
Tue, 2019-08-27 18:50
The webpage does not say they'll be autonomous this year
Is there a particular page you're referring to? The order page says, "The currently enabled features require active driver supervision and do not make the vehicle autonomous."
Autopilot is lane keeping and adaptive cruise control. It doesn't even include switching lanes or taking interchanges; that is part of FSD. As such, I would not say handling this tow truck is crucial for Autopilot. It's not part of lane keeping, and adaptive cruise control is known not to work well when there's a non-moving obstacle in the lane, especially at highway speeds. This crash only happened because both the driver of the Tesla and the driver of the tow truck did something stupid. There's only so far a car can go to protect people from themselves.
It's not clear how many of the "core components" of Autopilot will be used in FSD. It's not even clear how many of the tools you list are even used by Autopilot, which is a very simplistic system. Considering the need for more hardware, in order to process more sensors simultaneously, I doubt there will be much overlap at all between the Autopilot of today and the version of the software that will support autonomous driving. You say there's no new hardware other than the processor, and that's true in the sense that the car already had the hardware. But most of the cameras are not used by Autopilot in cars with the old processor. So while there are not 6 new cameras being added to the car, from the perspective of the software, there are. This completely changes things.
You say Tesla has said they are taking an incremental approach. I'd be interested in seeing the context of that.
Yes, you're right: It's not a secret that FSD, at least in the initial release, will not make the vehicle autonomous. It even says it right on the order page.
I agree the name is misleading. I also argued that the name Autopilot was misleading. You seemed to disagree with me about that. But yeah, the name is misleading. I'm surprised they're getting away with that.
--
Addendum: From what I can gather, Autopilot currently uses two cameras, the main and the narrow-angle forward-facing ones. Imagine if your head were fixed facing forward and you didn't have any mirrors; getting into this sort of crash might be understandable. When I said earlier that the Tesla didn't "see" the tow truck until it was too late, what likely happened is that the Tesla didn't "see" the tow truck at all. It likely first sensed it with the ultrasonic sensors.
FSD on Hardware 3.0 will use all eight cameras. It will process at 2,300 FPS vs. 110 FPS for Hardware 2.5. One of the additional six cameras is the wide forward facing camera. It will likely be useful in spotting the location of cars (and, probably more commonly, though not often on motorways, bicycles) in these sorts of locations. Additionally, automatic lane changes will help get the car out of the lane long before these situations arise in the first place.
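Taking those FPS figures at face value (the totals are from the paragraph above; the even split across cameras is my assumption), the per-camera frame budget changes dramatically:

```python
# Rough per-camera frame budgets implied by the totals above.
# The even split across active cameras is an assumption.
hw25_fps, hw25_cams = 110, 2    # Hardware 2.5: two active cameras (per above)
hw3_fps, hw3_cams = 2300, 8     # Hardware 3.0: all eight cameras

print(hw25_fps / hw25_cams)  # 55.0 frames/s per camera
print(hw3_fps / hw3_cams)    # 287.5 frames/s per camera
```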
Of course, there will still be crashes sometimes. When people do stupid things on motorways, crashes are sometimes unavoidable. Any idea what the speed was of the Tesla in this crash, prior to the application of the brakes? This particular crash is somewhat understandable, but a similar situation on a road that is shared between cars and bicycles is much more common (a situation where the view of a bicycle on the right hand side of a lane is blocked by a car until the car swerves out of the way at the last second).
brad
Tue, 2019-08-27 18:50
On the web site:
So yes, "automatic driving on city streets" will require supervision, and is, if you read deeply into what Elon said, the thing he was calling feature complete full self driving. Of course, he's the only one calling it that.
Autopilot (especially Enhanced Autopilot which is what older customers have, some of whose features were moved into FSD for new customers) includes automatic lane change and various other things, and yes, they use all those components.
I am not sure about your claim that the other cameras are not used by Enhanced Autopilot. Since I got my Tesla in 2018, when they had no other product, it has always used those cameras to track other vehicles around you. It certainly uses them for lane changes.
FKA
Tue, 2019-08-27 19:36
Elon
Yeah, "automatic driving on city streets" requires supervision. It's not autonomous driving. (And this crash wasn't on a city street anyway.)
Elon says crazy things. News at 11.
How do you know it's using the cameras and not the ultrasonic sensors?
I'll try to find some sources for the fact that they're not using all the cameras in the older hardware. I believe one of them is on this very site, from when you announced the new hardware here.
--
Wikipedia says it's using two, but Wikipedia's source is from 2017: https://electrek.co/2017/03/30/tesla-autopilot-2-0-camera-8-1-update/
I also found some outdated text from Tesla that said that EAP used 4 cameras and FSD used 8. However, now that EAP is gone, I'm not sure what this means. I give up. Maybe current Autopilot uses 2, maybe it uses 4, maybe it even uses 8 (though I doubt it). Either way, looking at the video, it's doubtful that any of the cameras are what picked up the truck. The ultrasonics most likely did that.
Anonymous
Wed, 2019-08-14 19:35
There is conflicting information
There is conflicting information about whether Autopilot was on or not. Plus, there is no info on whether it had been updated in recent months, or whether the car had it at all.
Google this
Как рассказал владелец Tesla, в машине был включен не полноценный автопилот, а режим "ассистент водителя ".
(Translation: As the Tesla owner explained, the car did not have the full Autopilot engaged, but rather a "driver assistant" mode.)
brad
Wed, 2019-08-14 20:26
Autopilot and "driver assist mode"
Most sources have quoted the driver as saying he was in Autopilot. Autopilot is "driver assist mode," though Tesla also lets you turn on just adaptive cruise control, and it always has forward collision warning (which may or may not have gone off) and forward collision braking (which does appear to have gone off just before the crash). These are considered driver assist features.
Tesla, when it gets logs from a vehicle, knows if it was in Autopilot at the time of a crash. However, since this vehicle probably did not communicate with the cellular network in Russia, and was burned to a cinder, we will probably never get a firm answer on that.
Chuck
Wed, 2019-08-14 21:57
Current autopilot predictive of FSD?
I'm curious how relevant today's autopilot is in predicting the next-gen FSD that Tesla plans to release. From what I gather, the order of magnitude greater processing power in the chip will let the car use the full camera resolution and accelerate/improve decision making.
The bigger question is: if robo-driving (Tesla or other) can eliminate all the dumb human accidents, will it be OK if it messes up an occasional corner case while being 5x (pick your number) safer on average? It will be interesting to see when society is ready to make that trade-off. We will never eliminate all weird accidents, so when will we get used to them?
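To put toy numbers on that trade-off (every figure below is assumed, purely for illustration):

```python
# Toy arithmetic on the "5x safer on average" trade-off.
# All numbers are assumptions for illustration only.
human_crashes_per_100m_miles = 190    # assumed ballpark human crash rate
safety_multiplier = 5.0               # "pick your number"
fleet_miles_per_year = 1_000_000_000  # assumed robocar fleet mileage

robo_crashes = (fleet_miles_per_year / 100e6
                * human_crashes_per_100m_miles / safety_multiplier)
print(f"~{robo_crashes:.0f} robocar crashes per year")  # ~380
```

Even at 5x safer, a big fleet still produces hundreds of crashes a year, and many will be exactly the weird corner cases humans rarely hit, which is why the acceptance question is a real one.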
David
Thu, 2019-08-15 00:03
No matter how much better the
No matter how much better the processing speed gets, it's going to be nowhere near the depth that lidar offers. One mode of sight vs. 12 makes a big difference. If anything, as processing gets better, his sensors vs. lidar is going to look like a Pinto vs. a Ferrari.
David
Wed, 2019-08-14 23:57
Cost Savings>Safety
This can simply be summed up as Elon's attempt to cut corners and do things his way. Tesla is the ONLY company not using lidar for its Level 4 and 5 autonomy efforts. Lidar is 10x more effective and safer for consumers at scale; however, it is more expensive. Instead of adopting the technology, he is running an experiment with his customers to see if he can get away with a cheaper solution. By selling his cars on a global scale and collecting significantly more data, he believes his tech should be able to function in that capacity. As he collects more data his cars will get better, but there is only so far those sensors can go. Their ability to collect quality data is just not good enough to perform pinpoint-accurate perception and prediction analytics at 1/20000 of a second. It plateaus, and his consumers pay the price. He has decided to gamble lives in his pursuit.