Car and Driver evaluates autopilots, and other news.

In a recent article, Car and Driver magazine compares four highway autopilot systems: those from Tesla, Mercedes, BMW, and Infiniti. They test on a variety of roads, and spoiler: the Tesla wins by a good margin in several categories.

It's a pretty interesting comparison, and a nicely detailed article. They drove a variety of roads, though the reality is that none of these autopilots are of much use off the highway, nor are they intended to be as yet. Each system will perform differently on different roads; people report a much better score for the Tesla on Highway 280, the highway closest to Tesla HQ.

Still, it should be sobering for people who want to compare Google's reported rate of one intervention to prevent an accident every 70,000 miles (or one software anomaly every 5,300 miles) with an average of one intervention every 2 miles on the Tesla and two per mile on the Infiniti.
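To make the scale of that gap concrete, here is a minimal sketch in Python that normalizes each report to miles per intervention and compares it against the Tesla test figure. The numbers are the loose ones quoted above, not official statistics:

    # Rough normalization of the intervention rates quoted above to a
    # common miles-per-intervention scale. These are the loose figures
    # from the paragraph, not official statistics.
    rates = {
        "Google (accident-preventing intervention)": 70000,
        "Google (software anomaly)": 5300,
        "Tesla (Car and Driver test)": 2,
        "Infiniti (Car and Driver test)": 0.5,  # "twice a mile"
    }

    tesla = rates["Tesla (Car and Driver test)"]
    for system, miles in rates.items():
        print(f"{system}: one intervention per {miles:g} miles "
              f"(~{miles / tesla:g}x the Tesla test figure)")

The point is the orders of magnitude: Google's reported safety-intervention rate is tens of thousands of times better than the rates measured in this road test.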

Other News notes:

  • Google is expanding testing to Kirkland, Washington -- hoping for some heavy rain, among other things.
  • The California DMV hearings were contentious. You can hear a brief radio call-in debate between me and one of the few people in favour of the regulations on KPCC's "AirTalk". Google has threatened that if the regulations are passed as written, it plans to deploy first outside California -- and it probably means it.
  • A small autonomous shuttle bus is doing test runs in the Netherlands, joining several other projects of this sort.
  • Porsche has come out against self-driving. Who would have thought it?
  • Baidu and Jaguar/Land Rover are both upping their game. While off-road driving probably won't be automated any time soon, a vehicle that takes you to the countryside, where you then take the wheel, is an appealing idea.
  • In Greenwich, the self-driving shuttle pilot will use vehicles based on the Ultra PRT pods from Heathrow. Ultra's pods have always been wheeled cars, but they needed a dedicated guideway; today they can be modified to run without one.
  • Steve Zadesky, supposedly the lead of Apple's unconfirmed Project Titan, has left Apple. Rumours suggest a culture issue. Hmm.
  • The Isle of Man is tiny, but it is its own country -- and it is giving serious consideration to becoming a robocar pilot location. Last year I had some talks on the same topic with one of the Channel Islands. There are advantages to having your own country.

Comments

Of course, when you want to test a system you operate it near its margin. But when you use a system properly you don't spend much time at the margins. The course they drove to stress the four cars makes sense if you're trying to induce failures, but it's not a representative sample of where those systems are intended to be used, or where they actually get used.

I can't speak for the other three cars, but the Tesla's performance is much better than the figure "one intervention every two miles" suggests. I've logged thousands of highway miles in auto drive on my Tesla, and I'd estimate interventions at more like one every two hundred miles. Even that figure is misleading, as those interventions were, almost without exception, things I did for my own peace of mind, not situations where an incident would have occurred had I not intervened. Auto drive is a new technology and I'm still getting used to it, so if a situation makes me uncomfortable I tend to drive manually. Out of curiosity I sometimes push the limits, and the car always exceeds my expectations.

Last week I encountered a long stretch of fresh blacktop that had been laid down on I-5, with no lane markers, no barriers, and no other vehicles in my field of view. I was curious how far the car would continue before it found the lack of external location references unacceptable and forced me to take over. It never did -- after perhaps half a mile the lane markings returned and the car continued on without complaint. I assume it must have used GPS during that stretch. I was shocked.

I wish the referenced article had provided more information about the testing and, in particular, which version of the software was being tested on the Tesla. It would also be interesting to know the distribution of interventions: did the Tesla ever fail when used in its recommended highway environment, or were all of its interventions at intersections on surface streets? The recent 7.1 release is a substantial improvement over the original 7.0 in auto-drive performance, and while I have no idea what magazine lead times are like, I wouldn't be surprised if they were testing the older, weaker version of the software. I also notice they described their car as having the tech package upgrade and included it in their estimate of over $4000 for the cost of Tesla's auto-drive feature. The tech package has not been offered for some time now, and it was never required for auto-drive (the hardware has always been included in the price of every model of the car). The incremental cost of the software activation is only $2500.

I understand the need to make an article entertaining and succinct, and I appreciate that someone is doing these kinds of comparisons, but the errors in the article, combined with the unrepresentative test route, almost seem intended to minimize the difference between the Tesla and the other vehicles, even though they awarded the Tesla best-in-class performance.

They should have broken down the results by road type, since on the lesser roads they were going beyond what the products are designed for. Of course, they are Car and Driver. I hope we will see more detailed comparisons on different types of roads.

Though oddly, I think that when you get to the one-intervention-per-day level you actually enter a danger zone, where people trust the system more than they should. Cruise control requires constant intervention at the steering wheel, and non-adaptive cruise control requires pedal intervention every few minutes too, even in light traffic, so you never imagine you could ignore the road. With Tesla's autopilot, you do imagine that.

I totally agree that we're in uncharted territory. Tesla's boldness here is quite striking. It's difficult to imagine the established carmakers ever attempting something like this, and I suspect they will have a hard time coming up with a response to Tesla no matter how much pressure they start to feel.

Of course it's possible that people will make new categories of errors when complacency sets in, but it is also true that the old categories of errors go away as people become expert with the system. In my experience the disengage situations have always come with plenty of warning. Once you've used the system for a while you can tell when the car is becoming 'uncertain', and you find yourself anticipating these occurrences based on environmental conditions. With experience I think I will become very adept, even without trying, at knowing exactly how much margin the system has in whatever circumstances are coming up. I think most other drivers will experience the same growth and will end up using the system where it works well, and rarely otherwise. It's not a useful feature if you have to constantly intervene, after all, and people will become adept at knowing where it's useful and where it is not. I think this will be reflected in the accident statistics.

It feels to me a bit like riding a horse. A horse is pretty good at walking and can generally do a lot of things unsupervised, but the rider still has to maintain the situational awareness to avoid situations the horse cannot cope with well. With experience, riders learn to anticipate the often complex sets of conditions under which the horse won't behave optimally, even when those situations are quite obscure. This doesn't mean riders have to be continuously vigilant -- their subconscious will tell them that a manicured lawn might mean the sudden activation of a sprinkler that can spook the horse. In return for a little general caution on the rider's part, the horse takes over the detail work of planting its feet, and in general the experience is quite relaxing and safe.

We're likely to have a good idea of the overall merits by the end of 2016. By then there will be a decent-sized population of drivers with many hours logged on the system and, with Tesla capturing logs of every event, it will be possible to state with some confidence what the net effect on vehicle safety is. If system performance continues to grow by leaps and bounds, I think the question will be how much better it is than human drivers, not whether it is better. I expect a quite substantial reduction in accidents when the system is used in its recommended manner, though of course there is plenty of room for surprises in either direction.
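As a rough illustration of the kind of analysis those fleet logs would permit, here is a sketch in Python using a simple Poisson confidence interval on the accident count. Every number in it is a made-up placeholder, not Tesla data:

    import math

    # Hypothetical placeholder numbers -- NOT real Tesla log data.
    autopilot_miles = 50_000_000   # assumed fleet miles driven on autopilot
    accidents = 40                 # assumed accident count in those miles
    human_rate = 1 / 500_000       # rough human accident rate per mile (assumption)

    observed_rate = accidents / autopilot_miles

    # Normal approximation to a 95% Poisson interval on the accident count.
    half_width = 1.96 * math.sqrt(accidents)
    low_rate = (accidents - half_width) / autopilot_miles
    high_rate = (accidents + half_width) / autopilot_miles

    print(f"observed: one accident per {1 / observed_rate:,.0f} miles")
    print(f"95% interval: one per {1 / high_rate:,.0f} to {1 / low_rate:,.0f} miles")
    print(f"human baseline: one per {1 / human_rate:,.0f} miles")

With enough logged miles the interval tightens, which is exactly why a fleet-wide log is so much more persuasive than any individual driver's anecdotes.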
