Mercedes and Vislab release videos of their real-road tests
Videos have been released of some real-world robocar tests. The most notable is from Mercedes.
As a nice reflection on the past, Mercedes drove the 100km route that Bertha Benz covered on the first automotive road trip 125 years ago. You will also find that this alternate video is much better at discussing the technical details of the vehicle.
The Vislab team from Parma also released video of their drive around town. As the name suggests, Vislab's research has a focus on computer vision, though this test vehicle also has 3 small LIDARs.
The Mercedes video has a lot of statements from MB engineer Ralf Herrtwich about their goal of doing this with existing sensors (primarily cameras and radar) and not LIDAR (though he does not name it), which he says is years or decades away. While I don't want to criticise the accomplishments of his team, nor in any way deny that everybody would love to be able to make a safe driving system using the most cost-effective sensors, his philosophy seems backwards to me.
First, those of us used to Moore's law think that planning to use hardware that is expensive today but which will come down greatly in price by the time things are commercialized is the obviously right strategy. It seems backwards to limit yourself to the technologies of today in planning a product for the future just because they are cheaper today. To use the metaphor of a great Canadian athlete, you skate to where the puck is going to be.
This is magnified by the fact that the problems of robocars are safety problems, not problems of cost or appearance. With safety as the dominant goal, it seems very odd to me to imagine that one would, in the first vehicles to be made, avoid using a sensor that could markedly improve safety and performance just because of cost or appearance. If the cost difference were forecast to be ridiculous, one could consider it, but it makes no sense if the cost is within the noise to early adopters. That's why Tesla is able to succeed with such an expensive car -- the early adopters are more interested in a cool, high-performance electric car than they are in the cost. The other argument that is made -- that the established sensors are more tested and robust -- has some merit, but is surely a short-term optimization.
It could be argued that attempting to build a vehicle without LIDAR is skating to where the puck is going to be in the next game. After all, there is optimism that vision and radar will be enough for safe driving some day. As we all know, humans can drive with simple vision -- even with one eye closed -- and no radar or other sensors aside from hearing. So some day, cameras and a lot of processing probably can safely drive a car, and do it with low cost hardware. But the first production robocar? Deliberately not having lasers when it's such a challenge to meet the safety goals? It seems very unlikely.
The notes on appearance are also odd to me. (It is commonly noted that research sensors like the Velodyne are big and make the car look unusual and not like a car.) We even see IEEE Spectrum enthusing that the new CMU car, unlike BOSS from the Urban Challenge, does not look like a robot. While research vehicles like BOSS were over-the-top on top, I think the reaction of early adopters is going to be quite the opposite. They will want their shiny new robocar to look distinctive and clearly different from regular cars. Prius owners reacted the same way, and there was not even much need for the Prius to have such a distinctive shape, though being more like a raindrop never hurts.
I suspect this approach is in part inspired by a marketing goal. The auto companies, not wanting to appear to be trailing Google on robocar research, are making extra effort to appear to be on a different course, and in fact ahead of Google and the rest on that path. "We're doing what the competition is doing, but we're not as far along" is not a very good press release. That would be OK if it were just for appearances -- and I'm in favour of there being many competing approaches, because any paradigm, including mine, can turn out to be wrong -- but I hope these teams really believe their approach is the best and fastest path to a safe and capable vehicle.
Here, by the way, are more details of the 33 mile trip by the GM/CMU collaboration. This vehicle has an "automotive grade" LIDAR -- meaning one of the smaller ones that is one to four planes, not the giant 64 plane Velodyne used by CMU's BOSS, Google and many others.
Thu, 2013-09-12 22:14
It's interesting that the S500 is relatively easy to upgrade to full self driving capability. I wonder how many cars like this will already be on the road in a few years when commercialization of self driving cars takes place. At that point, millions of existing cars might quickly become self driving after minor hardware upgrades and a major software upgrade.
I've submitted this post to the autonomous car community on reddit at www.reddit.com/r/SelfDrivingCars
Fri, 2013-09-13 00:37
"ready to upgrade"
It should be made clear that the Mercedes is not easy to upgrade to full self-driving capability. What is possible is to do a 100km research drive. There are no reports on how many times they tried it, and if their safety drivers needed to perform interventions along the way. Possibly they had a perfect drive, who knows, maybe even the first time.
But there is a really huge difference between doing a research drive of 100km and the real thing. A really huge difference. Orders of magnitude more work, as it turns out. They are 1/10,000th of the way there, perhaps. But what I say in my article is that they are not really even that, because the first cars won't use the stock sensors in a Mercedes S, not unless there is a real revolution in computer vision. And probably not even then, since the first car will want to be as safe as it can be (at almost any price).
Fri, 2013-09-13 07:21
Ah, my mistake. I didn't mean to say that it's currently easy to make the S500 fully self-driving for consumer use. However, would you agree that by 2017-2020 -- with the hardware and software available at that time -- it could be relatively easy to upgrade an S500 to be fully self-driving?
Mon, 2013-09-16 01:48
What is an upgrade?
I believe the cars released at the end of this decade will use lasers, unless there has been a major breakthrough in computer vision, and probably even if there has been one. So it's a change from what you see in cars today, but you could call that an upgrade.
Fri, 2013-09-13 07:25
Mercedes in no rush
It appears Mercedes is in no rush to get self driving cars into production. They appear not to be taking the disruptive threat of robocars seriously. But then you could say that about lots of the manufacturers.
Thu, 2013-09-26 07:51
The safety problem is not one that technology can solve
Technology cannot solve the safety problem because it's not actually a technology problem. It's a liability problem, a regulatory problem. What lawmakers (and insurance underwriters) want is a car that will never ever, in any possible scenario, run someone over while under autonomous control. That isn't something that LIDAR is going to fix in itself; to the extent that LIDAR lets people use autonomous cars, it will be acting as a security blanket rather than a technical requirement.
Wed, 2013-10-09 16:21
How Moore's law works in practice
There's a bit in this that's been niggling at me for a couple of weeks now, and I'm finally coming back to comment on it. I think that your reference to Moore's law up there should be tempered by what I believe is the big cost driver behind the LIDAR systems: the LIDAR depends on physically moving parts. Historically, this is not a class of component that follows Moore's law nearly as well as fixed silicon (and, of course, I typed that and realized that hard drives may be a counter-example, but the abysmal MTBF in those and the fact that we're moving to SSDs may just bolster my argument).
On the other hand, the expensive part about image processing right now is not the actual sensing unit. Cameras are cheap; processing the imagery coming in from the cameras is where the improvement lies.
So if, based on historical anecdote, I were to put a bet on the future of sensing technologies, I'd go with better understanding of straight-up images.
Wed, 2013-10-09 22:39
Moore's law changes the electronics, the lasers and many aspects of the motors. Some LIDARs spin, others move mirrors with servos and voice coils, still mostly electronic, and all of these benefit. The non-electronic parts still get cheaper from volume and DFM (design for manufacture).
Consider the DVD player, which also has spinning parts and lasers. It went from $1,000 to $40 in a decade. And people might have said, "That can never happen, it has moving parts."
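As a rough sanity check on that trajectory: $1,000 falling to $40 over ten years works out to roughly a 27% price decline per year, if we assume a smooth exponential decline (real prices only approximate this, and the exact start/end figures here are illustrative):

```python
# Implied annual price decline for the DVD-player example:
# $1,000 falling to $40 over ten years, assuming a smooth
# exponential (compound) decline.
start_price = 1000.0
end_price = 40.0
years = 10

# Annual multiplier implied by the drop: (end/start)^(1/years)
annual_factor = (end_price / start_price) ** (1 / years)
annual_decline_pct = (1 - annual_factor) * 100

print(f"Each year's price is about {annual_factor:.1%} of the prior year's,")
print(f"i.e. roughly a {annual_decline_pct:.0f}% decline per year.")
```

That's slower than classic Moore's-law doubling, but it shows that a product with motors and lasers can still ride a steep cost curve once volume manufacturing kicks in.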