Rumoured Google-Ford deal, low-end robocars, Tesla backslide and other news
Yahoo Autos is reporting rumours that Google and Ford will announce a partnership at CES. Google has always said it doesn't want to build the cars, and Ford makes sense as a partner -- big, but with only modest R&D efforts of its own, and frankly a brand that needs a jolt of excitement. That means it will be willing to work with Google as a partner that calls many of the shots, rather than treating Google as a mere supplier, which would get to call few of them. Ford has the car-making skills, global presence and scale to take this to any level desired. Besides, if Google really wanted, it could buy Ford with the cash it has on hand. :-)
This is combined with the announcement of what I predicted earlier in the year -- that Alphabet will spin out the self-driving car project (known internally as "Chauffeur") into its own corporate subsidiary.
While the big story of the week was the California regulations, here are some other items worthy of note, and non-note.
No, a whiz-kid hasn't duplicated what the big labs did
There was a fair bit of press about the self-driving car efforts of George Hotz. Hotz modified an Acura ILX to do some basic self-driving. It's a worthwhile project, and impressive for a solo operator, but the press hype was so large that Tesla even issued a "correction" which is pretty close to spot-on.
I don't know Hotz, and I know nothing about his effort beyond what's in the story, but what is described was viewed as a "solved problem" years ago by the major teams. What's interesting about his effort is how much less work is required to do it today: the sensors are much cheaper, the computing is cheaper and smaller, the AI tools and other software tools are much better and more readily available, and the cars are easier to interface with.
In particular, Hotz gets to take advantage of two things that were not easy for early teams. Today's cars are all controlled by digital signals on internal controller area network (CAN) buses, and many cars are very close to "drive by wire" if you can use that bus. The problem is that most car vendors are very protective of their bus protocols, and also change them over time, so it's hard to make a production system based on unsupported protocols learned via reverse engineering. Better to take the hard but certain route of pretending to be the sensors in the brake and gas pedals, and wiring directly onto the steering motor.
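For a sense of what using that bus looks like in practice, here is a minimal sketch with the python-can library. It is purely illustrative: the channel name, frame ID and payload are invented placeholders, since the real vendor-specific IDs are exactly the unsupported part that has to be reverse engineered.

    # Minimal sketch of listening to (and sending on) a car's CAN bus
    # with python-can. Assumes a CAN adapter exposed as a SocketCAN
    # interface on Linux. The frame ID and payload below are made up.
    import can

    bus = can.interface.Bus(channel='can0', bustype='socketcan')

    # Passively log traffic -- this is usually where reverse
    # engineering of a vendor's protocol starts.
    for _ in range(100):
        msg = bus.recv(timeout=1.0)
        if msg is not None:
            print(f"id=0x{msg.arbitration_id:03x} data={msg.data.hex()}")

    # Actively sending a command frame (e.g. a steering request) would
    # look like this -- 0x123 and the payload are hypothetical.
    cmd = can.Message(arbitration_id=0x123, data=[0x00, 0x10],
                      is_extended_id=False)
    bus.send(cmd)

Passive logging is relatively safe; it is the command frames, built on protocols the vendor never documented and may change at any time, that make a production system on this foundation so fragile.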
The other rising trend of interest is the surge of capability in convolutional neural networks and "deep learning" algorithms. Google loves these tools and just open-sourced the TensorFlow package to spread them out into the world. This is starting to affect the conclusions of my article from several years ago on the question of whether cameras or lidar will be the primary sensor in a robocar. In that essay, I concluded that computer vision is still too uncertain a quantity to predict, while cheap lidar is a safe and easy prediction. Computer vision is improving faster than expected, though it's still not there yet. It is this, I think, that gives Elon Musk the (still probably false) confidence to declare lidar the wrong direction.
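As an aside on how accessible these tools have become, here is a toy convolutional classifier in TensorFlow. The layer sizes and the ten-class output are arbitrary choices for illustration, not any team's actual vision stack, and the sketch uses TensorFlow's present-day Keras interface rather than the original 2015 API.

    # A toy convolutional network for classifying small camera images.
    # Everything here (input size, layers, class count) is illustrative.
    import tensorflow as tf

    model = tf.keras.Sequential([
        # Two rounds of convolution + pooling to learn local features
        tf.keras.layers.Conv2D(32, 3, activation='relu',
                               input_shape=(64, 64, 3)),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation='relu'),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, activation='softmax'),  # e.g. 10 object classes
    ])
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    # model.fit(images, labels, epochs=5)  # train on labelled images
    model.summary()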
Tesla steps back
Rumours also say that Tesla's latest update scales back the autopilot capabilities, including limiting the autopilot to the speed limit plus 5 mph on smaller roads, something that will surely upset customers.
Many people I have spoken with have felt that Tesla's early autopilot release was reckless. Elon Musk perhaps agrees, because he has noted that videos show customers clearly doing unsafe things with the autopilot. The Tesla autopilot handles most highway conditions, and in fact lulls people into thinking it handles them all. In reality it is a system that needs constant monitoring, like a good cruise control. Some of us have feared it's a matter of when, not if, a Tesla on autopilot has an incident.
One comment from Tesla has particularly concerned me. It is said that the Teslas are improving every week based on learning from data gathered from all the Teslas out running on autopilot, perhaps a million miles a day. That is an impressive and useful resource, and Tesla has even said it would love for the cars to learn every day. Learning is good, but that rate of learning strongly suggests that no human quality assurance is being done on the results of the learning -- the QA is being done by the customers. I fear that is not a safe approach at this stage of the technology.
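To make the QA concern concrete: a cautious pipeline would gate every retrained model behind a fixed, human-curated regression suite before shipping it to cars. The sketch below is purely illustrative of that idea -- all names and thresholds are invented, and nothing here describes Tesla's actual process; the worry is precisely that a weekly fleet-wide learning loop leaves little room for such a step.

    # Illustrative release gate: ship a retrained model only if it
    # still passes a curated suite of scenarios with known-correct
    # outcomes. All names and the threshold are invented.
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Scenario:
        inputs: object    # e.g. a recorded sensor log
        expected: object  # the outcome a human judged correct

    def passes_regression_suite(model: Callable, suite: List[Scenario],
                                min_pass_rate: float = 0.999) -> bool:
        # Pass only if nearly every curated scenario comes out right.
        passed = sum(1 for s in suite if model(s.inputs) == s.expected)
        return passed / len(suite) >= min_pass_rate

    def release(candidate, current, suite):
        # Keep the already-validated model unless the retrained
        # candidate clears the bar.
        return candidate if passes_regression_suite(candidate, suite) else current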
Many more teams and entrants
Baidu has stepped up their efforts and will now also work on buses. They have also stepped up their partnership with BMW. Samsung has entered the fray, as has Kia (Hyundai announced big plans earlier this year). Tata Group has also announced plans, but through the Tata Elxsi design division, not Tata Motors. (Mahindra earlier offered a prize for robocar development in India.)
Comments
Tasha Keeney
Wed, 2015-12-23 07:58
Map collecting - LiDAR v. cameras
Hi Brad,
If Tesla is collecting its mapping data with a camera-based system, and it were then to start using LiDAR as the primary vision component, would it be easy to run off the camera-collected data? Or do you suspect some re-mapping would be done with LiDAR?
Thank you for your insight!
brad
Wed, 2015-12-23 10:10
Hard to map
It depends on their approach. I don't think Tesla is building fine-detail maps in an abstract sense, but I could be wrong. Some of their data might percolate through -- and since you don't have to get rid of your cameras to use LIDAR, the data are useful for the cameras in any event. Tesla has not said what they are training with the data from operating cars, beyond reports of behaviour (like learning to handle curves and exits differently). It is probably helping improve their classifiers (software to tell what things are in the camera view).