California DMV regulations may kill the state's robocar lead

Be careful what you wish for -- yesterday the California DMV released its proposed regulations for the operation of robocars in California. All of this sprang from Google's request that states start writing such regulations to ensure its cars would be legal. California's DMV took much longer than expected to release them, which Google found quite upsetting.

The testing regulations did not bother too many, though I am upset that they effectively forbid the testing of delivery robots like the ones we are making at Starship, because the test vehicles must have a human safety driver with a physical steering system. Requiring that driver makes sense for passenger cars but is impossible for a robot the size of a breadbox.

Needing a driver

The draft operating rules effectively forbid Google's current plan, making it illegal to operate a vehicle without a licensed and specially certified driver on board and ready to take control. Google's research led them to feel that having a transition between human driver and software is dangerous, and that the right choice is a vehicle with no controls for humans. Most car companies, on the other hand, are attempting to build "co-pilot" or "autopilot" systems in which the human still plays a fundamental role.

The state proposes banning Google-style vehicles for now and drafting regulations for them in the future. Unfortunately, once something is banned, it is remarkably difficult to un-ban it. That's because nobody wants to be the regulator or politician who un-bans something that later causes harm that can be blamed on them. And these vehicles will cause harm, just less harm than the people currently driving are doing.

The rules forbid unmanned operation and require the driver/operator to be "monitoring the safe operation of the vehicle at all times and be capable of taking over immediate control." This sounds like it certainly forbids sleeping, and might even forbid engrossing activities like reading, working or watching movies.

Special certificate

Drivers must not just have a licence; they must also have a certificate showing they are trained in the operation of a robocar. On the surface, that sounds reasonable, especially since the hand-off has dangers that training could reduce. But in practice, it could mean a number of unintended things:

  • Rental or even borrowing of such vehicles becomes impossible without a lot of preparation and some paperwork by the person trying it out.
  • Out-of-state renters may face a particular problem, as they can't have California licences. (Interstate law may, bizarrely, let them get by without the certificate while Californians would be subject to this rule.)
  • Car sharing or delivered car services (like my "whistlecar" concept or Mercedes Car2Come) become difficult unless sharers get the certificate.
  • The operator is responsible for all traffic violations, even though several companies have said they will take responsibility. They can take financial responsibility, but can't help you with points on your licence or criminal liability, rare as that is. People will be reluctant to assume that responsibility for things that are the fault of the software in the car they use, as they have little ability to judge that software.

No robotaxis

With no robotaxis or unmanned operation, a large fraction of the public benefits of robocars are blocked. All that's left is the safety benefit for car owners. This is not a minor thing, but it's a small part of the whole game (and active safety systems can attain a fair chunk of it in non-robocars).

The state says it will write regulations for proper robocars, able to run unmanned. But it doesn't say when those will arrive, and any promises about that will be dubious and non-binding. The state was very late with these regulations -- which is actually perfectly understandable, since not even the vendors know the final form of the technology -- and it may well be late again. Unfortunately, there are political incentives for delay, perhaps indefinite delay.

This means vendors will be uncertain. They may know that someday they can operate in California, but they can't plan for it. With other states and countries around the world chomping at the bit to get vendors to move their operations, it will be difficult for companies to choose California, even though today most of them have.

People already in California will continue their R&D in California, because it's expensive to move such things, and Silicon Valley retains its attractions as the high-tech capital of the world. But they will start making plans for first operation outside California, in places that have an assured timetable.

It will be less likely that somebody would move operations to California because of the uncertainty. Why start a project here -- which in spite of its advantages is also the most expensive place to operate -- without knowing when you can deploy here? And people want to deploy close to home if they have the option.

The car companies, whose prime focus today is on co-pilot or autopilot systems, may not be bothered by this uncertainty. In fact, it's good for their simpler early goals because it slows the competition down. But most of them have also announced plans for real self-driving robocars in which you can act just like a passenger. Their teams all want to build them. They might enjoy a breather, but in the end, they don't want these regulations either.

And yes, it means that delivery robots won't be able to go on the roads, and must stick to the sidewalks. That's the primary plan at Starship today, but not the forever plan.

California should, after receiving comment, alter these regulations. They should allow unmanned vehicles that meet appropriate functional safety goals to operate, and they should set a real calendar date for when that will happen. If they don't, they won't be helping to protect Californians. They will take California from being the envy of the world, the place that has attracted robocar development from all around the planet, to just another contender. And that won't just cost jobs; it will delay the deployment in California of a technology that will save the lives of Californians.

I don't want to pretend that deploying full robocars is without risk. Quite the reverse, people will be hurt. But people are already being hurt, and the strategy of taking no risk is the wrong one.

Comments

A disappointing outcome and a little ironic coming from a department with the motto "Driving Change".

The ruling suggests that the discovery by Google, and more recently by Tesla, that humans are likely to be too distracted to be of much use when instantly needed has been ignored. At the very least this ruling should have a sunset clause built into it, lest it become a modern version of the red flag traffic laws. It is critical that driverless cars begin with safety foremost in mind, so starting off with low-speed driving in carefully selected areas and only expanding as confidence is gained seems a more logical approach. It is inevitable that there will be some accidents; what matters is the relative safety of driverless vehicles, and this is best found by analysing large amounts of data from real-world trials. Driverless cars have the potential for some huge environmental and safety benefits; it is possible that a risk-averse approach ultimately creates the perverse outcome where these are unnecessarily delayed.

"It will be less likely that somebody would move operations to California because of the uncertainty. Why start a project here — which in spite of its advantages is also the most expensive place to operate — without knowing when you can deploy here. And people want to deploy close to home if they have the option." - I do agree.

We are in a meta-trolley problem.

Do nothing, and over 30,000 people will die in car accidents each year in the US alone. Globally, over 1,000,000 people will die and tens of millions will be injured. Replace all cars with self-driving cars and you'll reduce the deaths by at least an order of magnitude, but state regulators would be at risk for allowing a change to the status quo.

The moral imperative is clear: reduce casualties as fast as possible by introducing self-driving cars as soon as they are ready (i.e., as soon as SDCs kill fewer people than human drivers, an extremely low bar). To delay SDCs for even a day is to sentence hundreds to death.
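As a rough back-of-envelope illustration of that arithmetic, here is a minimal Python sketch using the figures cited above; the 90% reduction factor is an assumption standing in for "at least an order of magnitude", not a figure from the post.

```python
# Back-of-envelope: lives at stake per day of delay in deploying SDCs.
# The 90% reduction factor is an illustrative assumption, not an established figure.
GLOBAL_DEATHS_PER_YEAR = 1_000_000   # road deaths worldwide, per the comment above
US_DEATHS_PER_YEAR = 30_000          # road deaths in the US, per the comment above
ASSUMED_REDUCTION = 0.90             # "at least an order of magnitude" read as ~90%

def lives_saved_per_day(deaths_per_year: float, reduction: float) -> float:
    """Deaths averted per day if all driving switched to SDCs at the given reduction."""
    return deaths_per_year * reduction / 365

print(f"US:     ~{lives_saved_per_day(US_DEATHS_PER_YEAR, ASSUMED_REDUCTION):.0f} lives/day")
print(f"Global: ~{lives_saved_per_day(GLOBAL_DEATHS_PER_YEAR, ASSUMED_REDUCTION):.0f} lives/day")
```

Under these assumptions that works out to roughly 74 lives per day in the US and around 2,500 per day globally.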

Honestly - I don't care how SDCs solve the trolley problem, how SDC insurance works or any of that stuff - we'll figure it out as we go along. We should make sure we don't kill thousands of people because we are stuck in the meta-trolley problem.

If the CA government were shown how much oil would be saved by driverless cars, and how many parking lots could be turned into green spaces, they would change their tune.

It is a matter of reframing the benefits, nothing more.

I strongly believe that we need to make our highway transport systems intelligent if we are to use smart cars. We can retrofit existing vehicles to have a minimum of sensors. Still, electronics on the roadway are key to getting safe automated transportation. I strongly believe as well that energy conservation, shared ownership/rent versus buy, and multi-user rides are the way to go. I think government should be involved to a degree that would probably offend libertarians, although I am not proposing the government simply own everything. Traffic is a practical problem in Silicon Valley, and I am frankly amazed that Silicon Valley Google engineers can't see what is in front of their own eyes. Maybe we have to experiment in Massachusetts first; we already have toll roads.

I have an essay in progress making the case for the opposite. The roads should be as dumb as possible, the infrastructure as virtual as possible and the cars as smart as possible. We replace our cars frequently (shared robotic taxis will only last 5 years if they are like existing taxis) while our road infrastructure gets replaced after scores of years, if we are lucky.

Self-driving cars have already moved to Texas, leaving California behind. Increasing regulation was a huge tactical mistake by the wannabe controlling individuals.

Before a robocar hands control back to a human driver, it will need to test the driver; only after the driver responds correctly would control pass back. In an emergency there is no time for a proper hand-off test/validation.
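A minimal sketch of what such a gated hand-off might look like, assuming a hypothetical alertness test and timeout; the function names and threshold are illustrative only, not from any real system.

```python
import time

# Hypothetical hand-off gate (illustrative only): control returns to the human
# only after an explicit alertness check passes within a time window.
ALERTNESS_TEST_TIMEOUT_S = 10.0   # assumed window for the driver to respond

def request_handoff(run_alertness_test, fallback_to_safe_stop):
    """Ask the human to take over; hand over control only if they pass the test."""
    deadline = time.monotonic() + ALERTNESS_TEST_TIMEOUT_S
    while time.monotonic() < deadline:
        if run_alertness_test():      # e.g. grip the wheel, answer a prompt
            return "human_in_control"
        time.sleep(0.5)
    # No valid response in time. An emergency allows no such window, which is
    # the commenter's point: the only safe default is an autonomous safe stop.
    fallback_to_safe_stop()
    return "autonomous_safe_stop"
```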

To further reduce crashes, elevate the cars above traffic on guideways. Reducing vehicle-to-vehicle interactions is how to make travel safe. Overpasses in the form of elevated guideways are one cost-effective method: self-driving could then be done at non-lethal speeds, and high speed would be done on safe elevated guideways in a grid across the city.

The only way forward is to use data-driven decisions. Study the fatality data and respond proportionately to it. Rollover in tall pickups and SUVs (8,000+ dead) would be the first thing easily fixed by edict. Edicts that say self-driving cars must leave the state are counter-productive to actual safety. They should use their edict powers to stop the slaughter, with lowering pickup truck heights as one clear, easy-to-understand example. There is NO reason trucks need to ride so high, but the manufacturers are in a race to be stupid, with no one manufacturer willing to say enough is enough and put them back near the ground where they are more useful and where they used to be.

Fire is also a killer, looking at the data. Make cars that do not burn the occupants. Self-driving cars are an opportunity to hit the reset button and stop making cars that burn out of control (520 deaths last year) in a known percentage of crashes.

Number of fatal self-driving crashes at less than 30 mph: zero.
