Commentary on California's robocar regulations workshop

Tuesday, the California DMV held a workshop on how they will write regulations for the operation of robocars in California. They have already held meetings on testing, but the real meat will be in operation. It was in Sacramento, so I decided to just watch the video feed. (Sadly, remote participants got almost no opportunity to provide feedback to the workshop, so it looks like it's 5 hours of driving if you want to really be heard, at least in this context.)

The event was led by Brian Soublet, assistant chief counsel, and next to him was Bernard Soriano, the deputy director. I think Mr. Soublet did a very good job of understanding many of the issues and leading the discussion. I am also impressed at the efforts Mr. Soriano has made to engage the online community to participate. Because Sacramento is a trek for most interested parties, the room tends to be dominated by those paid to go, and online engagement is a good way to broaden the input received.

As I wrote in my article on advice to governments, I believe the best course is to have a light hand today while the technology is still in flux. While it isn't easy to write regulations, it's even harder to undo them. There are many problems to be solved, but we should first see whether the engineers who are working day in and day out to solve them can do that job before asking policymakers to force a solution. It's not the role of the government to forbid theoretical risks in advance, but rather to correct demonstrated harms and demonstrated unacceptable risks once it's clear they can't be solved on the ground.

With that in mind, here's some commentary on matters that came up during the session.

How do the police pull over a car?

Well, the law already requires that vehicles pull over when told to by police, and pull to the right when any emergency vehicle is passing. Even with no further action from regulators, car developers will work out ways to notice this -- microphones that recognize the sound of sirens, cameras that can see the flashing lights.
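
For the curious, here's a rough sketch of the audio side of that. This is purely illustrative Python, not any developer's actual code, and every threshold in it is an assumption: the idea is just to look for a strong tone that stays in the typical siren band and keeps sweeping up and down.

```python
import numpy as np

SIREN_BAND = (500.0, 1800.0)  # Hz; rough range of a wailing siren (assumed)
FRAME_SEC = 0.05              # analyze the signal in 50 ms frames

def dominant_tone(frame: np.ndarray, rate: int) -> float:
    """Strongest frequency (Hz) in one windowed audio frame."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / rate)
    return float(freqs[np.argmax(spectrum)])

def looks_like_siren(samples: np.ndarray, rate: int) -> bool:
    """Heuristic: a tone that lives in the siren band and keeps
    sweeping up and down is a candidate siren."""
    n = int(rate * FRAME_SEC)
    tones = [dominant_tone(samples[i:i + n], rate)
             for i in range(0, len(samples) - n + 1, n)]
    in_band = [f for f in tones if SIREN_BAND[0] <= f <= SIREN_BAND[1]]
    if len(tones) < 4 or len(in_band) < 0.8 * len(tones):
        return False
    # A wail sweeps in pitch; a constant hum (a fan, road noise) does not.
    return float(np.mean(np.abs(np.diff(in_band)))) > 20.0  # Hz/frame, assumed
```

A real system would fuse something like this with camera detection of the flashing lights and far more robust classification; the point is only that noticing a siren is a tractable engineering problem.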

Developers might ask for a way to make this problem easier. Perhaps a special sound the police car could make (by holding a smartphone up to their PA microphone, for example). Perhaps the officer just reads the licence plate to dispatch, and dispatch uses an interface provided by the car vendor. Perhaps a radio protocol that can be loaded into an officer's phone. Or something else -- this is not yet the time to solve it.
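
To give a flavor of the dispatch option, here is a hypothetical sketch. The message format, field names and key-provisioning scheme are all invented for illustration; no such protocol exists today.

```python
import hashlib, hmac, json, time

SHARED_KEY = b"agency-vendor-shared-secret"  # assumed pre-provisioned key

def make_pullover_request(plate: str, officer_id: str, reason: str) -> str:
    """Build a signed stop request that a vendor's dispatch interface
    could verify before relaying it to the car."""
    body = {
        "plate": plate,
        "officer": officer_id,
        "reason": reason,
        "issued_at": int(time.time()),  # lets the car reject stale requests
    }
    payload = json.dumps(body, sort_keys=True).encode()
    body["signature"] = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return json.dumps(body)

# The officer reads the plate to dispatch, and dispatch sends something like:
print(make_pullover_request("8ABC123", "unit-4411", "erratic lane keeping"))
```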

It should be noted that this would be an extremely unlikely event. The officer is not going to pull over the car to have a chat. Rather, they would only want the car to stop because it is driving in an unsafe manner and putting people at risk. This is not impossible, but teams will work so hard on testing their cars that the probability that a police officer would be the first to discover a bug which makes the car drive illegally is very, very low. In fact, not to diminish the police or paint the developers as perfect, but the odds are much greater that the officer is in error. Still, the ability should be there.

It is important to reiterate just how different the driving logic of software will be compared to the thinking of human drivers. Human drivers knowingly break the law all the time, and they get sloppy all the time. Software is not perfect, but if it breaks the law, it will be for very different reasons.

One way this could happen is if a person who has summoned their unmanned vehicle to come to them commands the vehicle remotely to speed. A person in a hurry might do that. And they'll get in a world of trouble if caught. It's much more likely that a vehicle carrying a person would be told to speed by its occupant -- in fact, being able to do that is a good thing -- but in this case the occupant is responsible, and the occupant is already compelled by law to make the car pull over if police signal this.

Who gets the ticket?

This question gets asked a lot. I'm just going to come out and say what none of the car companies are willing to say. If a vehicle breaks the law because of a bug, or because it was programmed deliberately to break the law, the developers of the car should have responsibility. Nobody will say this because you never want to go on record saying you should have responsibility. It can only hurt you to say this, never help you. So we'll fight over this but I can't see it going any other way in the long run.

I say in the long run, because in the early days, the early adopters, keen to get robocars, might be quite willing to sign a contract taking all responsibility onto them. They will do this fully informed -- they are willing to take the risk to be early adopters. This even makes sense -- if you want to be the first to try an experimental technology, you do bear responsibility for that decision, even if the cause of the particular bug that caused trouble had nothing to do with you.

This works for a while, but once you move away from early adopters, the ordinary public will not accept a vehicle that, if it makes a mistake, leaves them on the hook for liability, fines and demerit points, or, in extremely rare cases, criminal charges.

So the market will sort this out. Let the early adopters sign a contract taking responsibility. But as the market matures, that will fade away.

Once again, a vehicle doing something for which it would get a ticket should be extremely unlikely. The developers would have tested the vehicle extensively to fix any such issues. If their own testing doesn't find a problem, users of vehicles will notice it and report back to the vendor. It will be very rare for the police to witness such a bug for the first time, but of course not impossible.

How does a vehicle "come to a stop"?

California's statute requires the regulations to govern times when a vehicle must come to a safe stop. And again, car developers will work to make this happen, even if components of their systems fail. Fault tolerance is frequently discussed among developers; it is not something they are unaware of. The question came up of "what if the car fails in a tunnel, with no shoulder?" Chances are the car's map knows where the shoulders are, and so it will do its best not to stop in the tunnel. But if it has to, it has to. If regulation is needed here, it should be road-specific. Let the state say, as it already does on road signs, if there are specific regulations for specific roads. It would be even better if it said so in a database of road-specific regulations.
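
As a toy sketch of what such a lookup might look like (the schema and segment names are invented, and a real map database would be far richer):

```python
from dataclasses import dataclass

@dataclass
class RoadSegment:
    segment_id: str
    has_shoulder: bool
    no_stopping: bool  # e.g. a tunnel or bridge with a posted rule

# Stand-in for a real database of road-specific regulations.
SEGMENTS = {
    "tunnel-3:bore-A": RoadSegment("tunnel-3:bore-A", False, True),
    "I-80:mile-12.3": RoadSegment("I-80:mile-12.3", True, False),
}

def pick_stop_segment(route: list) -> str:
    """Prefer the nearest upcoming segment where stopping is legal and a
    shoulder exists; if none qualifies, stop where you are ("if it has
    to, it has to")."""
    for seg_id in route:
        seg = SEGMENTS.get(seg_id)
        if seg and seg.has_shoulder and not seg.no_stopping:
            return seg_id
    return route[0]

print(pick_stop_segment(["tunnel-3:bore-A", "I-80:mile-12.3"]))  # avoids the tunnel
```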

Nobody wants to stop in a dangerous place or to cause a big traffic jam, except perhaps New Jersey governors.

Driver's test for cars and operators

The topic came up of having a driver's test for the cars. This would mean a developer would show up at the DMV and a tester would put the car through its paces, as new drivers are tested. This is not, on the surface, a bad idea, but it has a few big problems:

  • Cars need not handle all roads and conditions if they will enter service only on limited roads and conditions. So you might need a different test for each car. If the car does not do streets over 40 mph, for example, that's perfectly fine; it can simply refuse to drive them. Human drivers can't do that.
  • You don't want to have to do a new test with every new software revision.
  • The car might not even handle any streets around the DMV, so testers would need to come to where the car operates. For example, a shuttle that drives around the Google campus would not be able to do anything else, and would need a test just for that situation.

More problematic is the idea of a test for operators of the vehicles. Nevada's law requires you to get an endorsement, but that's just some paperwork: you send in $5 and sign to say you understand some basic rules.

Tests for operators present many problems. They would be different for every car, for one thing, until standardization arises a decade down the road. Cars might even differ after a software upgrade. In that case the car will put operators through some training to be sure they know about the differences, but a new test at the DMV is another matter entirely.

Secondly, there is the problem of people who come from out of state. You can't be required to take a new driving test just to pick up a rental car at the airport, or have it pick you up. This is not just impractical, it's probably not even legal. Perversely, for example, while a Nevada resident has to have the endorsement on their licence to operate a robocar in Nevada, I with my California licence do not!

User privacy

John Simpson from Consumer Watchdog expressed a lot of worry over the car allowing the maker (and Google most of all) to track your movements. He wants to be sure people can turn off any reporting of their travels back to Google. While Google has a pretty good history of providing opt-out on location tracking -- they do a lot of location tracking via Android -- it is not the owners of cars who need to be so worried.

In my view, and that of many others, taxi service is the real future of these vehicles, and taxi clients are not owners and will not be able to configure the car's privacy settings unless there's a lot of market pressure to do so. Even if they do, and we work out a means of anonymous payment for taxi services or the erasure of logs, the trips themselves need to be logged to manage the cars, and you can't really disconnect your identity from the logs of a cab that took an "anonymous" passenger from your house to your office in the morning. It's very difficult.

The law does demand that cars record all data for 30 seconds before an accident. The reason for this is clear, but owners, in this case, might have the right to not have their technology be required to betray them. There are black box recorders in many cars these days that do this, but people have the right to disable them. This only applies in autonomous mode, however.
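
The mechanics of that requirement are simple enough. A minimal sketch (rates and structure assumed) is a ring buffer that keeps only the most recent 30 seconds and is flushed to permanent storage only on a collision:

```python
from collections import deque

RATE_HZ = 10                          # assumed sensor snapshot rate
BUFFER = deque(maxlen=30 * RATE_HZ)   # holds roughly 30 seconds of frames

def record_frame(frame: dict) -> None:
    """Called every sensor tick; frames older than 30 s fall off the end."""
    BUFFER.append(frame)

def on_collision() -> list:
    """Freeze and return the last 30 seconds for the mandated record."""
    return list(BUFFER)
```

Note that such a design is also friendlier to privacy than continuous logging: unless there is a crash, the data simply disappears.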

Markings on the car

Some people think robocars should have some light or indicator required by law to show they are in self-driving mode. I think that's a very poor idea. The vehicle is either safe on the road or it isn't, and there is nothing that other drivers should do differently around it if it's safe. I could see this for testing (like a student driver sign) but that would actually interfere with testing, since the goal of testing is to see how the car performs in real world situations, not ones where people are scared of it.

It's a moot question today -- cars with LIDARs on them are quite obvious -- and I suspect they will always be pretty obvious in the future, even if the sensors get smaller.

Slow Operations

A new issue I'm going to start raising is one of slow operations. A simple reality is that the easiest path to safety in these early phases is just to go more slowly. It's not a perfect path, but there are vehicles like the Navia that one is comfortable seeing operate at 20 km/h but not at 40 km/h.

While there has been much talk of whether the cars might exceed the speed limit (they definitely should if there is an occupant in them commanding this) I have not seen much talk about minimum speeds. We don't want to have super-slow vehicles blocking traffic over the long term, but we might want to decide to legally tolerate it in the introductory phase of this technology, to speed up the benefits that come with time.

Owned vehicles vs. taxis

The hearings were almost entirely about cars that are sold to and operated by individual owners. That's what the car companies all imagine, but another school of thought suggests taxi service is the most interesting market. I touched on the privacy issues there, but we're not talking about this nearly as much as we should.

Taxis need to be unmanned to self-deliver, though they can go slowly on more limited roads to do that. The desire for taxi service pushes unmanned operation up the agenda.

The DMV does license taxis and taxi drivers, but 99% of what they do revolves around ordinary car sales to driver-owners, and licensing those folks. That's going to change.

Comments

A few reasons for police to pull over a robot car:
1) A person or animal is being dragged along under the car.
2) There is a hazard further down the road which had not yet been published on Google Waze. E.g. a chemical spill, or zombies.
3) There is a suspected criminal in the car.

1) The car should not be capable of dragging anything; it should have sensors to detect this in the various ways you can detect it, and pull over on its own.

2) The police don't pull over cars for this; they do what's called a rolling slowdown or traffic break or moving roadblock, where they start weaving quickly back and forth over all the lanes with lights on. The car should detect this (it's a pretty obvious pattern, though it confuses even humans when they do it). However, if the car does not notice this, it has a bug, and that would be a case where they should then try to signal the car directly. Though I do think that having live traffic feeds (including information of this form regarding emergency road and lane closures) would be good for both robots and humans. Cops have computers (and even smartphones), so why not have them issue a command to say they are closing some lanes?
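
If such a feed existed, the officer's command need be nothing fancier than a small broadcast message; here's an invented example (no such feed or format exists today):

```python
import json, time

closure = {
    "type": "rolling_slowdown",      # could also be "lane_closure"
    "road": "US-101 N",
    "lanes_closed": [1, 2],          # hypothetical lane numbering
    "start_mile": 411.2,
    "end_mile": 412.0,
    "issued_by": "unit-4411",
    "issued_at": int(time.time()),
    "expires_at": int(time.time()) + 1800,  # self-expires after 30 minutes
}
print(json.dumps(closure, indent=2))
```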

3) If there is somebody in the car, it is their job to notice that the police are attempting to order the car to pull over, and to command the car to do that (or take the wheel.) If they don't, it's a serious offence, but no different than anybody else who refuses to stop for police. Except no high speed chase involved here. Frankly, being in a robocar would be a pretty poor way of trying to escape police!

If you're suggesting the police should be able to force the car to stop against the will of its occupant or owner, I'll fight you pretty hard on that one. Your technology should not betray you. You are innocent until proven guilty.

While I can understand you independent drivers want to maintain a modicum of control over your environment, even if your car is driving for you, those of us who are drunk or elderly or [like me] blind really cannot tell when the cops want to pull us over and will require some kind of autonomous control if a cop wants to stop us for some reason. I can imagine taxis would probably have the same requirement. Keep in mind, we're hoping that, in the future, instead of takeout food you can have the restaurant's car service come take you out to the restaurant (and back home again) no matter your age or infirmity.

Actually, I would presume that 99% of drunks, unless they are at the point of unconsciousness, could hear a siren and respond to it. Even a blind person can figure that out -- you hear sirens whoop-whoop close by and a police bullhorn saying, "Pull over, you in the red/green/blue/yellow Google car!"

However if you have a truly incompetent passenger, the car should know that and act as though it is in unmanned mode.

In addition, even a car with a passenger will still notice the police, and it will signal that passenger that it has noticed the police and ask what to do. It probably would pull over if the occupant does not respond to that query. It has a number of techniques available to it to get your attention, including sound, lights and tapping the brakes to jerk you into alertness.

Don't Police force cars to stop all the time today? Why would you want to refuse to stop when a legal authority asks you to?
You expect that the occupant notices this order to pull over and responds. But you want autonomous cars where there is no driver in control (in fact no occupants present at all at times) and the occupant is busy doing things other than driving. Now you want them to be attentive to legal authority directives...but the vehicle not to be.

In your piece http://www.templetons.com/brad/robocars/government.html you suggested that robocars should be allowed to break the law:

1. Should be allowed to double park even in front of hydrants
2. Speed, since they should be able to keep up with others breaking the law.
3. Do rolling stops at stop signs.
4. Pull briefly into oncoming lanes (crossing double Yellows??)
5. Act aggressively where necessary to ensure they get through.
There appears to be ample opportunity for the Police to pull a robocar over if using this rule set. IMO they should never break the law.

No, robocars should not be allowed to break the law. What I recommend in that article is that we recognize the problem that in many regions, driving almost always involves breaking the law, and that vehicles which tightly adhere to the law are effectively crippled, and that the law be modified to recognize reality, and at the very least allow vehicles that meet certain standards to act under the de facto law -- making it the de jure law for them.

However, if a person in a robocar tells the car to drive above the speed limit, that would still be breaking the law, but it would be the person who commanded this who bore full responsibility.

If a vehicle has a person in it, there is no issue about the vehicle pulling over for police. That is the person's responsibility (unless they are not considered competent). The only issue is for unmanned vehicles (or those carrying a non-competent person). As such, vehicles in that state will surely support a mechanism whereby they can recognize that an officer is ordering them to stop, and they will stop, or get in serious trouble.

I will also note that the various de facto principles apply in two different situations. An unmanned vehicle probably doesn't have to act aggressively, and shouldn't unless we get a situation where it's clearly impossible to drive any other way. And unmanned vehicles have little need to speed, though we might want them to if going the limit impedes traffic, as it will in many situations. On the other hand, only unmanned vehicles would need to stand in front of hydrants, or to allow themselves to be passed on the right using oncoming lanes.
