Cruise getting a ticket is a very positive story

Lost in all my coverage of the Uber event is a much more positive story from San Francisco, where police issued a ticket to the safety driver of a Cruise test vehicle for getting too close to a pedestrian.

We don't have all the details on this, but based on Cruise's statements -- that the ticket was issued in error, that the car never came closer than 10.8 feet to the pedestrian, and that it correctly yielded the right-of-way -- the incident appears to have involved a Cruise car going through a crosswalk that was also in use by a pedestrian.

The law requires cars to yield the right-of-way to pedestrians in crosswalks, and Cruise claims its cars do this, and did so here. What a pedestrian's right-of-way actually covers, however, is not clearly defined. In some places, the pedestrian effectively "owns" the whole crosswalk in front of them. In other places, their ROW covers only half the road, or a smaller buffer zone around them. We have all been there -- you want to turn right on a green, and there are pedestrians in the crosswalk. Often, you will turn through the crosswalk when the pedestrians are coming from the other side, knowing you'll be long gone by the time they reach the lane you are turning through. Or you wait for the pedestrians to pass and turn, with caution, as soon as they have cleared the lane you want.
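
To make the buffer-zone idea concrete, here is a minimal sketch of what such a yield rule might look like. This is purely hypothetical -- the function, its parameters, and the 10-foot threshold are my own illustrative assumptions, not anything Cruise has published:

```python
import math

# Hypothetical sketch of a buffer-zone yield rule. All names and
# thresholds here are illustrative assumptions, not Cruise's actual logic.

def must_yield(ped_distance_ft: float, ped_approaching: bool,
               buffer_ft: float = 10.0) -> bool:
    """Decide whether the car must wait before crossing the crosswalk.

    ped_distance_ft -- distance from the pedestrian to the lane the car
                       will cross, measured along the crosswalk
    ped_approaching -- True if the pedestrian is walking toward that lane
    buffer_ft       -- jurisdiction-dependent buffer zone size
    """
    if not ped_approaching:
        # The pedestrian has already cleared the lane and is walking away.
        return False
    return ped_distance_ft <= buffer_ft

# The strict reading, "the pedestrian owns the whole crosswalk," is the
# special case buffer_ft = math.inf; a "half the road" rule would set
# buffer_ft to half the road width.
STRICT_BUFFER_FT = math.inf
```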

This is a good thing, and in busy cities it is important for the flow of traffic. The extreme rule -- nobody can turn if a pedestrian is anywhere in or approaching the crosswalk -- would make turns on green lights next to impossible in busy cities.

However, much more important than the details is the lesson this teaches about how we develop and regulate robocars. There will be disagreements between regulators, police and developers about just what the law means. There may even be mistakes. And those situations will result in police issuing a ticket, just as they do to a human who makes the same mistake or has the same disagreement.

What's different, however, is that this one incident will be examined in detail by the team and by the legal system. The team will do one of two things:

  1. Agree (or concede) that they violated the law, and modify their system so that no car of theirs does that again
  2. Contest the ticket. If they lose, do #1; if they win, educate the police on the fact that the system is compliant.

That part's not like humans. The fact that one person gets a ticket doesn't stop other people from doing the same thing. The law is not perfect, and programs are not perfect, but we get to iterate both closer to perfection. Of course, all of this presumes that Cruise's actions were still safe. Certainly, if the car never got within 10 feet of the pedestrian at low speed, with a safety driver monitoring, it appears to have been safe.

While safe, it is possible they might be frightening. We're not ready for robots to shave away all but a few inches of our right-of-way because of their excellent confidence in the physics of a situation. In theory, a well programmed robocar could probably turn in front of you and miss you by a couple of feet without risk, but you would not feel comfortable with it, not for now.
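
As a rough illustration of that gap (with made-up numbers, not measured human-comfort data), a car's planner effectively has two clearance thresholds, and public acceptance demands it honor the larger one:

```python
# Illustrative only: both margins are assumptions for the sake of the
# example, not published safety or comfort standards.

PHYSICAL_MARGIN_FT = 2.0   # what the physics might allow without contact
COMFORT_MARGIN_FT = 10.0   # what people will actually tolerate, for now

def clearance_acceptable(predicted_miss_ft: float) -> bool:
    # A planner confident in its physics might accept PHYSICAL_MARGIN_FT,
    # but until riders and pedestrians trust robocars, it should plan
    # against the larger comfort margin.
    return predicted_miss_ft >= max(PHYSICAL_MARGIN_FT, COMFORT_MARGIN_FT)
```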

Being too strict with the robocars

The other interesting issue this brings up is the perennial one caused by the fact that human drivers routinely break the rules, and we have to decide what to do about robots doing that. Speeding is the most obvious example -- everybody speeds, and a car that keeps to the limit becomes an impediment to traffic.

Here, even if the rules say that cars must never turn if any pedestrian is present, even one you would never hit, we know that humans grab the chance to turn whenever they can, and we know that's important for keeping the roads flowing. We even see cars slowly entering crosswalks right in the middle of pedestrian flow, and the pedestrians just swarm around them, mostly to the rear as the car creeps forward. It's done at such a slow speed that it's not a bad thing, and it's better than a strict interpretation of the law which would block the road for long periods.

The problem is, companies like GM/Cruise don't want to program their cars to do something illegal, even though everybody does it to the point of it being necessary. We will need to adapt our laws to this.

I would also guess that nerves are on edge when it comes to robocars and pedestrians this week, and that police are more likely to ticket something.

Who got the ticket?

In most states that have legalized robocar testing, the rules were written to say that when the vehicle code refers to "the driver," it means the person who activated the self-drive system -- in particular, the safety driver when one is present. In California, for now, one is always present.

Fox News made the silly report that a "passenger" got the ticket. That's certainly not the case, nor will it be once Cruise finally offers commercial service to passengers.

That means the ticket would go to that safety driver, though of course Cruise will pay the fine. It does seem a bit unfair that the record of the ticket, including any "points," goes to that safety driver. We presume Cruise has some plan in its safety driver program to compensate drivers for any extra insurance costs and the like. Since most companies only use safety drivers with zero points, there is little risk of a safety driver losing their license over the points.

In the long run, the "ticket" system was really designed for humans and isn't the best match for regulating robocars. A better system would be:

  1. Determine from the logs whether the vehicle really did violate the vehicle code. (In some cases, consider changing that code instead.)
  2. The team fixes it in a reasonable time.
  3. The team demonstrates or certifies that it has fixed it.

You could attach financial penalties, but frankly they are unlikely to be a real barrier to the large company teams. Certainly fines at the level of human traffic tickets aren't. The real deterrent would be the threat of losing the right to test or operate if the above procedure is not followed.

Revoking the right to operate would be a very powerful tool. You would only need it in the case of a clear danger to the public, which is not the case here. Once a service is in operation, shutting it down could cause chaos in the city, which is only justified if the risk is too high. We certainly don't stop all human drivers once we learn that humans sometimes cut people off, even though that's a dangerous thing.

Another great attribute of robocars is that there will never really be more than a few score teams developing them. If there are issues, you don't need to pass laws. You can literally get all the teams together in a room, or a virtual room, on very short notice. You can discuss the issues, come up with solutions, and implement them. The old methods of the law are a much less effective answer than this; they should remain only as a hammer to use if simply meeting about it doesn't work.
