We need a world where open source robocars are possible

An early student robocar at Stanford

We all love open source. But the usual rules of open source break down if every vehicle deployed on the road has to have gone through a complex and expensive safety certification process. You can't just download, patch and go.

So we need other solutions that preserve the world of the tinkerer/hacker, and the innovation and superior function it can provide.

That's the subject of my new Forbes.com article:

Tinkering is essential for Robocars, but how do we get open source?

Comments

There is a category of aircraft "Experimental - Amateur Built" that closely parallels your idea of a tinkerer self-driving vehicle. There are a set of standards, inspections, and flight test requirements. It could be a good model for a process of approving an amateur-built vehicle.

Oh, for sure there will be experimental cars. But with safety drivers. We don't yet know how regulators are going to treat an experimental car running unmanned on the road. There is a non-tiny (though not majority) contingent that doesn't like the cars on the road today even with safety drivers. Will they let a garage tinkerer put a custom modified software stack out on the road unmanned? I fear not.

If the source code is released under a free license, it's open source. No regulation is going to stop that, as releasing the source code is free speech.

If regulations require certification before the source code can actually be deployed, that would make the source code somewhat less useful, but not completely useless. After all, most end-users don't hack on their source code themselves. They instead rely on others to work on the code.

Businesses that either resell cars with modified code or make modifications to their cars before using them for hire will just go through the certification process. That is, if there is one. It's not completely clear that there even should be one.

It's especially unclear that regulations will require certification before modifications can be made to the source code by end-users. How would that be enforced? If you're not reselling the car or using it for hire, I doubt it will be.

Are these going to be state laws or federal laws? If federal, the chances that the rules are going to be enforced on non-businesses (or even small businesses) are very low.

I suppose there is one example of an area where tinkering is somewhat forbidden: FCC regulations. But even then, if you're not actually interfering with others, it's not heavily enforced.

Another example would be drones. There seem to be several open source drone projects out there.

Regulations are sparse today and nobody has commercial unmanned operations. It is the prediction of many, including myself, that for a car to be licensed to operate unmanned on the roads -- and it will need such a license, probably from the state -- it will need to both meet federal regulations and have some sort of certification of its safety. As I write, that may well be self-certification, which large companies will be able to do fairly easily. It will be harder for small teams and individuals. Or so I predict -- it has not yet taken place.

Even drones (and rogue radios) are not as scary to people as cars.

I don't know. The status quo has a lot of momentum, especially at the federal level (where it's hard to get Congress to pass much of anything).

I definitely foresee regulations on cars that are resold, and for cars that are operated by massive robotaxi companies. I don't know about cars that are operated by individuals.

On the other hand, tinkering with a safety-critical portion of your car's self-driving software as an individual, and not testing it somehow, is probably a bad idea whether we have new regulations or not.

What if there were a way to test your code in a simulator? Write the code, upload it somewhere, get them to run it through a simulator, and they could sign the code and send it back to you. Not only would that improve safety, it'd be a way for the maintainer of the open source software to encourage people to share their improvements.
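The sign-and-return workflow described above could look something like this minimal Python sketch. The service, the key handling, and all the function names are assumptions for illustration only; a real system would use asymmetric signatures so the car holds only a public verification key, never the signing key itself:

```python
import hashlib
import hmac

# Hypothetical secret held by the simulator service (an assumption for this
# sketch; in practice the service would sign with a private key instead).
SIMULATOR_KEY = b"simulator-service-secret"

def sign_if_passing(code_blob: bytes, passed_simulation: bool):
    """The simulator service signs the exact bytes it tested, or refuses."""
    if not passed_simulation:
        return None
    return hmac.new(SIMULATOR_KEY, code_blob, hashlib.sha256).digest()

def car_accepts(code_blob: bytes, signature: bytes) -> bool:
    """The car only loads code whose signature matches the tested bytes."""
    expected = hmac.new(SIMULATOR_KEY, code_blob, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

build = b"my patched planner module"
sig = sign_if_passing(build, passed_simulation=True)
print(car_accepts(build, sig))             # the signed build loads
print(car_accepts(b"untested edit", sig))  # any modified build does not
```

The point of the signature is that it binds approval to the exact bytes that went through the simulator: change one line after the test and the car refuses to load it, which is also what gives the upstream maintainer leverage to encourage contributions back.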

The problem isn't really the regulations. Regulations or not, this isn't something many people will want to tinker with on their own.

I am not sure it would be that few. Look how many run FOSS on their computers and phones. It's many millions, and there are actually millions who code and make private mods. Cars are a very popular thing to tinker with, and the aftermarket for customizing cars is very, very large. Put the two together.

Yes, I propose the simulator approach in the article, and it's why I suggest that most people would base their code off an existing FOSS car stack that has been certified, and fairly standard hardware. That makes it easy to run your mods through the existing regression tests and simulation scenarios. Much harder to do that with a radically different car.

How many would tinker depends on how safe it is. If it's safe, yeah, a lot of people will do it.

It also depends on what counts as tinkering. Using an API, definitely. Writing plugins, less, but still a lot. (Note that these first two don't require the software to be open source.) Editing security-critical code and recompiling? A lot less. And that's only if it's safe. The vast majority of people aren't going to do it if there's a significant risk that they're going to kill someone.

You say the risk is greater than that covered by insurance today, but I don't think it is. We let people do their own mechanic work today. You could easily kill someone by screwing something up while replacing the brake lines or even something as common as changing the tires. Yes, it's harder to write code that doesn't screw something up, but if you run the code through a very good simulator and through some very good code analysis software, maybe screwups can be as rare as the 500 people a year who die due to unsafe tire conditions (2,600 deaths a year due to "car neglect").

As I note, this is one of the potential solutions. Still, based on my read of public reaction to robocars, I don't think people will like experimental prototypes built by garage hackers going out just because they got an insurance policy. They are more scared of robots than crazy drivers, whether that is rational or not. The problem though is we are quite some time from insurance companies knowing how to measure the risk. If "You passed the simulator test" were enough, then nobody would need safety drivers. Perhaps in the future we won't.

Personal note: Since you're commenting on this blog a fair bit, let me know if you would like to have an account on it. You don't need to reveal your name. If you offer a suitable channel to send it, I can send you a password. An account offers the ability to write in Markdown (with links and lists etc.) and the ability to edit your old comments, among other things.

I don't think people will like experimental prototypes built by garage hackers either. I'd argue that people who do that without adequate testing are already breaking the law, though.

People who recklessly let robots loose on the streets are already breaking the law. Imagine if instead of Uber killing a jaywalker it was some garage hacker who killed someone in a crosswalk with an unmanned vehicle. You think that garage hacker wouldn't be in jail right now?

"You passed the simulator test" isn't enough because the simulators aren't good enough, and until the simulators are good enough the robocar isn't good enough, no matter how much testing you do on the roads. Building a good simulator is an important part of the process of building a good robocar.

As far as the need for safety drivers, I've argued with you before about my position on safety drivers. Safety drivers should be used when autonomous driving with a safety driver is safer than having the safety driver drive the car.

You have to create two cars, really. Yes, you have to create a self-driving car. But you also have to create a car that you can use to test your self-driving car software. That latter car doesn't have to be a self-driving car. I'd argue it probably shouldn't be a self-driving car. For instance, it should alert the driver to take over when it encounters a situation that hasn't been adequately covered by the simulation software. That's not something that a self-driving car is going to do. Your idea of a "minder" is a good one, but "drive the car to a safe state" can be replaced with "alert the safety driver to take over" until the minder itself is good enough to drive the car to a safe state.

Essentially, the "minder" is an ASDAS - an advanced safety driver assistance system.

I am presuming that tinkerers trying to build open source cars (and distribute them to those who wish to use them) want to be able to make cars that can operate unmanned, or allow sleeping passengers. The vehicle must be able to get to a safe state on its own.

Definitely. My comments about that were more about what people working on a product for distribution right now should do. It's what Uber should be doing. Maybe it's what they are doing, though it wasn't what they were doing a year ago.

Note how Brad ensures that his midsection is behind the car door for the photograph. Obese people tend to think that hiding in this manner prevents the 'secret' from getting out. This way, of course, lies madness.

It would be easier to lose weight.

Thanks for your erudite contributions.

You're welcome. I am happy to make additional contributions in the future.
