Driving without a map is another example of being cheap rather than being safe
Submitted by brad on Tue, 2018-05-08 13:40
There was a lot of buzz yesterday about a publication from a team at MIT CSAIL on their research into driving without a map.

The newsletter "The Information" has reported a leak from Uber about their fatal accident. You can read the article but it is behind a very expensive paywall. The relevant quote:
A crash today involving a Waymo van is getting attention because it came in the same area just a short time after the Uber fatality, but Waymo will not be assigned fault -- the driver of the car that hit the Waymo van veered out of his lane into oncoming traffic because of somebody else who was encroaching on the intersection. There were only minor injuries, but the impact energy was higher than in prior Waymo crashes.
As teams around the world attempt to build safe robocar systems, one key asset has stood out as a big differentiator -- experience. For a company to be willing to certify its vehicle as safe, it needs experience with all the strange circumstances it might encounter driving the roads.

The primary purpose of the city is transportation. Sure, we share infrastructure like sewers and power lines, but the real reason we live in dense cities is so we can have a short travel time to the things in our lives, be they jobs, friends, shopping or anything else.
Sometimes that trip is a walking one, and indeed only the dense city allows walking trips to be short and also interesting. The rest of the trips involve some technology, from the bicycle to the car to the train. All that is about to change.
The NHTSA/SAE "levels" of robocars are not just incorrect. I now believe they are contributing to an attitude toward "level 2" autopilots that plays a small but real role in the recent Tesla fatalities.
Last week, buried in the news of the Uber fatality, a Tesla Model X crashed fatally, plowing into the ramp divider on the flyover carpool exit from Highway 101 to Highway 85 in the heart of Silicon Valley. It was literally just a few hundred feet from Microsoft and Google buildings, close to many other SV companies, and just a few miles from Tesla HQ. I take this ramp frequently, as does almost everybody else in the valley. The driver was an Apple programmer on his way to work.
How does a robocar see and avoid hitting a pedestrian? There are a lot of different ways. Some are very common, some are used only by certain teams. To understand what the Uber car was supposed to do, it can help to look at them. I write this without specific knowledge of what techniques Uber uses.
In particular, I want to examine what could go wrong at any of these points, and what is not likely to go wrong.
The usual pipeline looks something like this:
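In very rough, toy form (the class names, stages and thresholds below are my own illustrative assumptions, not Uber's or any other team's actual software), you can think of the stages as code like this:

```python
# Toy sketch of the pipeline stages: detect -> classify -> predict -> plan
# (reduced here to a single brake decision). All names and numbers are
# illustrative assumptions, not any real system.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Obstacle:
    kind: str                      # classifier label, e.g. "pedestrian"
    position: Tuple[float, float]  # (x, y) in metres; car at origin, x forward
    velocity: Tuple[float, float]  # estimated (vx, vy) in m/s

def detect(returns: List[Tuple[float, float]]) -> List[Obstacle]:
    """Stages 1-2: segment sensor returns and classify them (stubbed).
    A real system fuses lidar, radar and cameras; here every return is
    assumed to be a pedestrian walking toward the car's lane."""
    return [Obstacle("pedestrian", p, (0.0, -1.4)) for p in returns]

def predict(obs: Obstacle, horizon: float, dt: float) -> List[Tuple[float, float]]:
    """Stage 3: extrapolate a constant-velocity path over the horizon."""
    steps = int(horizon / dt)
    return [(obs.position[0] + obs.velocity[0] * i * dt,
             obs.position[1] + obs.velocity[1] * i * dt)
            for i in range(steps + 1)]

def must_brake(paths: List[List[Tuple[float, float]]],
               car_speed: float, dt: float,
               lane_halfwidth: float = 1.8) -> bool:
    """Stage 4: decide to brake if any predicted path crosses the car's
    own straight-ahead trajectory at the same time step."""
    for path in paths:
        for i, (px, py) in enumerate(path):
            car_x = car_speed * i * dt        # where the car will be then
            if abs(px - car_x) < 2.0 and abs(py) < lane_halfwidth:
                return True
    return False

# One return 30 m ahead and 5 m to the left, crossing toward the lane.
obstacles = detect([(30.0, 5.0)])
paths = [predict(o, horizon=4.0, dt=0.1) for o in obstacles]
print(must_brake(paths, car_speed=12.0, dt=0.1))  # True -- must slow or stop
```

A real stack replaces each of these stubs with sensor fusion, machine-learned classifiers and motion models, but the stage boundaries stay roughly the same, and a failure at any stage can propagate to the final decision.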
Lost in all my coverage of the Uber event is a much more positive story from San Francisco, where police issued a ticket to the safety driver of a Cruise test vehicle for getting too close to a pedestrian.
Uber has reached an undisclosed settlement with the victim's husband and daughter in the fatal incident. This matches my prediction of Uber's likely best course of action, since it will shut down much of the public discussion and avoid dragging all sorts of details out into the open in a lengthy trial. The settlement comes with an agreement to stay silent, as you might expect.
Yesterday we saw the state of Arizona kick Uber's robocar program out of the state. Arizona worked hard to provide very light regulation and attracted many teams to the state, but now it has an understandable fear of political blowback. Here I discuss what the government might do about this, and what standards the courts, the public or the government might demand.

The governor of Arizona has told Uber to "get an Uber" and stop testing in the state. With no instructions on how to come back.

Unlike the early positive statements from Tempe police, this letter is harsh and to the point. It's even more bad news for Uber, and the bad news is not over. Uber has not released any log data that makes them look better; the longer they take to do that, the more it seems the data don't tell a good story for them.
In the wake of the Uber fatality, I'm seeing lots of questions. Let's consider the issues of crosswalks and interventions by safety drivers.

Crosswalks actually are important to robocars, in spite of the fact that a robocar should still stop for a pedestrian outside of a crosswalk.
Today I'm going to examine how you attain safety in a robocar, and outline a contradiction in the things that went wrong for Uber and their victim. Each thing that went wrong is both important and worthy of discussion, but at the same time unimportant. Almost everything that went wrong is something we want to prevent going wrong, but it's also something we must expect will go wrong sometimes, and plan for.

Major Update: Release of the NTSB full report includes several damning new findings
Update: Analysis of why most of what went wrong is both terrible but also expected.
Update: More information in following posts, particularly impressions of serious possible errors by Uber.
Update: Did the woman cross 3.5 lanes of road before being hit?
It's just been reported that one of Uber's test self-driving cars struck a woman in Tempe, Arizona during the night. She died in the hospital. There are not a lot of facts at present, so any of these things might be contradicted later.
One of the biggest milestones of the robocar world has gotten just a little coverage. Waymo, which last year removed the safety driver from behind the wheel of their cars in Phoenix, still had a supervisor sitting in the back with a kill switch. That supervisor is now gone and the car comes to pick up passengers entirely unmanned.
In the world of flying cars, another big step was taken with the partial unveiling of the Kitty Hawk Cora. Kitty Hawk is a project involving some friends of mine who made the Google car project happen, and while it's very nascent it could have some big effects.
