Here's how robocars might drive in the most chaotic of cities
Submitted by brad on Thu, 2019-06-06 11:29
How will robocars drive around the chaotic cities of the world -- not Phoenix, but Boston, Rome or Delhi? Here are some ideas. The most radical is to keep track of the drivers who constantly cut people off or drive like you're not there, expecting you to hit the brakes -- and then, very rarely and at random, when it's the jerk's fault, let a gentle but expensive accident or near-impact happen. I suspect such drivers will quickly learn to give other cars more respect, and to cooperate rather than defect on the roads.
Those ideas and more are spelled out in my Forbes site article:
Here's how robocars might drive in the most chaotic of cities
Comments
FKA
Thu, 2019-06-06 17:39
Artificial Road Rage
"In some places, there are laws that put a duty on drivers to avoid an accident even when they are in the right."
Some places? I'd challenge you to find a single place where that is not the law. Maybe there are some, but I doubt there are any based on English common law. How is intentionally causing a car crash not battery and trespass to chattel? Maybe there's a car equivalent to the stand your ground doctrine?
Ever hear of the Last Clear Chance Doctrine?
This goes to the reason why something like RSS is doomed to fail. The law isn't just statutes, and it's not black and white.
The way to drive in chaotic cities is not to intentionally get into car crashes. It will involve sometimes cutting people off (assuming they'll yield) and sometimes unintentionally getting into car crashes (when they don't), though. (Like the Waymo crash with the bus.)
brad
Thu, 2019-06-06 19:04
Not wanting to get in them
You don't want to get in them. I am talking about crashes caused by uncooperative behaviour of rude drivers. You have right of way, the law requires them to yield, they don't yield and they hit you. They don't yield because they figure you will brake hard and let them in, and people generally do that.
But are they required to brake like that?
The closer situation involves them forcing their way in front of you and you hitting them. They will generally be found legally at fault. You apply some braking to reduce the severity of the impact, but you don't apply all that you could, and have a minor impact.
I think it's an open question how police and courts might view this. For humans, it's easy, because humans aren't so calculating, and humans don't want the hassle of an accident even if the other guy's going to pay for it all. Humans have somewhere to be and don't want any damage to their vehicle.
Fleets have different motives.
But more to the point, this is a good thing socially. Discouraging defecting behaviour on the road is in the public interest. I could easily see the rules being amended, or their interpretation being modified, to make this happen. Right now society would love to punish the greedy, non-cooperating drivers but has no way to do so. I think it might very much like the chance.
FKA
Thu, 2019-06-06 19:39
Not good and not legal
I don't think you're right about the law.
And I also don't think you're right that it's a good thing socially.
But maybe I'm unclear on what you're suggesting. The scenario is that you take an action knowing with substantial certainty that it will cause a crash, despite being able to avoid it without causing harm, right?
That's illegal in any jurisdiction I can think of. In fact, it's criminal here in the US. We don't allow vigilante justice. And, in my opinion, vigilante justice is not a good thing socially.
brad
Thu, 2019-06-06 21:03
Without harm
The reason for the social question is the term "without harm." If robocars always yield, even when the right of way is theirs, that creates a world where antisocial drivers -- and possibly all drivers -- treat them like they are not there, slowing traffic not just for the users of the robocars, but everybody on the road.
Today, if a person drives like you are not there or have no right of way, they tend to get away with it -- because people brake. But if that person hits you, we blame them, not you, at least if there is dashcam video to show what happened. We do that because nobody can truly declare whether you could have stopped. In fact, chances are that if you had seen them in time, you would have stopped, because everybody prefers a brake jab to an accident, even one they won't pay for.
Now robots come along. In the same situation, we can know whether they could have stopped or not. It's in their logs. So what should be the result? Demand they stop, and let the antisocial drivers disrupt the road? Or give them the same benefit of the doubt given to humans, so that if one time in 200 they don't stop, it's on the violator, not on them?
Is that not a good thing for society, that the antisocial driver realizes they can't depend on the robot to always yield? With humans, the antisocial driver takes some risk because humans do make mistakes, so they must give other humans more margin to get away with this. With robots, they can leave no margin at all -- the robot will stop if physics and its sensor latency permit it.
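As a rough sketch of that physics check, here is what "could it have stopped" looks like in the logs. The latency and braking numbers are illustrative assumptions, not real vehicle specs:

```python
# Minimal sketch: could the robot physically have stopped? Numbers are assumed.

def can_stop(speed_mps, gap_m, latency_s=0.15, max_decel_mps2=8.0):
    """Return True if a vehicle at speed_mps can stop within gap_m,
    given sensor/actuation latency and maximum braking deceleration."""
    distance_during_latency = speed_mps * latency_s           # full speed while reacting
    braking_distance = speed_mps ** 2 / (2 * max_decel_mps2)  # v^2 / (2a) kinematics
    return distance_during_latency + braking_distance <= gap_m

# Example: 13.4 m/s (~30 mph), a cut-in leaves a 15 m gap.
print(can_stop(13.4, 15.0))  # True: ~2.0 m latency travel + ~11.2 m braking = ~13.2 m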
I outline another option in the article -- just recording it and getting the police to deal with the bad offenders. That's preferable, I think, but others have problems with that.
Let's consider a special case. A robot is driving, followed too closely by a human driver, as is usually the case. Somebody cuts in front of the robot. The robot could stop, but can't predict whether the human behind it will stop in time if the robot brakes so abruptly. They might. They might not. So the robot decides to stop less quickly, and gently hit the guy who cut in.
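Here is a toy sketch of the trade-off the robot is weighing. All the probabilities and harm weights are invented for illustration; the point is only that the expected-harm minimum can sit at partial braking:

```python
# Toy trade-off: harder braking avoids hitting the cutter but raises the
# chance the tailgater rear-ends you. All numbers are made up.

def expected_harm(brake_level, p_rear_end_at, harm_front=1.0, harm_rear=3.0):
    """brake_level in [0,1]; p_rear_end_at maps brake level to the estimated
    probability the following car fails to stop. Lower is better."""
    p_hit_front = max(0.0, 1.0 - brake_level)  # softer braking -> likelier to tap the cutter
    p_hit_rear = p_rear_end_at(brake_level)    # harder braking -> likelier rear-end
    return p_hit_front * harm_front + p_hit_rear * harm_rear

p_rear = lambda b: 0.5 * b ** 2   # assume rear-end risk grows quadratically with braking
levels = [i / 10 for i in range(11)]
best = min(levels, key=lambda b: expected_harm(b, p_rear))
print(best)  # 0.3: with these assumptions, partial braking minimizes expected harm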
The law here is a bit murky to me. The guy who cuts in has fault. The guy following too closely has fault. I don't know how the law would judge them. But the robot that tried to make the best of it? Should we punish that company?
FKA
Fri, 2019-06-07 06:28
Without immediate harm
By "without harm" I meant without immediate harm. Necessity is a defense to battery. But "I'm trying to make the world a better place in the long run" is not an example of necessity. I also think it would be awful if we allowed it to be one. The idea of intentionally harming someone because s/he cut you off is abhorrent to me. We should not allow that.
Yes, if it were a human driver they'd just say "oops, I couldn't stop in time," and they'd get away with it. Because things are done with a computer, the programmers probably won't be able to get away with that.
On the other hand, are there really that many situations where this is even going to happen? You seem to agree that the robocar needs to be empty, and presumably another criterion would be that there are no passengers in the other car. Otherwise you are definitely harming innocent people here. Moreover, shouldn't there be a certain level of certainty that you were cut off intentionally? Or is it acceptable to enact this vigilante retribution even upon people who simply make mistakes? And how bad would the "cutting someone off" need to be?
It's pretty much a necessity to cut people off sometimes. That is the norm in many situations. The correct way to drive in a place like New York City is not to get angry or seek retribution when people cut you off; it's to simply cut other people off. It's a necessity; otherwise you never get to your destination. Once someone cuts you off, unless you're a psychopath or something (I suppose road rage is not limited to psychopaths), you accept it and move on. The trick is to establish a better position before you're in the situation where your only two choices are to slam on the brakes or crash into someone.
There is indeed a problem if robocars always yield, but yielding and causing a crash aren't the only two possibilities. Waymo learned this early on in four-way-stop scenarios, and is learning it again in unprotected-left-turn situations. You don't crash into someone who cuts you off at a four-way stop. You do inch out into the intersection to assert that you're going to take the right of way -- in the unprotected-left-turn situation, sometimes even when you don't actually have the legal right of way, for instance when traffic is jammed and not forcing your way through would make you sit there for a really long time. I have to do that frequently in my morning commute. I also frequently have to yield to cars going the opposite way in my turning lane; maybe I should just crash into them instead.
I'm not sure there's going to be a big problem with robocars stopping (or more likely slowing) to avoid a crash. When one person cuts you off, there's very little delay from it. Pedestrians might cause more delay, by crossing the road in front of robocars, but 1) I don't think you're suggesting hitting them when they do this; and 2) maybe pedestrians should have the right of way in most of these situations anyway. And a chain of cars all cutting you off would cause significant delay, but the way to deal with that is to assert your right of way by slowly pulling into the intersection. That might cause a crash that's not your fault, but it probably won't.
It's interesting that you call people who cut other people off "antisocial" (even though it's something that almost everyone does sometimes) while proposing something way more antisocial - intentionally causing a crash just because someone cut you off.
"Just recording it and getting the police to deal with the bad offenders" might be an acceptable solution if it truly were a problem. If things really do get to the point where there are serious problems from people ignoring robocars, significant fines for doing so coupled with at least some enforcement might be needed. That'd have its problems, but it'd be a lot better than dealing with antisocial behavior by letting robocops act as judge, jury, and executioner.
Your special case is a good example of why the law isn't black and white and the RSS idea is doomed to failure. It's also an example, I'd say, of a trolley problem (trolley problem situations happen frequently if you relax the definition to include harm less than death and certainty of harm less than 100%). I definitely wouldn't suggest punishing a company for making a reasonable decision in that sort of situation. Whether or not they should be at all liable is, I think, a pretty grey area, especially without more details. And there are lots of details. Are there innocent passengers in the car ahead? Are there innocent passengers in the robocar? (Is one of the "innocent" people a fat man, or a fat villain?) I think the safest thing to do from a liability standpoint is to slam on the brakes as hard as possible. Anything else opens you up to liability, even though it might be the right thing to do. You might escape the liability by arguing something like necessity or defense of others. You might not.
But in a situation where you can't argue any defenses like necessity or defense of others, not slamming on the brakes is an intentional tort.
brad
Fri, 2019-06-07 13:31
If not this, what?
It is a radical solution. The problem, though, is radical too. Once you know a car will do everything inhumanly possible to get out of your way, it's going to be too tempting for some people to drive very aggressively around them. Want to squeeze in to the next lane in front of a robocar? Just do it, they will move back. Want to merge into dense traffic? Just wait for the robocar to come along and quickly cut in front of it. Stuff you would never do with a human.
Same for pedestrians. Want to cross the street full of robocars? Just go. They will reliably stop for you, just as you stop an elevator by sticking your arms in the doors.
There needs to be something to counter that. So I talk about the police dealing with it, and that may be the answer (and has to be for pedestrians, of course), but there might be another option for the drivers.
What Waymo does at the 4 way stop works today. It won't work in the future if drivers learn, "If you come to a 4 way stop, even a few seconds after a Waymo, just go ahead of them. They won't fight you."
I predict a problem. Maybe it will be resolved. In Indonesia, if you cut somebody off, YOU have the right of way. If they rear end you it's their fault. I think that's crazy and dangerous and I think it causes congestion, but it sort of works. I am not sure we want that, though.
I don't expect the cars to know how many passengers are in cars. The driver you can always see for obvious reasons, but there can be children in cars you can't see. I don't expect people to do trolley calculations based on passengers. They may be able to tell if a car has a driver in the driver's seat or not, and they will also be able to identify models of cars that never have a driver.
FKA
Sat, 2019-06-08 04:53
Maybe nothing at all
I think it'll only be a minor inconvenience. And the benefits (particularly for pedestrians) might even outweigh that minor inconvenience. But as I said before, there's a big difference between doing everything inhumanly possible to get out of your way, and not intentionally crashing into you for no reason other than to teach people a lesson. The former isn't necessary. The latter, in my opinion, is.
If it becomes a major inconvenience, or worse, a safety hazard (for instance because of rear-end collisions when a robocar slams on its brakes), boosting legal penalties and enforcement should take care of the problem. Undercover police can ride in robocars. If 1/1000 robocars are manned by cops, and the penalty for one of the problem behaviors is $10,000 (with the car being impounded until the fine is paid), that's going to be as much of a deterrent as a 1/1000 chance of getting into a fender-bender.
And now that I think about it, you can be even more efficient than that. Robocars can make reports (with license plate numbers) to police when something like this happens, and even if those reports aren't acted on directly (prosecution might be tough without an eyewitness, especially without a good identification of the driver), police can use the information to do more targeted enforcement and catch repeat offenders in the act. Maybe the police (since they can look up the car owner's name and address) could even send a letter, with a picture, to a first-time offender, explaining why this type of behavior is so problematic and that it's illegal. The car company sending the evidence could pay the cost of this education effort.
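The back-of-envelope arithmetic behind that comparison, with an assumed fender-bender cost (the crash-cost figure is invented):

```python
# Expected-cost comparison: undercover-cop fines vs. rare "gentle crash" policy.

p_cop = 1 / 1000             # chance a given robocar carries an undercover officer
fine = 10_000                # dollars (impound hassle ignored here)
print(p_cop * fine)          # 10.0 -> $10 expected penalty per offense

p_gentle_crash = 1 / 1000    # the alternative: rare stand-your-ground impacts
crash_cost = 1_500           # assumed cost of an at-fault fender-bender
print(p_gentle_crash * crash_cost)  # 1.5 -> $1.50 expected cost per offense
```

On these assumed numbers, the expected fine dwarfs the expected crash cost, which is the point of the comparison.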
Longer term there are also going to be opportunities for infrastructure improvements and improvements through wireless intercar communication.
You don't expect the cars to know how many passengers are in cars. So you are potentially going to harm completely innocent victims when you intentionally cause a car crash. Yeah, you say it's like bumper cars. Bumper car rides come with a long list of people who shouldn't ride - a list that includes many people who do ride as passengers in vehicles. I think it's a terrible idea. I know it'd be a liability nightmare.
brad
Sat, 2019-06-08 12:30
With cops
There is no need for the cops if we're willing to do this. The car will have video of the event. The main thing a cop could add is making the person pull over to confirm identity, which admittedly the robocar, if behind, could not do perfectly well, particularly if it can't see the driver through the offender's rear window.
The proposal -- as I said an extreme one -- would not hit a car at a speed that could injure anybody in the car. It would be bumper car speed.
But as I said, this is an extreme proposal, for use only if this becomes an extreme problem. But it could. People tend to do whatever they can get away with on the roads, but it also depends a bit on each city's driving culture. Boston drivers are much more aggressive than Phoenix ones. This technique is unlikely to ever be wanted in cities without a big problem, but in cities with chaotic driving, I think it's likely that, if there is no reason not to, people will get quite aggressive with timid robocars.
Today, as discussed earlier, people don't understand robocars. The robocar can pretend to be aggressive and get its due. That's how you do the four way stop. Not by actually being aggressive but by bluffing at it.
But you can't play poker when everybody knows you are bluffing.
Anonymous
Sun, 2019-06-09 06:42
Driving is not poker
It's extremely hard to prosecute someone, against a competent defense, for a traffic ticket based solely on a video submitted by a third party. In most jurisdictions, these sorts of tickets can easily get thrown out of court by a decent attorney specializing in traffic ticket defense. Confirming the identity of the driver is one problem, but there are many, many others. Some jurisdictions have passed laws specifically addressing the situation (in order to make red light cameras and the like enforceable), but many haven't, and several jurisdictions are in fact passing laws to ban red light cameras and the like. And this policy, of giving a big, quite possibly unpopular robocar company the power to tell the police who to charge with traffic violations, will face significant public backlash and constitutional challenges even if legislators do pass laws to allow it.
I don't think it will become an extreme problem, at least not in places where there is not already an extreme problem. There are many reasons not to get too aggressive with timid robocars. The reasons are, for the most part, the same reasons not to get too aggressive with human-driven cars. If it does become an extreme problem, I want human cops to take care of it, not robocops.
The way to handle a four way stop is not to be aggressive or to pretend to be aggressive. The way to handle a four way stop, for both humans and robocars, is to be assertive. It doesn't involve crashing into people, or even pretending like you're going to crash into people. It does sometimes involve putting your car into a position where the other driver has the choice of yielding or crashing into you. Most of the time it doesn't even involve that. What it usually involves is simply communicating to the other drivers that it's your turn. A fairly universal way to communicate that is to start to pull out into the intersection. It's communication, not a threat.
brad
Sun, 2019-06-09 12:31
Becoming a problem
It is possible that it won't become a problem where it isn't a problem. But the subject of my article is how to get cars to work well in the cities where there are so many aggressive drivers that people who don't have any aggression suffer greatly in their ability to move.
What I'm worrying about is what happens when a population of assertive drivers learns that cars of a particular type, the ones with the lidars on them, will always yield if it's physically possible. You don't know that about humans. Humans will almost always yield, but some are also aggressive and some are just slow to react.
Ditto pedestrians learning the cars will always yield. Today, pedestrians wait for cars to pass before they cross outside of crosswalks, and they try to cross just after the car passes. (There is a common and annoying situation where the ped crosses the road halfway, hoping the car will pass so they can go behind it; the car of course stops, but the ped does NOT want to go in front of it, and you get a stalemate, usually until the car stops fully and the ped reluctantly goes -- but sometimes it goes bad when other cars arrive because this took too long.)
Anyway, you will be able to cross in front of any robocar. As long as you see that's what's coming, just cross. But that does not scale up well.
At a 4 way stop, people are taught the rule that the first to arrive goes, and if it's a tie, the one to the right goes. Sometimes somebody else goes and the others yield. Sometimes there is a back and forth as people have slower reaction times. Somebody has to decide they are first and assert. The robot knows exactly who was first. It will assert when it is first. It will wait when it is not. It may decide to assert if the other, who was really first, waits too long. It all mostly works.
Until people start noticing that if they come to a 4 way stop with robots, they can go even if they're not first. The robots may start moving, but they will always pull back.
FKA
Sun, 2019-06-09 19:27
If it's already a problem...
What would be an example of a city "where there are so many aggressive drivers that people who don't have any aggression suffer greatly in their ability to move"? What types of aggression are necessary? I'm not sure "aggressive" is the right word, but robocars will have to be programmed to break rules and to cut people off in order to drive in some places. I don't see the advantage of programming them to actually crash into other cars, though.
Robocars should not always yield. Maybe you're using the term "yield" to mean something other than what it actually means. Robocars should not crash into other cars if they can safely avoid it. That's not the same as always yielding. Yielding means you don't establish a position in the path of the other car. Robocars shouldn't yield when they have the right of way, except in those relatively rare situations where the other car is traveling so fast that it's unlikely that they are going to stop for you (for instance, when another car runs a red light, you should probably yield even though you have the right of way). Robocars should avoid crashing into other cars when they can do so safely. They should do this because it's the law; they should do this because it is the ethical thing to do; and they should also do it for practical reasons. Even a fender-bender where you are not at fault is going to have a significant cost.
I don't think drivers knowing that a robocar will always do everything it safely can do to avoid a crash is something to worry about. I don't think the only reason people don't drive like maniacs is that they're afraid they're going to get into a crash. There are many other reasons.
Establish a position in the path of the other car when you have a right of way and they can easily stop, yes (usually). Crash into that other car because you have the right of way, no.
As far as pedestrians, I think it'll be great to be able to cross any road (except limited access highways, I guess) at any spot in the road, and to have cars yield to me. Why doesn't it scale up? What specifically would be the problem? I can think of some limited scenarios where that won't work, because there are constantly pedestrians everywhere. New York City comes to mind, and it'll be interesting to see how robocars drive there, as that's a place where you have to break traffic laws in order to get just about anywhere. But in most places it should be fine.
If that's too radical, the same rules as with other cars should work: Establish a position in the path of the pedestrian when you have the right of way and they can easily stop, yes (usually). Crash into that pedestrian because you have the right of way, no. Interestingly, this is basically the way to deal with pedestrians in Manhattan. Don't crash into them when they cross in front of you even though they have a "don't walk" signal; but do take every inch you can without crashing into them, so that you're in a better position and it's less likely the next pedestrian will cut you off.
What doesn't work with a 4-way stop? The robocar gets there first. You get there second. You ignore the robocar and go first anyway. The robocar yields, and then goes after you. So what? The robocar loses 3 seconds?
"Who was first" means who was first to come to a complete stop. The way to program it is "come to a complete stop before entering the intersection; once you've come to a complete stop, yield to any cars in the intersection or already stopped (whether they are stopped in the intersection or just before it); then go, without crashing into anyone." If someone cuts you off, you stop before hitting them, but not too much before hitting them. Then you continue. You might get cut off by one car. It's very unlikely you'll get cut off by more than one or two cars.
I don't see how robocars are at any significant disadvantage here. 4-way-stops are slow. If anything, they are probably at an advantage, because they can get really close to the car that cuts them off in order to be in a position to finish through the intersection as soon as that car passes by them.
brad
Mon, 2019-06-10 02:09
Where doesn't it work
Well, as you say, New York City and other dense downtowns. I never suggest hitting a pedestrian under any circumstances, regardless of who is in the right. However, those behind the robocar, and those in it, are not going to be thrilled if their cars are always coming to abrupt stops because somebody decides to walk in front of them without a care in the world.
For cars, just consider any cut in. You're driving along, with a short gap in front of you in tight traffic. Any other driver can just cut into that gap and you will widen it and let them in, rather than hit them. Again and again if need be -- an entire line of cars in the lane to your right, if they can just get their nose into the gap in front of you, will force you to stop and let them in. You could sit there for a long time.
Indeed, other robocars could even do this: notice that you are a fellow robocar, nudge in front of you, and know you will pull back.
With humans, when this happens, sometimes (fairly often) they let you in. Sometimes they say "no": they act like they don't see you, they move fast to fill the gap, or they move quickly to the side to get past you, and you retreat when the human asserts in this way.
When a robocar arrives at a 4 way stop and starts to go (because it is empty), if a human or other robot arrives at the stop 2 seconds later, clearly second, they can still go. The robocar will pause as soon as it sees the other car going, and it won't have gone too far in those 2 seconds.
Lined up to make a left turn (with no arrow) and see that the oncoming cars are all robocars? Go ahead, do a Pittsburgh left. They will stop for you. (Actually, Waymo always waits a couple of seconds just to be sure the intersection is clear, so you will get extra time to do your Pittsburgh left.)
FKA
Mon, 2019-06-10 19:42
NBD
It takes me three seconds, under normal acceleration, to get the front of my car from the stop line at the four-way-stop by my house to the other end of the intersection. So if the other car (presumably coming from my right) is two seconds behind, I will already be more than halfway through the intersection by the time a collision would occur. So I wouldn't be hitting them; they'd be hitting me.
But let's say the other car is only one second behind. The robocar arrives at the stop line and comes to a complete stop. A car approaching the intersection from the right is about one second from his stop line. The robocar pulls out into the intersection. But the jerk in the car approaching the intersection from the right runs the stop sign, since he knows the robocar will always yield. The robocar slows down as soon as the jerk runs the stop sign, and when the jerk's car is in front of the robocar, the robocar stops. The jerk passes by, and then the robocar continues going through the intersection. The whole thing adds five seconds or so to the robocar's trip.
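For concreteness, here is the kinematics behind the "more than halfway through" claim in the first paragraph. The acceleration and intersection size are my guesses:

```python
# Distance from rest under constant acceleration: d = a*t^2/2. Numbers assumed.

accel = 2.5                # m/s^2, normal acceleration from a stop
intersection_len = 11.0    # meters from my stop line to the far side

def distance_after(t):
    """Distance covered t seconds after starting from rest."""
    return 0.5 * accel * t * t

print(distance_after(3.0))  # 11.25 m -> clears the intersection in ~3 s
print(distance_after(2.0))  # 5.0 m -> near the midpoint when a car 2 s behind
                            # reaches its own stop line, and past the midpoint
                            # by the time that car could reach my path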
Sounds like a very minor problem to me.
brad
Mon, 2019-06-10 20:27
How much delay
It doesn't care about delay if vacant. It cares if people are inside. The rude driver doesn't know or care if there are people inside, and may not even see. (In the future, I expect people to be able to fully tint a robocar passenger cabin.)
If you get a city full of aggressive drivers, the robocar never gets to go.
FKA
Tue, 2019-06-11 02:49
What's the scenario?
If the robocar would never intentionally cause a crash when it is occupied (which makes sense), but would sometimes intentionally cause a crash when unoccupied (which doesn't make sense), then the rude driver potentially might care if the car is occupied. Maybe robocar passenger cabins will be allowed to be fully tinted. I don't know.
I still don't see any realistic scenario where an assertive robocar never gets to go. The robocar can and should establish position in front of other cars when it has the right of way. What it shouldn't do is intentionally crash into other cars for no reason other than to teach people a lesson.
Your 4-way stop scenario causes a delay of a few seconds. It's a rather narrow scenario, where the two cars arrive within a certain window of time of each other and the human driving one of them is willing to blatantly run a stop sign to save a few seconds.
There are places in the world where traffic laws are largely unenforced and passive drivers get nowhere. But I'm not suggesting that robocars drive passively. I'm only suggesting that they drive rationally, and that intentionally causing a crash for no reason other than to teach others a lesson is not rational. Establishing position is what is key to driving in these places. Get in front of the other car, and the other car will be forced to yield. That's often a good suggestion here in the USA too. When merging in heavy traffic, position is usually key.
Teaching robocars not to be too passive will be important for dealing with that. One difficulty will be matching the passengers' appetite for risk with the choices of the robocar, in a way that doesn't create too much liability for the robocar company. Reducing the risk of a crash to zero would be overly passive for many driving situations. But that doesn't mean that intentionally causing a crash ever makes sense.
Predicting how others will behave is a very tough AI problem that will be key to reducing the risks without harming efficiency. Until that problem is solved, robocars will be very limited in places where they can perform well.
brad
Tue, 2019-06-11 12:16
Never gets to go
Never is too strong a word, but one can see situations with both unacceptable delay for the robocar and delays for all traffic. No, you don't drive passively, but the problem is that each assertion is a bluff, and everybody knows it's a bluff. If challenged, you will yield, 100% of the time. What I am saying is we might make it 99% of the time instead of 100%. And that if you do, you actually no longer get challenged.
This is a well-known result in game theory, arising in any game where two players can cooperate or defect. Those who always cooperate fail. Those who always defect fail if the cooperators follow a tit-for-tat strategy, cooperating with you until they learn you are a defector. That is the winning strategy. Never before has there been a group of vehicles that can play tit-for-tat as a group.
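A toy simulation of that result, using the standard prisoner's-dilemma payoffs. Whether driving really maps onto this game is exactly what's disputed in the reply below; the sketch only shows the game-theory claim itself:

```python
# Iterated cooperate/defect game: an always-yield fleet vs. tit-for-tat.

PAYOFF = {  # (my_move, their_move) -> my payoff
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strategy_a, strategy_b, rounds=100):
    """Return strategy_a's total score; each strategy sees the other's history."""
    history_a, history_b, score_a = [], [], 0
    for _ in range(rounds):
        a = strategy_a(history_b)
        b = strategy_b(history_a)
        score_a += PAYOFF[(a, b)]
        history_a.append(a)
        history_b.append(b)
    return score_a

always_cooperate = lambda opp: "C"                 # the always-yield robocar
always_defect = lambda opp: "D"                    # the habitual cutter-off
tit_for_tat = lambda opp: opp[-1] if opp else "C"  # cooperate first, then mirror

print(play(always_defect, always_cooperate))  # 500: the cutter exploits the yielder
print(play(always_defect, tit_for_tat))       # 104: exploitation ends after round 1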
Now, there are interesting things far less drastic than standing your ground to the point of an impact that might work. For example, it could be that once a car is known as a defector, all the robocars treat it as badly as they legally can for some period of time.
One way that could work on humans would be for all the cars in the group to never let the defector in, while offering common courtesy to unknown cars. But humans have a way to "not let somebody in." We can keep close enough to not allow a gap a human would dare enter. But can a robot do that? There is always some gap that a driver can get a corner of their car into. You would not try this with humans because you can't be sure they won't hit you, with you in the wrong, since you are not allowed to force your way in. But with robots, if I get my nose into the 5 foot gap they keep with the car in front, they will brake and I will get in. Unless there is a chance I won't, even a 1% chance.
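The arithmetic of that last point, with invented dollar figures, shows why even a small chance of the robot standing its ground could tip the decision:

```python
# Expected value of forcing your nose into a robocar's gap. Costs assumed.

p_no_yield = 0.01        # robot stands its ground 1 time in 100
gain_seconds = 30        # time saved by forcing in, valued at...
value_per_second = 0.01  # ...one cent per second (assumed)
crash_cost = 2_000       # at-fault fender-bender: repair + insurance + hassle

expected_gain = gain_seconds * value_per_second  # $0.30
expected_loss = p_no_yield * crash_cost          # $20.00
print(expected_loss > expected_gain)             # True: not worth forcing in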
FKA
Tue, 2019-06-11 16:34
What's the scenario?
An assertion is not a bluff. Putting your car in front of the other car is not a bluff.
What is the scenario with both unacceptable delay and delays to all traffic that occurs with robocars but doesn't occur with human-driven cars?
Driving is not a game of prisoner's dilemma. It's nowhere even remotely close to that.
Yes, a robot can keep close enough not to allow a gap a human would dare enter. In fact, it can do so far more safely than a human can, because it has nearly instantaneous reaction times. I don't think it's necessary to do that, but it's certainly possible, and it's better than intentionally causing a crash.
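A rough sketch of why reaction time sets the gap: assuming both cars can brake equally hard, the required following gap reduces to the distance covered during the follower's reaction delay (the speeds and latencies below are assumptions):

```python
# Minimum safe following gap when leader and follower brake equally hard:
# the follower only needs to cover its reaction-delay travel.

def min_gap(speed_mps, reaction_s):
    return speed_mps * reaction_s

v = 13.4                 # ~30 mph
print(min_gap(v, 1.5))   # ~20.1 m for a typical human reaction time
print(min_gap(v, 0.1))   # ~1.3 m for an assumed robot sensing/actuation latency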
FKA
Sun, 2019-06-09 07:26
FKA
Bleh, forgot to log in. Anyway, I should add that there might be a somewhat different problem: people intentionally messing with robocars not for immediately selfish reasons but just to mess with them, if robocars and robocar companies become widely disliked, and that is definitely a possibility. There will definitely be a backlash. The only question is whether it will be a small minority of the population or a much larger minority (or majority).
Having the robocars intentionally cause car crashes would only exacerbate that problem, and get more people on the anti-robocar side. What the proper solution will be will depend on how big the anti-robocar backlash is and what the reasons for it are. Education and other public relations efforts might be a big part of the proper solution. Law enforcement will help if it's a small minority. If the backlash is big, law enforcement might just exacerbate the problem, though.
By the way, it's impossible to get into a car crash without some possibility that some eggshell plaintiff (1) is going to be injured by it. And intentionally causing such crashes can be very costly even if the plaintiff isn't actually injured, but only pretends to be. Even if you win the case, you still have to pay your attorneys, in most jurisdictions. And even if you win the case and win attorneys fees, the other side might be uncollectable.
(1) definitely look up the term "eggshell plaintiff" if you're not familiar with it
Richard Hershberger
Fri, 2019-06-07 07:07
In addition to FKA's
In addition to FKA's commentary on the legal aspect, which is entirely on point, at least for common law jurisdictions, it seems optimistic to suppose that the car can be programmed to produce only minor fender benders. The rude driver will never be injured? And more to the point, for jury sympathy purposes, this will never induce a secondary collision? I would hate to be the defense attorney in that lawsuit. (Or maybe not. Just think of the billable hours!)
brad
Fri, 2019-06-07 13:35
Secondary collisions
I suspect you could decide to do it only in places where it will not increase the probability of a secondary collision.
As for injury to the rude driver: a hit at bumper car speeds seems unlikely to cause one. So unlikely that bumper car rides happily exist and children ride in them. Cars also have shock-absorbing bumpers (better ones, in fact), but the difference is that they cost $1,000-$2,000 to replace these days with all the fancy gear in them. And the robocar's bumper may have even more fancy gear.
The plan is also that this actually only happens a few times. When the word gets around that every so often these cars stand their ground when you challenge them to a game of chicken, people stop challenging them to games of chicken.
Richard Hershberger
Mon, 2019-06-10 08:35
"I suspect you could decide
"I suspect you could decide to do it only in places where it will not increase the probability of a secondary collision."
What does this even mean? If there is no primary collision, the probability of a secondary collision is zero. Your premise seems to open with "First, make your systems infallible." Otherwise, inducing a primary collision will by definition increase the probability of a secondary collision.
brad
Mon, 2019-06-10 11:28
Secondary collision
You can do it when nobody is following closely behind, or too close on the other sides.
MB
Wed, 2019-06-19 20:45
Rather than a crash, how about a ticket?
If the robocar is recording enough information to prove it wasn't at fault in an accident, could that same information be forwarded to the police department to issue a citation? It'd probably be easier to amend the law to allow records from reliable third parties than to thread the needle on avoidable crashes.