New elements to the VW story shed interesting light

Last week, I commented on the VW scandal and asked the question we have all wondered: "what the hell were they thinking?" Elements of an answer are starting to emerge, and if true, they are very believable and teach us interesting lessons. That's because things like this are rarely the fault of only a small group of very evil people; more often they are the result of a broad situation that pushed ordinary (but unethical) people well over the ethical line. We must understand this because, frankly, it can happen to almost anybody.

The ingredients in this model are:

  1. A hard-driving culture of expected high performance, and of doing what others thought was difficult or impossible.
  2. Promising the company you will deliver a hotly needed product in that culture.
  3. Realizing too late that you can't deliver it.
  4. Panic, leading to cheating as the only solution in which you survive (at least for a while).

There's no question that VW has a culture like that. Many successful companies do; some even attribute their excellence to it. Here's a quote from the 90s from VW's leader at the time, talking about his desire for a hot new car line, and what would happen if his team told him that they could not deliver it:

"Then I will tell them they are all fired and I will bring in a new team," Piech, the grandson of Ferdinand Porsche, founder of both Porsche and Volkswagen, declared forcefully. "And if they tell me they can't do it, I will fire them, too."

Now we add a few more interesting ingredients, special to this case:

  • European emissions standards and tests are terrible, and allowed diesel to grow very strong in Europe, and strong for VW in particular
  • VW wanted to duplicate that success in the USA, which has much stronger emissions standards and tests

The team is asked to develop an engine that can deliver power and fuel economy for the US and other markets, and do it while meeting the emissions standards. The team (or its leader) says "yes," instead of saying, "That's really, really hard."

They get to work, and as has happened many times in many companies, they keep saying they are on track. Plans are made: tons of new car models will depend on this engine, and massive marketing and production commitments follow. Billions are bet.

And then it unravels

Not too many months before the ship date, it is reported, the team working on the engine -- it is not yet known precisely who -- finally comes to a realization. They can't deliver. They certainly can't deliver on time, and perhaps they can never deliver within the cost budget they have been given.

Now we see the situation in which ordinary people might be pushed over the line. If they don't deliver, the company has few choices. They might be able to put in a much more expensive engine, with all the cost such a switch would entail, price their cars much higher than they hoped, and deliver them late. They could cancel all the many car models that were depending on this engine, costing billions. They could release a wimpy car that won't sell very well. In any of these cases, they are all fired, and their careers in the industry are probably over.

Or they can cheat and hope they won't get caught. They can be the heroes who delivered the magic engine, and get bonuses and rewards. 95% of the time they don't get caught, and even if they are caught, the outcome is worse, but in their minds not a lot worse than what they already face. So they pretend they built the magic engine, and program it to fake its numbers on the tests.
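To make the mechanism concrete, here is a purely illustrative sketch of how a defeat device can tell it is being tested. The signals, thresholds and map names are hypothetical; this is the shape of the trick, not anything from VW's actual engine software.

    # Illustrative sketch only: hypothetical signals and thresholds,
    # not VW's actual implementation.

    def looks_like_dyno_test(steering_angle_deg, driven_wheel_kmh, undriven_wheel_kmh):
        """Guess whether the car is on an emissions dynamometer.

        On a typical dyno the driven wheels spin while the steering wheel
        never moves and the non-driven wheels stay still -- conditions that
        essentially never occur together in real driving.
        """
        steering_frozen = abs(steering_angle_deg) < 1.0
        rollers_only = driven_wheel_kmh > 10.0 and undriven_wheel_kmh < 1.0
        return steering_frozen and rollers_only

    def choose_engine_map(sensors):
        # "Test" mode favors exhaust treatment at the cost of power and economy;
        # "road" mode favors power and economy and lets NOx climb.
        if looks_like_dyno_test(sensors["steering"], sensors["driven_wheel"],
                                sensors["undriven_wheel"]):
            return "low_emissions_map"
        return "performance_map"

The point is how little it takes: a handful of sensor comparisons turns "we built the magic engine" into something that merely looks like it on the rollers.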

The interesting thing is this logic works at many corporate levels. It could be mid-level engineering managers who do this, just protecting their careers. It could be the heads of engineering for the project (who are the ones being suggested as the culprits and who have been suspended) who have even more to protect. But you can even see a CEO making a decision like this, since the failure to deliver the engines and the car lines would probably mean his being fired as CEO. You can take this question up the chain and imagine people at all levels deciding to cheat or go along with a cheat. (Except perhaps at the board level, I don't see motivation for a board of directors to actually vote for something like this.)

You might even be able to imagine yourself doing it, which is why we want to understand it. Who hasn't promised and found themselves unable to deliver? Who hasn't seen a product be late, or even never make it? Products in computer software and hardware ship late all the time. It's even almost expected in the industry. It sometimes costs companies a lot of money. Sometimes pressure causes companies to ship products before they are ready. That happens because our industry is more tolerant of these failures, though sometimes those projects and companies go down to ruin. In computers, only the customer judges how ready the product was, while in emissions, it is government tests. Even so, computer companies lie about their bugs sometimes.

With every project, there is one thing worse than not delivering: not delivering and not letting people know ahead of time that you were not going to make it. That should be bad, but here it was so bad that people were more willing to fake that they delivered than to own up.

And it wasn't all that bad

To add to the shocks, Consumer Reports tested the car in low-emissions cheat mode and the differences were less than you would expect.

  • 0 to 60 mph acceleration was about 0.5 seconds slower on the older cars, and barely changed on the latest models.
  • Fuel economy dropped from 53 to 50 mpg on the 2015 model, and from 50 to 46 mpg on the 2011 models.

This shows just how competitive the market is (which is good) but also how harsh the culture inside VW was. Whoever decided to do this did so because delivering an engine with 4 mpg less than promised and half a second less acceleration would have ended their career. On top of this, we also see that given 5 more years, they actually were able to deliver an engine with mpg matching the 2011 model and good acceleration, though the competition has also moved on.

And now, the European angle

It is worth noting that this scandal begins with the poor quality of the European tests and standards and, with a certain irony, ends with them as well.

European emissions standards are much more lax, and their official test is very different from real-world driving. So cars that pass the Euro test get out on the roads and emit 10x or more the NOx pollutants they do on the test. This is a result not of cheating, but of "designing to the benchmark" -- another common problem in the computer industry. They make cars that do decently on the very non-real-world Euro test, and don't care very much what they do in real driving.

The USA standards are tighter, and the test is more like real driving. It's not perfectly like real driving, so cars will emit a bit more out on the road, but not an order of magnitude more like in Europe.

In Europe, they could make a decently performing, efficient diesel car because the rules let them do that while putting out lots of pollution in real driving. And tons of people drive diesel there because it is way cheaper per mile -- the fuel has 20% more energy density and costs less per gallon. But meeting VW's goal of doing that at a reasonable price while also meeting the US standards was in fact impossible at the time.
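As a rough illustration of that per-mile arithmetic (all figures here are made-up round numbers, not actual European prices or car specs):

    # Back-of-the-envelope comparison with hypothetical round numbers,
    # not actual European fuel prices or car figures.
    petrol_price_per_litre = 1.50      # hypothetical
    diesel_price_per_litre = 1.30      # hypothetical; diesel is often taxed less in Europe

    petrol_km_per_litre = 14.0         # hypothetical petrol car
    diesel_km_per_litre = 14.0 * 1.3   # ~20% more energy per litre plus a more efficient cycle

    petrol_cost_per_km = petrol_price_per_litre / petrol_km_per_litre   # ~0.107
    diesel_cost_per_km = diesel_price_per_litre / diesel_km_per_litre   # ~0.071

    # With these made-up numbers, diesel comes out roughly a third cheaper per km.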

It's also why they got caught. They were tested in the real world because some researchers wanted to show that Europe could do better. "Look," they said, "They are making cleaner diesel cars for real world driving for the USA -- they must be because they are passing the US tests!" And so they tested them to prove it, on real roads and ... whoops.

Back to robocars

Here we understand the reasons to lie a little better. People make promises. They can't deliver, so they lie. In this case, the lie caused an external cost -- the extra pollution hurts everybody, with no special burden falling on VW. This is why reducing pollution is so hard.

With robocar safety, a lie will have some external costs, but mostly internal costs -- i.e., the vendor who lies will end up paying for the cost of any extra accidents they cause. Or rather, they will pay most of the cost -- more and more analysis suggests that the insurance payments of the existing accident system still leave a number of the costs unpaid, borne by the victims (or even perpetrators) of the accidents. This is independent of the robocars question, of course. The recent declarations by Google, Mercedes and Volvo say the companies are willing to accept at least the liability costs we have today on themselves, which means they have little incentive to lie about safety, as they would mostly be lying to themselves. (Companies do lie to themselves, but have reasons to put in systems to stop that if they can.)

Comments

Freudian slip?

I've feared my carmaker would show up for this but I don't think they have. Perhaps they did plenty of ethical penance a long time ago, after their Third Reich experience...

Eventually, even when a project isn't going off the rails, you always start tuning for the benchmarks whether they are Spec, or 3DMark, or Common Core, whatever. That's fine, necessary even.

Eventually, you almost always wind up with cases of "this makes the benchmark better" and either "doesn't do squat in real life" (benign), or "and does something really sucky in real life" (malicious) and everything in between.

Almost never will your entire management chain appear in your cube (Ha! Fooled you, no one gets cubes anymore) in capes, top hats, and twisting their mustaches and go "BwaHaHa, we are Evil(tm), pursue plan Maximum-Deception!!!". Except maybe 3DMark, no really, look into that shit; people literally looked at the name of the binary running and did lo-res/fast rendering for that binary only; amazing.

No, what will happen is your manager will stare glassy eyed at a point 3 feet behind your head and state that the #1 priority is to get the very best performance on FooMark2000, get the best performance for all customer cases foreseen and unforeseen, shave 4 weeks off the promised schedule, and follow the 27 point Corporate Responsibility Achievement Partnership we were told about at onboarding and forced to recertify every 89 days.

If you inform your management chain that some of the above items are in conflict, they will blink rapidly to indicate duress, mutter something about the 27 points of CRAP, how Black Duck will catch it if it is a problem, and then scurry away and never make eye contact with you again.

Wow, those 3DMark cheating stories are amazing; I had not followed this at the time. To me, that would seem like some sort of fraud that the FTC should have gone after. This is more than just developing to the benchmark. The sense is that the other diesel companies in Europe were mostly developing to the test, rather than taking specific steps to detect the test or to deliberately make non-test behavior worse. But no doubt if any of them went beyond that, we will find out now.
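To make that trick concrete, it amounts to something like the following sketch. The binary names and quality settings are hypothetical; this is just the shape of the cheat, not any vendor's actual driver code.

    # Illustrative sketch of driver-side benchmark detection; the binary names
    # and quality settings are hypothetical, not any real vendor's code.
    import os
    import sys

    BENCHMARK_BINARIES = {"3dmark.exe", "foomark2000.exe"}   # hypothetical watch list

    def pick_render_quality():
        running = os.path.basename(sys.argv[0]).lower()
        if running in BENCHMARK_BINARIES:
            # Benchmark detected: render cheap and fast so the score looks great.
            return {"texture_res": "low", "filtering": "fast_approximation"}
        # Everyone else gets the honest, slower, full-quality path.
        return {"texture_res": "high", "filtering": "accurate"}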

The challenge I point out revolves around what to do with it. Management should be demanding of its engineering teams. Companies that don't do that won't do well. The question is how to skate that fine line between being demanding, and being so demanding that they lie to you, either about when or if they can build something -- or about whether it really does what they asked. (I am not presuming who did the lying here, it could have been engineering management or upper management.)

One method of "skating the line" is to structure the organization so that at least some people are incentivized to keep the organization honest. I am reminded of a story about an accidental 38x antibiotic overdose at UCSF hospital:

"Safe organizations relentlessly promote a “stop the line” culture, in which every employee knows that she must speak up?—?not only when she’s sure that something is wrong, but also when she’s not sure it’s right. Organizations that create such a culture do so by focusing on it relentlessly and seeing it as a central job of leaders. No one should ever have to worry about looking dumb for speaking up, whether she’s questioning a directive from a senior surgeon or an order in the computer."

https://medium.com/backchannel/how-to-make-hospital-tech-much-much-safer-c81dac43684a

The challenge of balancing the competing factors of speed, performance, cost, and quality is faced by almost every organization. In my opinion, danger comes from tying too many people to one metric. The momentum to achieve that goal can overwhelm any checks or incentives intended to maintain that balance.
