What we can learn from VW’s emission scandal for IoT

As the digging into Volkswagen’s emissions-cheating scandal continues, it’s very interesting to watch the kinds of conflicts and issues that emerge from the whole thing. Interesting not because it’s fun to ridicule corporations (it’s not, especially where emissions are concerned), but because this particular case gives us a good idea of the kind of scandals, issues and questions we’ll increasingly see over the next few years around IoT and sensor-data-based decision making.

In the LA Times, VW’s U.S. chief executive Michael Horn is quoted making all kinds of interesting statements and assumptions, some of which might be relevant primarily to the ongoing issue (and the lawsuits that will no doubt follow), others of which I’d like to take as stand-ins to consider larger issues.

As a quick recap: VW’s diesel engines produced emissions hugely worse than legally allowed in the US, and had software installed that would detect when a car was in the testing lab, switch the engine’s mode of operation to meet legal emission requirements, then revert to the (much worse) emissions once back on the street.
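
To make that mechanism concrete, here is a minimal, purely hypothetical sketch of what such “defeat device” logic might look like. The sensor names, thresholds and calibration names are invented for illustration only; VW’s actual code is closed source and we don’t know how it detected test conditions.

```python
# Illustrative sketch of "defeat device" logic -- NOT VW's actual code.
# All sensor names, thresholds and mode names are hypothetical assumptions.

from dataclasses import dataclass


@dataclass
class SensorReadings:
    wheel_speed_kmh: float       # driven wheels spin on the dyno rollers
    steering_angle_deg: float    # the steering wheel barely moves during a lab test cycle


def looks_like_test_bench(s: SensorReadings) -> bool:
    """Heuristic: the wheels are turning while the steering wheel stays still."""
    return s.wheel_speed_kmh > 20 and abs(s.steering_angle_deg) < 1.0


def select_engine_calibration(s: SensorReadings) -> str:
    # Run the compliant calibration only when a test is suspected,
    # otherwise fall back to the higher-emission road calibration.
    if looks_like_test_bench(s):
        return "low_emission_calibration"
    return "road_calibration"


if __name__ == "__main__":
    on_dyno = SensorReadings(wheel_speed_kmh=50.0, steering_angle_deg=0.2)
    on_road = SensorReadings(wheel_speed_kmh=50.0, steering_angle_deg=12.0)
    print(select_engine_calibration(on_dyno))  # low_emission_calibration
    print(select_engine_calibration(on_road))  # road_calibration
```

The point of the sketch is how little code a check like this takes, and how invisible it is from the outside once compiled into an engine control unit.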

So what we have here is already interesting in that we see:

  • engine performance and emissions being based heavily on software;
  • measurements reported through built-in software that caters to the wrong kind of incentives, i.e. not to reporting as truthfully as possible but with certain predetermined outcomes in mind;
  • that software being closed source, so there is no legal way of external peer review or quality control.

“This was a couple of software engineers who put this in for whatever reason” — Michael Horn

So this VW exec throws the engineers under the bus. This could either mean that management wants to cover up that this was a conscious top-down decision, or that quality assurance isn’t working if a bug/feature like this can be rolled out to millions of cars. (I’m not familiar enough with the US legal system, but I’d expect either way leads to massive shareholder lawsuits.)

Also, is VW agile enough that some individuals could even implement that kind of code without oversight? Does the company have a culture where developers are empowered enough to make these kinds of decisions?

Either way, after this statement I imagine that VW will have a very hard time recruiting and retaining top talent.

“To my understanding, this was not a corporate decision. This was something individuals did.” — Michael Horn

Yes well, duh. Of course it is. Every decision is made by individuals; every corporate decision is made by individuals. The question here is: were the individuals in question a few “rogue engineers”, or a group of executives passing down an order, however informally?

(Of course, VW wasn’t alone. The problem is a systemic one, as The Guardian reports.)

In settings where software plays an ever-larger role in how a physical product works (or doesn’t) and reports (or misreports) data, these liability questions will move ever closer to the center of the discussion.

I think that increased transparency and scrutiny means that unethical behavior will almost always be surfaced, and that legal regulation should incentivize peer review and research into the code rather than criminalize it. This means revisiting legislation like the DMCA and its international counterparts, which “makes it harder for watchdogs to find safety or security issues, such as faulty code that can lead to unintended acceleration or vulnerabilities that let an attacker take over your car. The legal uncertainty created by the Digital Millennium Copyright Act also makes it easier for manufacturers to conceal intentional wrongdoing” (EFF).

I’m convinced that ethical behavior and openness are a competitive advantage: they lead to better products, increase trust in the organization, and attract better talent.