Some reflections on B.C. Smith’s “Limits of Correctness” (1985)
The world is infinitely rich. Smith wrote his article in the Cold War days of the mid-1980s. He pointed out the different natures of formal correctness (which computers handle best) and informal responsibility (best achieved by humans’ social and moral systems), and how both contribute to handling the ‘infinite richness of the world’, with particular reference to the technical handling of nuclear weapon systems. This makes a nuclear strike due to technical failure a risk of low frequency but high impact (LFHI risk).
Engineering is infinitely rich. Much less so than the world, but failure in industrial production can have a relatively high impact; take, for example, Toyota’s recently announced problems with its accelerator pedals, or the Y2010 bug in certain credit cards produced by Gemalto. Even with quality standards like SPICE, CMMI, EFQM and many more, why does production still seem to fail in rare cases?
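The exact cause of the Gemalto failure was never made fully public, but a frequently discussed class of bug for that incident is confusion between BCD and plain binary encoding of the year. The sketch below (in Python, with hypothetical function names; not Gemalto’s actual code) illustrates how such a decoder can look ‘formally correct’ against every test case from 2000 to 2009 and still break the moment the calendar reaches 2010 – precisely the kind of gap between model and world that Smith describes.

    # Hypothetical sketch: a BCD year byte decoded correctly
    # versus decoded as if it were a plain binary number.

    def year_from_bcd(byte):
        # Correct decoding: 0x10 carries the decimal digits '1' and '0', i.e. 2010.
        return 2000 + (byte >> 4) * 10 + (byte & 0x0F)

    def year_from_bcd_buggy(byte):
        # Buggy decoding: treats the BCD byte as a binary number.
        # Works by coincidence for 2000-2009 (0x00-0x09), breaks for 2010 (0x10 -> 16).
        return 2000 + byte

    for bcd_byte in (0x09, 0x10):  # the years 2009 and 2010 in BCD
        print(f"BCD 0x{bcd_byte:02X}: "
              f"correct={year_from_bcd(bcd_byte)}, "
              f"buggy={year_from_bcd_buggy(bcd_byte)}")

    # 0x09 decodes to 2009 either way, so a decade of flawless operation proves nothing;
    # 0x10 suddenly decodes to 2016 in the buggy version, a date the rest of the system does not expect.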
Human and formal systems never fail alone. Following Smith’s thinking, quality standards cannot be blamed exclusively here, since human factors, like common sense or social systems, obviously failed as well. If a failure has occurred, then both factors have contributed to it.
Now, I don’t have an answer for how to ultimately handle LFHI risk, or even for what Toyota or Gemalto should do next, but I have a feeling that enforcing further quality standards is not the solution.
So far, just a few thoughts.