I recently came across a piece that Michael Howard wrote....
Perhaps it is my philosophy background that taught me that analogies are actually a really good way of comparing things, making a point, etc.... I'll throw out the question of whether analogies are useful in general and whether comparing computer software to other things is actually a useful endeavor. I have long believed that if someone makes an argument and uses an analogy, then the argument is often weak. But that's just me!
This is why I usually roll my eyes when I hear statements like, “If [bridges|cars|airplanes] were built like software then…” because comparing physical items and software is just wrong. They are not the same thing; you cannot compare them.
I totally agree that software and items in the physical world are different. But the rules are also different....
In the physical engineering world, we expect engineers to follow formal "threat modeling" for their products. If they don't build the bridge strong enough not to collapse under normal use, they can even be held personally liable. As can their firm, the construction firm, the inspectors, etc.
In the software world, we're not actually responsible for anything we produce. We write EULAs that specifically exclude us from liability.
I don't know about you, but I'm not sure I want to drive my car across a bridge where I first had to sign a EULA that limited my rights to sue if anything went wrong, disclaimed any liability, and specifically claimed the bridge wasn't necessarily fit for its purpose... I'd probably find another way across the river.
I hate to call Michael disingenuous but I feel that his counter analogy is just flat out wrong. Exclude for a moment, if you will, all of the deliberate attacks against computers. Take a look at computing's track record in just normal reliability under regular operating conditions and I think you'll find that it isn't so hot....
Sure, there are different levels of engineering used to build certain cars, etc. At least in the US, they all must meet a certain set of basic safety standards before people are allowed to buy them. Same goes for drugs, food, etc. All of these things impact people's safety, and in many cases so do computers. Why do we treat them differently?
If software engineers want to continue to have credibility in the general debate, then they have to start talking about safety, reliability, and integrity in the same way that other engineers do.
When the guy building the railroad tracks is told to speed up the project, throw out the requirements, just lay down the tracks, and "we'll fix it in tracks-2.0," he doesn't just shrug his shoulders and do it. Sure, it's a regulatory problem, a legal problem, etc. But just like doctors and lawyers, professional engineers have a code of conduct, ethics, and morals that they must abide by. Are there people who skirt the rules? Sure. I don't think that diminishes the profession or the code as a whole, though.
If I'm an engineer designing a bridge, a car, etc., and I know that my tools are faulty (C, C++, etc.), I'm negligent if I go ahead and use them anyway, knowing it's going to be extremely difficult to prove my results when I'm finished.
Yet in software development we excuse this sort of thing all the time. We use flawed tools, we have flawed infrastructure, and we have protocols we know can't withstand the kind of abuse they are up against.
We do have environments where there are constant threats and lives are at stake. In the military, when you have a faulty system like this, people die. Then we iterate and produce version 2.0. Hopefully fewer people die. It does make me start to understand how we get mired in red tape in these sorts of situations, but I do long to finally get a piece of software that doesn't have to disclaim all liability for failures to perform its basic functions.