Wednesday, November 21, 2012

The Software Bathtub Curve

(Disclaimer: please do not use software in the bathtub.)

Professor Hirsch speaks frequently about the bathtub curve observed in semiconductor failure rates. A microchip (unlike, say, a pair of scissors) doesn't follow a simple pattern of gradually wearing out over time until it fails. Instead, the observed likelihood of failure follows three distinct phases: First, a brief initial period when failure rates are high, mostly caused by manufacturing defects. Then there's a long steady period when failure rates are very low. Then, eventually, the physical media starts to break down, and increasing failures result. The shape of the graph looks like a bathtub.
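The three phases can be sketched numerically. Here's a toy model (all the parameter values are invented for illustration, not drawn from real failure data) that sums a decaying infant-mortality term, a constant random-failure term, and an accelerating wear-out term:

```python
import math

def bathtub_hazard(t, infant=0.5, base=0.02, wearout=4e-5):
    """Toy failure rate at time t (arbitrary units); parameters are made up.

    early:  manufacturing defects shake out quickly (decaying exponential)
    base:   steady background rate during the long middle phase
    late:   physical wear accelerates toward end of life (quadratic)
    """
    early = infant * math.exp(-t / 2.0)
    late = wearout * t ** 2
    return early + base + late

# Sampling the curve shows the shape: high at first, flat in the
# middle, rising again at the end -- the bathtub.
rates = [bathtub_hazard(t) for t in range(0, 41, 10)]
```

`rates[0]` (burn-in) and `rates[-1]` (wear-out) both exceed the mid-life values, which is the whole point of the curve.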

[Figure: Semiconductor failure rates]

A case can be made that software follows a bathtub curve, too. However, rather than failure rate, quality is measured in terms of user satisfaction (always!). In the initial phase (beta and early release), bugs abound, performance is slow, and the application lacks refinement, having not yet benefited from extensive user feedback. Satisfaction is low but increasing. Then, after a few releases to polish it up, users settle into a comfortable working relationship with the app, and satisfaction is consistently high. Eventually, though, due to evolving user needs or a changing technology environment, the software no longer does the job and must be EOL'd or (preferably) upgraded. Differences in methodology will affect the scale of this curve, both in time and in application scope: in an agile process, the cycle will run faster and across smaller portions of functionality. But the pattern will be observed nonetheless: a (hopefully) rapid resolution of initial shortcomings, followed by a comparatively longer period of stability.

[Figure: Software user satisfaction]

It's important to remember this when rolling out new software features: you may be tempted to make a big splash by announcing a new release loudly and to all the most important users (the ones who have been demanding those features the most adamantly). Unfortunately, this creates the biggest exposure right when the software is at its most vulnerable. Better is to take a gradual approach, allowing the new functionality to "burn in" under safe conditions, with sympathetic users and in non-mission-critical situations. When problems are found, they can be addressed calmly and efficiently. As the software matures, the size of the user base is allowed to grow. It's true that this approach lacks zazz, and may therefore pose challenges for marketing and sales objectives (so compromises must be made). But over the long haul, I've found that a cautious approach to deployment creates higher user satisfaction overall -- and that drives customer loyalty, which is good for everyone.
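One common way to implement this kind of gradual "burn-in" is a percentage rollout behind a feature flag. A minimal sketch (the function and feature names here are hypothetical, not from any particular library): each user is hashed into a stable bucket, and the feature is on only for buckets below the current rollout percentage, so the audience grows monotonically as you raise the dial.

```python
import hashlib

def in_rollout(user_id: str, feature: str, percent: int) -> bool:
    """Stable per-user decision: a given user always lands in the same
    bucket for a given feature, so raising `percent` only ever adds
    users -- nobody who had the feature loses it mid-ramp."""
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 100   # stable bucket in 0..99
    return bucket < percent

# Ramp from a small sympathetic group toward the whole user base as
# the software matures.
users = [f"user{i}" for i in range(1000)]
for percent in (1, 5, 25, 100):
    enabled = sum(in_rollout(u, "new_reports", percent) for u in users)
```

Because the bucket is derived from a hash rather than stored state, this stays deterministic across servers with no coordination, which makes it easy to pause or roll back a ramp when problems are found.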
