Real Money: Poor Software Testing Practices Cost US Companies $59 Billion

Summary:
According to a new government report, inadequate software testing costs the US economy $59.5 billion a year. How's that for proof that software testers perform a vital service? If you'd like to know how that number was derived, read on as Linda Hayes unwraps some of the methodology behind the study.

Editor's Note: The editors at StickyMinds saw this report and asked Linda to write a column about it. Special thanks to her for tackling the lengthy NIST report and delivering this column on short notice.

If you have been struggling with making a business case for more and better testing, you now have an ally in the form of the National Institute of Standards and Technology (NIST), which is part of the Department of Commerce Technology Administration. In its 300-plus-page report released in May of this year, the NIST estimates that inadequate testing costs the US economy $59.5 billion a year.

As staggering as that amount is, I initially thought the number was too low, considering that USA Today previously estimated the annual loss at $100 billion. But then I realized that the NIST was strictly looking at the costs to developers and users of fixing defects, and not at the costs to the business in the form of lost revenue or productivity. But whether it is $100 billion or $59.5 billion, as they say in Texas: A billion here, a billion there, pretty soon you're talking about real money!

Methodology
The NIST surveyed two industries that are heavily dependent on technology: transportation-manufacturing and financial-services. These two categories represent hard and soft goods: transportation-manufacturing systems are used to produce tangible goods, and financial-services systems are used to process electronic data. These differences account for some wide variations in the cost of defects, as you will see. The survey forms delivered to these industries are contained in one of the report appendices.

The report then proposes a taxonomy of costs for software developers and users. Developer costs include the labor to find and fix bugs, as well as the cost of supporting software, hardware, and external services (think consultants). User costs include the time and resources invested in selecting and installing software, plus the ongoing maintenance cost of detecting and repairing defects and any damaged data.

Findings
The report is extensive, and there isn't room here to show you all of the findings, so I'll mention a few that are especially relevant to testers. One caveat: for purposes of the report, testers are lumped in with developers. Try to overlook that if you can; otherwise it may become distracting.

Early on, the report offers a definition of software testing that is, well, interesting. It says "Software testing is the process of applying metrics to determine product quality and the dynamic execution of software and the comparison of the results against predetermined criteria." This wording will probably launch the code review and walkthrough contingents into orbit. In defense of the report, though, it does go on to acknowledge the difficulty of identifying and measuring quality attributes. At least they noticed.

Next, it refers to a 1995 book that estimates the allocation of effort within the development cycle at 40% for requirements analysis, 30% for design, and 30% for coding and testing. I don't know about you, but I think these numbers are wishful thinking. It also estimates that programmers spend 10% of their time testing, while software engineers spend 35%. The distinction between programmers and engineers is not spelled out, but I must be used to working with programmers.

But the real meat of the report, at least in my opinion, is when it gets down to costs. First it offers the observation we have all heard before: the sooner you find defects, the less they cost to fix. It gives an example of 1X cost to fix a defect in requirements and analysis, 5X in coding and unit test, 10X in integration and system test, 15X in beta test, and 30X in post-release. I've seen much higher figures cited, as much as 1,000X in production, but whatever. The point is made.

Then, however, it attaches some real money. In the transportation-manufacturing sector, the surveys indicated that each major bug cost 268.4 hours of repair, $604,900 in lost data, and 1.5 months of delay in time to market. Then, somehow, the report arrives at a cost per bug of $4,018,588. Don't ask me to explain this last number; I can't, and they don't. But it is impressive!
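To make that escalation example concrete, here is a minimal sketch of the arithmetic. The multipliers are the report's; the base repair cost and defect count are assumptions I've made up purely for illustration.

# A minimal sketch of the arithmetic behind the escalation example.
# The 1X/5X/10X/15X/30X multipliers come from the report's illustration;
# the base repair cost and defect count below are my own assumptions,
# used only to show how the multipliers turn into dollars.

COST_MULTIPLIERS = {
    "requirements and analysis": 1,
    "coding and unit test": 5,
    "integration and system test": 10,
    "beta test": 15,
    "post-release": 30,
}

BASE_COST_PER_DEFECT = 200  # assumed dollars to fix a defect caught in requirements
DEFECT_COUNT = 100          # assumed number of major defects in a hypothetical project

for phase, multiplier in COST_MULTIPLIERS.items():
    total = DEFECT_COUNT * BASE_COST_PER_DEFECT * multiplier
    print(f"All {DEFECT_COUNT} defects found in {phase:<28} -> ${total:,}")

# Catching everything in requirements costs $20,000 under these assumptions;
# letting the same defects slip to post-release costs $600,000.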

In the financial-services sector, those numbers are 18.4 hours of repair per major bug, $1,425 in lost data, and 1.5 months of delay in time to market, for a cost per bug of $3,293. We can only assume that the huge difference between these two industries reflects the difference between repairing defects in manufactured goods and repairing them in electronic data: the former is far more costly to fix than the latter, and has a much longer useful life.

Finally, the report boils it down to the cost per developer/tester of inadequate testing: $69,945, of which about half could be avoided through "feasible" improvements. Wow! I see some raise requests on the horizon. But guess what? Only 36% of the costs fall on developers; the remaining 64% fall on users. Maybe we have been pitching quality to the wrong side of the house all along, and the latest trend of only doing IT projects that are user-funded will work in our favor.
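To put those headline figures in team-sized terms, here is a back-of-the-envelope sketch. The team size is my own assumption, and applying the national 36/64 developer/user split to a single team's total is purely illustrative.

# Back-of-the-envelope sketch using the report's headline figures.
# Team size is assumed; applying the national 36/64 developer/user split
# to one team's total is only meant to illustrate the proportions.

COST_PER_DEVELOPER = 69_945   # annual cost of inadequate testing per developer/tester
AVOIDABLE_FRACTION = 0.5      # roughly half avoidable through "feasible" improvements
DEVELOPER_SHARE = 0.36        # 36% of costs fall on developers, 64% on users
TEAM_SIZE = 25                # assumed head count, for illustration only

total_cost = TEAM_SIZE * COST_PER_DEVELOPER
avoidable = total_cost * AVOIDABLE_FRACTION

print(f"Annual cost of inadequate testing, {TEAM_SIZE}-person team: ${total_cost:,.0f}")
print(f"Roughly avoidable through feasible improvements: ${avoidable:,.0f}")
print(f"Share borne by developers (36%): ${total_cost * DEVELOPER_SHARE:,.0f}")
print(f"Share borne by users (64%): ${total_cost * (1 - DEVELOPER_SHARE):,.0f}")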

Meaning
All in all, this report's greatest value is in validating what we have always known and attaching real money to it. Whether or not you buy these numbers (or your management does), there is value in walking through the way they were derived so that you can make your own business case. And get some real money.
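If you want to attempt that walkthrough with your own numbers, a starting point might look like the sketch below. Every figure is a placeholder to be replaced with your organization's data, except the 18.4-hour repair figure, which is the report's financial-services number; the fivefold savings from earlier detection is an assumption loosely in the spirit of the report's escalation example.

# Minimal sketch of a home-grown business case in the spirit of the report's
# cost taxonomy. All inputs are placeholders except where noted; replace them
# with your own organization's data before showing this to anyone.

bugs_per_release = 40        # major defects escaping to late phases (assumed)
repair_hours_per_bug = 18.4  # the report's financial-services repair figure
loaded_hourly_rate = 90      # assumed fully loaded cost of an engineer-hour
releases_per_year = 6        # assumed release cadence

current_cost = bugs_per_release * repair_hours_per_bug * loaded_hourly_rate * releases_per_year

# Suppose better testing catches half of those bugs much earlier in the cycle,
# where they are assumed (in the spirit of the escalation example) to be
# five times cheaper to fix.
caught_early = 0.5
early_cost_factor = 1 / 5
improved_cost = (current_cost * (1 - caught_early)
                 + current_cost * caught_early * early_cost_factor)

print(f"Current annual late-repair cost: ${current_cost:,.0f}")
print(f"Cost with earlier detection: ${improved_cost:,.0f}")
print(f"Annual savings to claim: ${current_cost - improved_cost:,.0f}")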
