What Links the Requirements to Tests in Development Today?

Summary:

What's happened to the links between requirements and tests? How do we know what to test, and when? How do we, and the customers, know that the system being built is right? What is the traceability between the two disciplines?

Testing used to happen at the end of the project, once the whole system had been built. Then testing became an iterative activity, carried out in each iteration. Now extreme programming (XP) advocates build test code before the actual code, introducing the concepts of test frameworks and mock objects.

Requirements moved from all specifications being gathered up front in one big exercise, to starting early in the project and going into ever-increasing detail throughout it, using use cases. So what is the traceability between the two disciplines?

Types of Testing
When I was doing software development in the '80s, only mission-critical projects took testing seriously. A formal concept of testing didn't seem to exist on other projects; it was all down to the developer's self-pride to turn out good-quality code. Some developers even wrote special test code into their programs, so it's not really a new concept. We had a bit more time back then! As ever greater pressure to deliver against unrealistic deadlines hit home, developers' first instinct was to give up exhaustive testing of their own code. "Let the user find and report any errors and I'll fix them later" was the prevailing attitude. We now seem to have come full circle in developer testing.

Of course, customers want to prove for themselves not only that the code is OK, but that the system is practical to use, and they have championed ever more clever ways of doing their tests. There are also tests that check the performance and reliability of the system, which relate to the quality of the design rather than of the code. "Did we get the performance and speed right under large volumes? How will the system react in a power failure?" are typical questions they ask.

Testing has become an institution! At last count I found sixteen test types, under various categories:

Functionality

·         Function: Does the system do what we specified it should do?

·         Volume: Can all parts of the system cope with the maximum data volumes we expect?

·         Security: Do only those actors we expect to access the system actually get access?

Usability

·         User interface issues such as wizards, consistency, documentation, aesthetics, etc.

Reliability

·         Stress test: Extreme workloads, low memory, unavailability of services and hardware.

·         Structure testing: Checks the internal structure; for example, that there is no orphaned code and all links are connected.

·         Integrity: Resilience to failure.

Performance

·         Load testing: Checks performance under various operational loads on the system.

·         Performance profiling: Looks for bottlenecks and inefficient processes.

·         Benchmarking: Tests this system against a known reference system.

·         Contention tests: Check that a single resource can cope with multiple actors' simultaneous demands.

Acceptance Testing

·         Alpha and beta tests by customers, to see whether the system is acceptable for live use.

Exploratory Testing

·         Test design and test execution at the same time.

Supportability

·         Installation: Tests that the software installs under different conditions, such as low disk space.

·         Configuration: Different software configurations on different hardware.

Contingency testing

·         Making sure fail-over procedures work, backups recover, etc.

Testing tools have evolved, too. Test automation has progressed in leaps and bounds over the last five years, and there are now all sorts of tools that test different aspects of a system's quality:

·         Performance tools

·         Data generation tools

·         Simulator tools

·         Stubs

·         Test managing tools

·         Quality tools

·         Coverage tools

·         Black box and white box tools

·         Record/playback tools

The processes have evolved. Besides the tools, new techniques have been added into the equation:

·         One breakthrough was the Rational Unified Process [RUP 02], which put testing on the map as its own specialized discipline.

·         Proper testing tools have emerged over time, in the various categories mentioned above, to help automate the testing process.

XP [XP 02], with its write-the-test-first approach, brings a lot of great ideas into play:

·       It is a fantastic way of getting developers to take responsibility for writing good-quality code. The tests are written before the code itself, so all the issues are fresh in the developer's mind.

·       JUnit [JUN 02], Mock Objects [MOC 01], and the like in the Java world have brought whole testing components into view, and therefore quality into consideration. This has begun to force quality into the code directly at source. Excuse the obvious pun.

·       Writing built-in test scripts also means that you build up a great library of regression tests that can be run regularly to prove each build, immediately after doing the build. (A minimal sketch of such a test follows this list.)

·       Continuous building, with a blame log of who broke the build, keeps people on their toes. The recency of each issue is also critical to sorting out problems quickly.
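To make the test-first idea concrete, here is a minimal JUnit-style sketch (JUnit 3 conventions, as in [JUN 02]). The Account class and its methods are hypothetical names invented purely for this illustration; in test-first style, this test would be written before Account itself exists:

    import junit.framework.TestCase;

    // A test-first sketch: written before the (hypothetical) Account
    // class it specifies. It fails at first, then passes once Account
    // is implemented to satisfy these asserts.
    public class AccountTest extends TestCase {

        public void testWithdrawReducesBalance() {
            Account account = new Account(100);   // hypothetical class
            account.withdraw(40);
            assertEquals(60, account.getBalance());
        }

        public void testOverdraftIsRejected() {
            Account account = new Account(100);
            try {
                account.withdraw(150);
                fail("expected the overdraft to be rejected");
            } catch (IllegalArgumentException expected) {
                // the withdrawal was correctly refused
            }
        }
    }

Tests like these accumulate into exactly the regression library mentioned above: a suite that can be run against every build, immediately after the build, to prove nothing has regressed.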

If you look back at the older waterfall methodology, you'll notice that all the requirements gathering was completed up front and almost all of the testing tended to be done right at the end. Often, by then, much of the time was spent trying to get the build to integrate properly, and only after some time had elapsed would the testers actually start testing any functionality. Testers often ran out of time and budget long before the testing was completed or full coverage was obtained. Pressure from the business and from project managers to get the software released often forced them to release areas they had not tested or were not comfortable releasing.

With the newer ideas and concepts that have evolved in the last few years, the requirements role has shifted to the right: from everything being done up front, to the workload being spread from the beginning of the project right towards its end.

The same principle is true of the testers, who have shifted to the left: from the far end of the development lifecycle to being involved from project inception, throughout the software development lifecycle, until the transition into the live running environment.

So Where Are the Links?
How the requirements processes are configured will depend upon the size of your project. Typically, large projects need comprehensive requirements documents, not only to accommodate the quantities of information but also to help manage change and prioritization, so this paper speaks to that larger development project situation.

In the last four years, some things have evolved that have taken testing into a new dimension:

·       The use case model and other requirements artifacts give everyone an evolving, documented idea of what functionality the system is expected to deliver.

·       The use cases are realized and component contracts established; thus each use case traces to one or many use case realizations.

·       So direct code unit tests and test-framework objects can be traced back to the use case realizations. This is the first route of traceability, the unit-testing route, followed mainly by the designers and developers. (A sketch of one such traceability convention follows this list.)

·       Use cases also trace to test cases. Here, tests are derived with no knowledge of the internal code, just the requirements of what should be happening.

·       Testing a build from automated scripts, along with the other test types such as performance, reliability, and supportability, will all be traced back to the use cases and the supporting non-functional documentation. This is the second route, which the testing discipline takes to prove to the customer that the developers have delivered what was requested.
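To illustrate the first route, here is a sketch of one simple traceability convention. The use case identifiers, flow numbers, and the Account class are all hypothetical examples invented for this sketch, not a prescribed standard:

    import junit.framework.TestCase;

    // Traceability convention (hypothetical): the comments record the
    // chain from use case UC-12 "Withdraw Cash" to its realization
    // UCR-12 to this test class, so a change to UC-12 can be followed
    // down to the unit tests that verify it.
    public class WithdrawCashRealizationTest extends TestCase {

        // Covers UC-12, basic flow step 3: the balance is reduced.
        public void testBasicFlowReducesBalance() {
            Account account = new Account(100);   // hypothetical class
            account.withdraw(40);
            assertEquals(60, account.getBalance());
        }

        // Covers UC-12, alternate flow 4a: insufficient funds refused.
        public void testAlternateFlowRejectsOverdraft() {
            Account account = new Account(30);
            try {
                account.withdraw(40);
                fail("alternate flow 4a: the overdraft must be refused");
            } catch (IllegalArgumentException expected) {
                // correctly rejected
            }
        }
    }

The convention is cheap, but it gives both directions of the trace: from a failing test up to the requirement it protects, and from a changed requirement down to the tests that must be revisited.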

So at least two routes of traceability in testing have been shown, both of which have their roots in requirements artifacts. The second route derives from the more traditional testing discipline and is typically carried out by people who were not involved in development. It involves testing far more than just functionality, and it has been enhanced and modernized to embrace a more iterative style of regular testing from very early in the project.

The first, unit-testing route also derives from tradition, the developer self-test, but it is now being formally enhanced and made more rigorous using testing frameworks, assert statements, mock objects, and the like.
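To show what that looks like in practice, here is a minimal hand-rolled mock object sketch in the same JUnit style. The MailSender and OrderProcessor types are hypothetical, invented for this illustration rather than taken from [MOC 01]:

    import junit.framework.TestCase;

    public class OrderProcessorTest extends TestCase {

        // The collaborator the real code would depend on.
        interface MailSender {
            void send(String recipient, String message);
        }

        // The mock records the interaction instead of sending real mail.
        static class MockMailSender implements MailSender {
            String lastRecipient;
            int sendCount = 0;

            public void send(String recipient, String message) {
                sendCount++;
                lastRecipient = recipient;
            }
        }

        // The unit under test, simplified for illustration.
        static class OrderProcessor {
            private final MailSender mailer;

            OrderProcessor(MailSender mailer) {
                this.mailer = mailer;
            }

            void confirm(String customerEmail) {
                mailer.send(customerEmail, "Your order is confirmed");
            }
        }

        public void testConfirmationMailIsSentOnce() {
            MockMailSender mock = new MockMailSender();
            new OrderProcessor(mock).confirm("customer@example.com");
            assertEquals(1, mock.sendCount);
            assertEquals("customer@example.com", mock.lastRecipient);
        }
    }

The point of the mock is isolation: the unit under test is exercised without any real mail being sent, and the assert statements check both the outcome and the interaction itself.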

Rethinking the V-model of Testing
In an excellent article, Brian Marick [MAR 00] points out that the old V model of testing, while still valid in some respects, is too simplistic and does not really embrace an iterative development style. He suggests that code is developed in a series of handoffs, and that behavior changes at each handoff; therefore, tests must test that behavior as it changes.

He also says that one cannot realistically develop tests from a single document that never changes. Essentially, the V model got in the way of proper testing, so his testing model had to be enhanced to cater for continual change.

Conclusion
We have come a long way in testing in recent years. Testing appears to be the area undergoing the most radical enhancement at the moment. This is a good thing, because elusive bugs seem to creep in everywhere, no matter how hard we try to refine our software development processes for quality.

References

1) [MAR 00] Marick, Brian. New Models for Test Development. http://www.exampler.com/testing-com/writings/new-models.pdf

2) [RUP 02] Rational Corp. Rational Unified Process. http://www.rational.com/products/rup/index.jsp

3) [XP 02] Extreme Programming website. http://www.extremeprogramming.org/

4) [JUN 02] The JUnit website. http://junit.sourceforge.net/

5) [MOC 01] The Mock Objects website. http://www.mockobjects.com/
