Your Choice: Pay Now or Later!
by Pat Craig 

From the Fall 1998 issue of the Complexity Management Chronicles 

The deadline looms in the not too distant future, and the development process groans on. A quick, clean test is the last hope for meeting the deadline. Testing begins, and it becomes either a mad, chaotic scramble or a slow, gut-wrenching process. An organized test could have saved the day. Instead, poor test preparation becomes one more reason for the project's lateness. One of our clients has shown us a better way.

This client, who runs a mutual fund back office, did a great job preparing to test, and we would like to report their story as a best practice. Their user acceptance testing team needed to test a multimillion-dollar software project: building one consolidated order entry system for the entire company. We worked with them on metrics and management reporting.

Taking the Initial Steps:
This team of 24 began by building a high-level test plan. This plan outlined what would happen during the testing period. Facing an enormous test effort, management divided the testing project into six phases.

Next, management hired three people as toolsmiths. The toolsmiths enabled the 24-person test group to run relatively independently of the formal software development staff. Using Microsoft Access and Visual Basic, the toolsmiths created a number of databases, complete with input forms and reports, and copied production data. They imported into Access everything the group needed.
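In Access terms, copying production data can be as simple as a make-table query. The following sketch is purely illustrative; the article does not give the client's actual tables, so every name here is an assumption:

    -- Hypothetical make-table query: pull the production rows
    -- the test group needs into a local Access test table.
    SELECT AccountID, AccountType, CustomerType, CashBalance
    INTO TestAccounts
    FROM ProductionAccounts;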

Building Test Cases:
With the broad framework documented in the test plan, the testers wrote highly specific test cases. (A test case is a specific business scenario.) Examples of test cases included buying and selling securities. Using an Access database system, the group created 3,900 test cases. The staff classified each test case along several dimensions: specific business owner (line of business), type of transaction, pass vs. fail test (including expected error message number and text), and timing dependencies. This classification system aided management reporting.
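As a rough sketch, a test case table along those lines might look like this in SQL. (The schema is assumed for illustration; the article does not describe the client's actual database, and Access's own dialect differs in small details.)

    -- Hypothetical test case table: one column per
    -- classification dimension used in management reports.
    CREATE TABLE TestCases (
        TestCaseID       COUNTER,
        Description      TEXT(255),  -- the business scenario, e.g. "buy a bond"
        BusinessOwner    TEXT(50),   -- line of business
        TransactionType  TEXT(50),   -- e.g. buy, sell, transfer
        ExpectPass       YESNO,      -- pass vs. fail test
        ErrorMsgNumber   LONG,       -- expected error, for fail tests
        ErrorMsgText     TEXT(255),
        RunAfterCaseID   LONG,       -- timing dependency, if any
        CONSTRAINT pkTestCases PRIMARY KEY (TestCaseID)
    );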

Getting Test Data:
Once the group had finished specifying test cases, they determined what specific test data they needed for each case: for example, the type of account (401(k), IRA, unrestricted) and the type of customer (regular or premium service). Once testers identified specific production data, the toolsmiths copied it from production to test files.
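For instance, a tester needing an unrestricted account with premium service might hunt for candidates with a query like this hypothetical one against the copied production data:

    -- Hypothetical query: find production rows that satisfy
    -- one test case's data requirements.
    SELECT AccountID, AccountType, CustomerType
    FROM TestAccounts
    WHERE AccountType = 'Unrestricted'
      AND CustomerType = 'Premium';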

Next, the entire group spent two full weeks refining the test data. For example, the group put lots of cash into test customer accounts (funny money!) so that they could rerun bond purchase tests and still have cash in the test account.
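In database terms, the funny-money step might amount to nothing more than an update like this sketch (the balance figure is invented):

    -- Hypothetical "funny money" update: give every test account
    -- a large cash balance so purchase tests can be rerun.
    UPDATE TestAccounts
    SET CashBalance = 1000000;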

Testers used each piece of test data for only one test case. (If testers use the same data for more than one case, finding a bug gets overly complicated.) To track which test case owned each piece of test data, the toolsmiths built an Access "ownership" database.
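A sketch of such an ownership table appears below; the uniqueness constraint is what enforces the one-test-case-per-datum rule. (As before, the schema is an assumption, not the client's actual design.)

    -- Hypothetical ownership table: each test datum (here, an
    -- account) belongs to exactly one test case. The UNIQUE
    -- constraint keeps two cases from claiming the same account.
    CREATE TABLE TestDataOwnership (
        TestCaseID  LONG NOT NULL,   -- the owning test case
        AccountID   LONG NOT NULL,   -- the owned piece of test data
        CONSTRAINT OneOwnerPerDatum UNIQUE (AccountID)
    );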

Determining Expected Results:
Good test practice dictates that someone calculate the expected results before testing begins. But calculating the end result can be time-consuming because many factors can influence it, e.g., broker commissions and volume discounts. To solve this, the toolsmiths built a small program to compute expected results. To make the tests even easier to verify, they created another program to place the expected results onto a look-alike GUI.
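In spirit, the expected-results program reduces to arithmetic like the following hypothetical query, which prices a purchase with a volume discount on the commission (the rates, thresholds, and names are invented for illustration):

    -- Hypothetical expected-results calculation: principal plus
    -- a commission whose rate drops for large orders.
    SELECT TestCaseID,
           Quantity * Price AS Principal,
           Quantity * Price
               * IIF(Quantity >= 1000, 0.005, 0.01) AS Commission,
           Quantity * Price
               * (1 + IIF(Quantity >= 1000, 0.005, 0.01)) AS ExpectedTotal
    FROM PlannedTrades;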

It costs a great deal to correct bugs in production code. Whether working on large or small projects, thoughtful IT professionals insist that the team find the bugs in the cheaper test phase. Our client's project demonstrates successful test preparation: sufficient lead time, adequate resources, and good planning proved vital.

©Complexity Management 1998
Somerville, Massachusetts
Located in Metropolitan Boston 


Complexity Management Chronicles, a newsletter for software quality assurance professionals, is published in print form four times a year. Send your name and snail-mail address to the e-mail address below if you would like to be on the mailing list - at no cost to USA mailing addresses. 

*********************************************************************** 

Contact Pat Craig at patcraig@alum.mit.edu