In the last several months I have had the opportunity to spend time with the managers of several large testing teams. Each of these managers leads a test team tasked with testing dozens of applications, a portion of the hundreds of applications that must be tested annually by their larger organization. In this entry I will describe some of what I found.
Tasks and Status
Let's talk about test status reporting and the handing out of daily testing tasks, which is what I call the micromanagement of test teams.
Here is a composite view of what these test managers described as their normal process for this part of their job:
- Based on information from the build team, they spend up to 30 minutes per day assigning tests to individual team members to be run that day.
- Based on individual members' test status spreadsheets, they spend up to 1 hour per day building a summary spreadsheet reporting the team's testing progress.
- Each individual on their team spends up to 1 hour per day recording their testing status in a spreadsheet that is then emailed to the team lead as their daily test results.
- Each week, in preparation for the Director's operations meeting, the test manager spends up to 4 hours rolling up a weekly team status from their team members' individual status spreadsheets.
The typical test team (across those I talked to) had around 10 members and spent over 20 hours per week summarizing test status and rolling up known test results. At a loaded salary of $50,000 per year, those 20 hours per week amount to half a full-time tester, so that one team is spending $25,000 per year counting beans instead of testing. Imagine a large testing shop with 20 or even 50 test teams: they might be spending up to $1 million a year just counting how much they got done!
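The back-of-envelope math above is easy to check. This quick calculation uses the round figures quoted in this entry (they are illustrative estimates, not measured data):

```python
# Back-of-envelope cost of manual status reporting, using the
# round figures quoted above (illustrative, not measured data).

HOURS_PER_WEEK = 40          # one full-time tester's week
LOADED_SALARY = 50_000       # loaded annual salary per tester ($)

status_hours_per_week = 20   # team time spent on status roll-ups

# Fraction of a full-time person consumed by status reporting.
fte_on_status = status_hours_per_week / HOURS_PER_WEEK  # 0.5 FTE

cost_per_team = fte_on_status * LOADED_SALARY
print(f"Per team: ${cost_per_team:,.0f}/year")   # Per team: $25,000/year

# Scale up to a large shop.
for teams in (20, 50):
    print(f"{teams} teams: ${teams * cost_per_team:,.0f}/year")
```

At 50 teams the roll-up overhead alone crosses the $1 million mark, which is the point of the exercise.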
Manual Testing of New or Changed Functionality
The central function of these test groups was to test the new or changed functions being added to these existing applications. In 100% of these large testing shops, little to no test automation was being used for this. In most cases an Excel- or Word-based standardized template was used to capture the steps a tester must perform to execute the test scenario. How the input data and expected results were captured as part of the test scenario varied from shop to shop, but there was a recognized need to represent both in the test procedure. In each of these test groups, test results were manually entered into a results spreadsheet kept by the tester. Defects found in the application were filed in the organization's defect tracking tool, relying entirely on the tester to enter enough information for the defect to be reproduced and analyzed by the development team.

In general there was no linkage between related test artifacts. The test scenario template was separate from the test results spreadsheet, which was separate from the defect record documenting the defect found by the test. The tester's end-of-day status spreadsheet was the only way of connecting Test Scenario X with Defect Y.
Where the testing practice was disciplined enough, the test scenario name was included in the defect description. The location of the manual test document (a.k.a. the test scenario) could then be derived from the data recorded in the defect record, so the steps to reproduce the problem did not have to be manually cut and pasted from the manual test document into the defect record. Furthermore, the manual test documents were accessible to and used by the developer in reproducing the problem, fixing it, and verifying the fix.
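To make the linkage problem concrete, here is a minimal sketch of what connected test artifacts might look like. The record types below are hypothetical, chosen for illustration only; they are not the data model of any particular tool:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical record types illustrating linked test artifacts;
# not the data model of any particular tool.

@dataclass
class Defect:
    defect_id: str
    summary: str
    # Link back to the scenario that found it, so reproduction
    # steps need not be copied into the defect record.
    scenario_name: Optional[str] = None

@dataclass
class TestResult:
    scenario_name: str
    verdict: str                        # e.g. "pass" or "fail"
    defects: List[Defect] = field(default_factory=list)

@dataclass
class TestScenario:
    name: str
    requirement_id: str                 # link to the driving requirement
    steps: List[str] = field(default_factory=list)
    results: List[TestResult] = field(default_factory=list)

# With links in place, "which defect came from Scenario X?" becomes a
# simple traversal instead of a hunt through status spreadsheets.
scenario = TestScenario("Scenario X", "REQ-42", ["step 1", "step 2"])
defect = Defect("DEF-7", "Save fails on submit", scenario_name=scenario.name)
scenario.results.append(TestResult(scenario.name, "fail", [defect]))

assert scenario.results[0].defects[0].scenario_name == "Scenario X"
```

The point is not the code itself but the links: once scenario, result, and defect reference one another, the end-of-day spreadsheet stops being the only record of those relationships.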
Test Case Design and Manual Test Authoring
In most cases, a business analyst had written a functional requirement, use case, or business process flow document that clearly described the business user flow implemented by the developer of the feature. This document was used as the basis for the formal test case and/or manual test (scenario). The level of detail and the inclusion of sample test input data varied widely, even within the same set of manual tests. Numerous manual tests contained only a statement asking the tester to observe whether the result of the operation "worked". In one shop, a screenshot was expected to be captured as part of the test results document; in their industry this was needed for auditing purposes, as proof that the test was actually performed and passed. While there was a notion that at least one test case should exist for each new or changed business requirement (or use case), it was not clear how anyone tracked how many of the requirements had designed test cases, or how many of those tests had been run at any given time.
My customer visits with these highly dedicated and motivated test managers helped me understand the project pressures and pain points that exist in their jobs. While I was somewhat disappointed by the technologies in use, I was struck by how disciplined each of these individuals was and how much effort it took for them to regularize and roll up the testing status information from their team's efforts.
In each case, I was visiting to help them evaluate and understand how Rational Quality Manager could bring real business value to their team's work. Indeed, by providing a common framework for storing test plans, test cases, manual tests, and test results, Rational Quality Manager could save them a large percentage of their daily and weekly tasks. By linking the requirement to the test case, the test case to the test result, and the test result to the defect, it can eliminate the need to duplicate or even copy data from one place to another. With a common repository for all test artifacts, real-time project status can be gleaned just by logging in to the RQM desktop, and reported out by generating a PDF to attach to a status email. Maybe one day the directors of testing shops will even log in themselves, get their own status, and not require an operations review of projects that are on target. The brainpower saved can be focused on helping rescue the projects in trouble! But then some would say I am a dreamer!