CrossTalk - The Journal of Defense Software Engineering
Jun 2004 Issue

Introducing TPAM: Test Process Assessment Model
Yuri Chernak, Ph.D., Valley Forge Consulting, Inc.

This article presents a Test Process Assessment Model, TPAM™, that can be used in conjunction with the Capability Maturity Model® (CMM®) Level 2 and Level 3. TPAM is fully consistent with the CMM structure. It presents the test process using three key process areas and defines their process goals and practices.

The Software Engineering Institute's (SEI℠) Capability Maturity Model® (CMM®) [1] has a long and successful history of being used by software organizations for assessing and improving their software process. Another strong trend - offshore software development - has also contributed to the increased use of the CMM. American businesses use this framework as a standard approach to assess and select their offshore partners. Likewise, offshore software development companies, especially in India, use the CMM certification as a marketing tool to promote their services and compete for contracts.

One of the known limitations of the CMM is that it does not sufficiently address the software test process. The few testing-related practices defined by the key process area (KPA) Software Product Engineering at CMM Level 3 do not provide sufficient visibility into the test process capability, nor can they be used as a framework for test process improvement. To fill this void, a number of testing maturity models have emerged since the mid-1990s. Some of them were designed to be used in conjunction with the CMM [2, 3, 4]. However, none of these models has gained wide acceptance so far, which motivates us to continue research in this area.

Even though the SEI's CMM Integration℠ (CMMI®) [5] covers the test process much better than its predecessor, a transition from the CMM to the CMMI is not going to happen overnight. Thus, we can expect that U.S. companies, performing either self-assessments or capability evaluations for selecting their subcontractors, will continue using the original CMM for some time. To help these companies assess and improve their test process, this article introduces a Test Process Assessment Model (TPAM™, pronounced tee-pam), developed as a CMM-companion model to complement the CMM framework at Level 2 and Level 3. TPAM has been primarily influenced by the Systematic Test and Evaluation Process (STEP) methodology [6]. It has evolved over the years and reflects the author's experience with large-scale projects delivering critical systems used on Wall Street.

Overview of the Existing Test Process Models

As I mentioned earlier, a few models have been developed and published to assist organizations in performing test process assessments. An interesting analysis and comparison of these models can be found in [7]. The following three models have the most detailed descriptions and will be in the scope of my analysis:

  • Test Process Improvement (TPI).
  • Testing Maturity Model (TMM).
  • Software Evaluation and Test Key Process Area (KPA).
The first model, TPI [8], is well documented and presents a set of key areas and their practices that are organized by test process maturity levels. However, TPI was not intended to be a CMM-companion model. Its structure differs from the CMM's, which makes it difficult to apply in organizations that have already adopted the CMM as a standard framework for process improvement.

The second model, TMM [4], is a comprehensive model that is also well documented in a recently published book. It has been developed as a staged maturity model that can be used in conjunction with the CMM and CMMI. However, it appears that the TMM is better suited for use with the CMMI. The CMM defines key process areas, and each process area has defined process goals. The CMM goals are achieved by software practices structured by process features: Commitment to Perform, Ability to Perform, Activities Performed, Measurement and Analysis, and Verifying Implementation [1].

In contrast, the TMM goals are achieved by activities, tasks, and responsibilities that, in turn, are structured by three critical views: manager, developer, and user/client. Furthermore, the TMM does not decompose the test process into key process areas. Hence, in some cases the difference in structure can make it difficult to use the TMM in conjunction with the CMM. One example is when we need to assess and analyze the entire software process by process features and across all KPAs in the scope of assessment. Another example is when a new project team needs to build the software process, including the test process, from scratch and should focus on selected test process areas at different project phases.

The last model, Software Evaluation and Test KPA [3], was defined and proposed by Richard Bender as an additional CMM process area pertinent to Level 2. This model is completely compatible with the CMM structure and has sufficient details for its implementation. It is a useful tool that can help organizations assess their existing process and establish basic testing practices. However, this model has two limitations. First, it lacks practices related to testware design. Testware means products of the test process developed and used to test a software system, for example, test plans, test designs, test case specifications, test data files, automated testing scripts, etc. Second, it cannot be (and was not intended to be) used above CMM Level 2 of software process maturity.

Defining TPAM

This article presents a new test process model, TPAM Version 1.0. This model is fully CMM-compatible, can support the needs of large-scale critical projects, and can assist organizations in assessing their testing practices and evolving their software process maturity from CMM Level 2 to Level 3. Because TPAM has the same structure and process features as defined in the SEI CMM, it allows consistent analysis and presentation of assessment data across all process areas, including software testing. TPAM defines test process goals and practices, and it groups them by the following three key process areas that reflect a common definition of the test process (an illustrative sketch of this structure follows the list):

  • Test Strategy Planning.
  • Testware Design and Maintenance.
  • Test Execution and Reporting.
Each of these key process areas is described in detail below.
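Because TPAM mirrors the CMM's structure, every key process area can be captured in one uniform data shape: a named area with goals and with practices grouped by the five common process features. The following minimal Python sketch shows one way an assessor might encode this shape for record keeping; it is only an illustration, not part of the model itself, and all identifiers are hypothetical.

from dataclasses import dataclass, field

@dataclass
class KeyProcessArea:
    """One TPAM key process area, structured by the CMM's common process features."""
    name: str
    goals: list[str] = field(default_factory=list)
    commitments: list[str] = field(default_factory=list)   # Commitment to Perform
    abilities: list[str] = field(default_factory=list)     # Ability to Perform
    activities: list[str] = field(default_factory=list)    # Activities Performed
    measurements: list[str] = field(default_factory=list)  # Measurement and Analysis
    verifications: list[str] = field(default_factory=list) # Verifying Implementation

# Example: the first TPAM key process area, abbreviated to its three goals.
test_strategy_planning = KeyProcessArea(
    name="Test Strategy Planning",
    goals=[
        "The project's overall test strategy is planned and documented.",
        "Stakeholders agree on the criteria for accepting the software for production.",
        "Detailed test plans and testware are kept consistent with the test strategy.",
    ],
)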

KPA - Test Strategy Planning

KPA Overview

The main purpose of the Test Strategy Planning KPA is to establish a common understanding among the stakeholders of the product's evaluation mission and its acceptance criteria.

The Test Strategy Planning KPA involves the following:

  • Identifying the product's quality risks.
  • Developing the detailed test plans for the test levels necessary to evaluate the quality risks.
  • Establishing commitments necessary to perform the testing tasks.

Test strategy planning begins early in the project by defining the software evaluation mission, acceptance criteria, and testing life cycle. The test strategy planning process includes steps to estimate the resources needed, to identify and schedule testing activities, and to negotiate commitments. Iterating through these steps may be necessary to establish detailed plans for each test level identified in the test strategy. Detailed test plans specify the scope, objectives, and approach to testing for a given test level, and identify and schedule testing tasks and deliverables.

The defined test strategy provides the basis for 1) establishing commitments of all parties involved in software testing, 2) developing detailed test plans and the project's testware that are necessary to implement the test strategy, and 3) performing test execution and reporting activities.

KPA Goals

  • Goal 1: The project's overall strategy for software testing is planned and documented.
  • Goal 2: The project's stakeholders agree on the criteria for accepting the software for production.
  • Goal 3: Detailed test plans and testware products are kept consistent with the project's test strategy.

Commitments to Perform

  • Commitment 1: A test leader is designated to be responsible for planning the project's test strategy.
  • Commitment 2: The project follows a written organizational policy for planning a test strategy.

Ability to Perform

  • Ability 1: Responsibilities for planning the project's test strategy are assigned.
  • Ability 2: Adequate resources and funding are provided for planning the project's test strategy.
  • Ability 3: Members of the test group are trained in planning the test strategy.

Activities Performed

  • Activity 1: Test strategy planning is initiated in the early stages and in parallel with the overall project planning.
  • Activity 2: The software evaluation mission and acceptance criteria are defined and agreed upon by all relevant stakeholders.
  • Activity 3: A testing life cycle with predefined stages and entry-exit criteria is defined.
  • Activity 4: The software quality risks are identified and test levels necessary to mitigate these risks are documented.
  • Activity 5: Plans for using tools, for example, test management and automated testing tools, are prepared.
  • Activity 6: The project's overall test strategy is documented and approved according to the organizational policy and standards.
  • Activity 7: The test group uses the approved test strategy as its basis for developing detailed test plans and testware products.
  • Activity 8: Changes to the project's test strategy are reviewed and incorporated into the detailed test plans.
  • Activity 9: Estimates for the testing effort, cost, and schedule are derived according to a documented procedure.
  • Activity 10: Testing commitments made to individuals and groups external to the organization are reviewed with senior management according to a documented procedure.

Measurement and Analysis

  • Measurement 1: Measurements are made and used to determine the status of the test strategy planning activities.

Verifying Implementation

  • Verification 1: The activities for test strategy planning are reviewed with senior management on a periodic basis.
  • Verification 2: The activities for test strategy planning are reviewed with the project manager on both a periodic and event-driven basis.
  • Verification 3: The SQA group reviews and/or audits the activities and work products for test strategy planning and reports the results.

KPA - Testware Design and Maintenance

KPA Overview

The purpose of Testware Design and Maintenance is to develop and maintain the project's testware products that are necessary for adequate software testing. Testware design and maintenance involves the following:

  • Identifying testware products.
  • Planning activities and estimating resources to perform the work.
  • Designing and maintaining the testware products.

Testware design activities are planned at the beginning of a software project. Testware products, required for adequate software testing, are identified and designed according to the defined requirements and guidelines. The test group reviews the testware products before using them in test execution. Test documentation is maintained and kept consistent with the software requirements. Iterating through these steps may be necessary at various test levels. For some test levels, for example, systems integration or user acceptance testing, a group external to the project's test group may perform design of test documentation. In such cases, the project's test group must ensure that the testware products developed by the external parties are documented and controlled.

KPA Goals

  • Goal 1: Testware products, necessary to implement the project's test strategy, are identified.
  • Goal 2: Requirements and guidelines for designing, verifying, and maintaining the project's testware products are defined and followed.
  • Goal 3: Testware design, verification, and maintenance activities are planned and consistently performed.

Commitments to Perform

  • Commitment 1: The project follows defined requirements and guidelines for designing, verifying, and maintaining the testware products.

Ability to Perform

  • Ability 1: A group that is responsible for designing and maintaining the project's testware products exists.
  • Ability 2: Responsibilities for designing, verifying, and maintaining the project's testware products are assigned.
  • Ability 3: Adequate resources and funding are provided for designing and maintaining the project's testware products and for acquiring necessary tools.
  • Ability 4: Members of the test group are trained to perform their testware design and maintenance activities.

Activities Performed

  • Activity 1: Testware design and maintenance activities are planned in the early stages of, and in parallel with, overall project planning.
  • Activity 2: Testware products required for adequate software testing are identified.
  • Activity 3: Requirements and guidelines for designing, verifying, and maintaining testware products are defined based on the project's context and needs.
  • Activity 4: The project's testware products are developed, documented, verified, and maintained according to defined requirements and guidelines.

Measurement and Analysis

  • Measurement 1: Measurements are made and used to determine the status of the testware design and maintenance activities.

Verifying Implementation

  • Verification 1: The activities for testware design and maintenance are reviewed with senior management on a periodic basis.
  • Verification 2: The activities for testware design and maintenance are reviewed with the project manager on both a periodic and event-driven basis.
  • Verification 3: The SQA group reviews and/or audits the testware work products and testware maintenance activities, and reports the results.

KPA - Test Execution and Reporting

KPA Overview

The purpose of Test Execution and Reporting is to provide adequate visibility into the software quality, its fitness for end-user needs, and the personnel's ability to support the software system when it is released into production. Based on testing results, management makes a decision about the software's readiness for deployment in production. Test Execution and Reporting involves the following:

  • Preparing and setting up test environments.
  • Performing testing at various levels.
  • Reporting and tracking the test execution progress.
  • Reporting and tracking the software defects found in testing and production.

Testers use documented test plans as the basis for performing their activities at all test levels. Management coordinates activities of all parties involved in testing, monitors their progress, tracks software defects, and communicates their resolution status to the affected groups. For each test level, an evaluation is performed to determine testing completeness and effectiveness.

KPA Goals

  • Goal 1: Preparation for software testing is planned.
  • Goal 2: Software testing is performed according to the project's test strategy and test plans.
  • Goal 3: Testing progress is documented and tracked against the project plan.
  • Goal 4: Testing results are evaluated to determine testing completeness and effectiveness.
  • Goal 5: Software defects found in testing and in production are reported and their resolution status is tracked and communicated to the affected parties.

Commitments to Perform

  • Commitment 1: The project follows a written organizational policy for performing software testing and defect reporting activities.

Ability to Perform

  • Ability 1: Responsibilities for performing software testing, defect reporting, and evaluating test results are assigned.
  • Ability 2: Adequate resources and funding are provided for performing software testing activities.
  • Ability 3: Test managers, testers, and other individuals involved in software testing are trained to perform their tasks.

Activities Performed

  • Activity 1: The test group plans and conducts preparation for test execution, for example, setting up test environments, defining build release and defect reporting procedures.
  • Activity 2: The test group performs software testing in accordance with the project's test strategy and test plans.
  • Activity 3: Actual testing progress is reported and tracked against the project plan and corrective actions are taken as necessary.
  • Activity 4: The test group conducts periodic internal reviews to track testing performance and issues.
  • Activity 5: The test group evaluates the results of each level of testing to determine testing completeness and effectiveness.
  • Activity 6: Software defects found in testing and in production are reported and their resolution status is tracked to closure according to the defined procedure.
  • Activity 7: Responsible parties (for example, a project manager, test leader, or Change Control Board) review software defects to analyze their impact and to determine their fixing priority.
  • Activity 8: For each software release, changes to the software defect status are communicated to the affected parties, for example, system testers, QA acceptance group, or end-users.

Measurement and Analysis

  • Measurement 1: Measurements are made and used to determine the status of the test execution and reporting activities.
  • Measurement 2: Measurements are made and used to determine the quality of the software under test.

Verifying Implementation

  • Verification 1: Test execution and reporting activities are reviewed with senior management on a periodic basis.
  • Verification 2: Test execution and reporting activities are reviewed with the project manager on a periodic and event-driven basis.
  • Verification 3: The SQA group reviews and/or audits the test execution and reporting activities and reports the results.

Applying TPAM on Software Projects

As the test process is a part of the entire software process, its evaluation should be performed in conjunction with other important process areas defined in the CMM. Hence, TPAM is intended to be used as a CMM-companion model for software process assessments or capability evaluations at Level 2 and Level 3. The inclusion of the TPAM process areas in the assessment scope should result in better visibility into software process maturity in general, and into the capability level of the test process areas in particular.

Depending on the assessment objective and target level of process maturity, the assessor selects key process areas from the CMM and TPAM that will be in the scope of the assessment. Next, the assessor selects TPAM process goals that are relevant to the target level of process maturity. Table 1 in Appendix A shows the correspondence between the TPAM goals and the CMM maturity levels. In the course of the assessment, each of the test process goals is evaluated based on a set of related TPAM practices. Appendix B shows the suggested relationship between the goals and the key practices for each of the TPAM process areas. Because the TPAM structure is based on the same process features as the CMM, gathering the assessment data and analyzing and presenting the assessment results will be consistent across all KPAs in the scope of such an assessment.
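To make this scoping step concrete, the goal-to-level correspondence of Appendix A and a slice of the goal-to-practice mapping of Appendix B can be encoded as plain lookup tables. The Python sketch below is hypothetical tooling an assessor might write for scope selection, not something that accompanies TPAM; the data values are taken directly from the appendices.

# Appendix A, encoded: the CMM maturity level each TPAM goal pertains to.
# Keys are (KPA name, goal number); values are the maturity level (2 or 3).
GOAL_LEVELS = {
    ("Test Strategy Planning", 1): 2, ("Test Strategy Planning", 2): 2,
    ("Test Strategy Planning", 3): 3,
    ("Testware Design and Maintenance", 1): 2,
    ("Testware Design and Maintenance", 2): 3,
    ("Testware Design and Maintenance", 3): 2,
    ("Test Execution and Reporting", 1): 2,
    ("Test Execution and Reporting", 2): 3,
    ("Test Execution and Reporting", 3): 2,
    ("Test Execution and Reporting", 4): 3,
    ("Test Execution and Reporting", 5): 2,
}

# A slice of Appendix B: the key practices to examine when evaluating
# Test Strategy Planning Goal 1, grouped by common process feature.
PRACTICES_FOR_TSP_GOAL_1 = {
    "Commitment": [1, 2],
    "Ability": [1, 2, 3],
    "Activity": [1, 2, 3, 4, 5, 6, 9, 10],
    "Measurement": [1],
    "Verification": [1, 2, 3],
}

def goals_in_scope(target_level: int) -> list[tuple[str, int]]:
    """Select the TPAM goals to assess for a given target maturity level.

    Goals pertinent to Level 2 stay in scope for a Level 3 assessment,
    so the filter keeps every goal at or below the target level.
    """
    return [goal for goal, level in GOAL_LEVELS.items() if level <= target_level]

# A Level 2 assessment covers the seven Level 2 goals; a Level 3
# assessment covers all eleven.
print(len(goals_in_scope(2)), len(goals_in_scope(3)))  # -> 7 11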

TPAM defines the test process using three key process areas, which provides flexibility in applying the model on software projects. For example, in the case of large-scale renovation projects, organizations frequently hire a new development team, experienced in the latest technologies, whose mission is to develop a new system that will replace the legacy system. One of the common challenges such a team faces is building a software process from scratch. In this project context, not all process areas are required to be at the same level of capability at the beginning of the project. The project team can start building the software process by focusing on selected CMM process areas, such as Requirements Management and Software Project Planning, along with the TPAM key process area - Test Strategy Planning. As the project progresses, the project team can focus on other CMM and TPAM process areas to further evolve the software process to support their project needs.

Thus, the key benefits of TPAM include the following:

  • Better visibility into software process maturity on CMM-based assessments or capability evaluations at Level 2 and Level 3.
  • Consistency of assessment data gathering and analysis across all CMM and TPAM key process areas in the scope of assessment.
  • Assistance in establishing basic software testing practices at CMM Level 2 and evolving the test process, as part of the entire software process, to CMM Level 3.
  • Flexibility that allows the project team to build the process by focusing on selected test process areas at different phases of the software project life cycle.

Acknowledgements

I am grateful to Kenneth Hebert, senior manager at JP Morgan Chase, for the opportunity to apply the first version of TPAM on the WINS project. I thank Robin Goldsmith for reviewing this article and giving suggestions for its improvement. Finally, I am grateful to the CrossTalk reviewers for their feedback and comments.

References

  1. Paulk, Mark C., et al. The Capability Maturity Model: Guidelines for Improving the Software Process. Addison-Wesley, 1995.
  2. Gelperin, D., and A. Hayashi. "How to Support Better Software Testing." Application Development Trends May 1996: 42-48.
  3. Bender, R. "Software Evaluation and Test KPA." Bender & Associates, Inc. www.sztest.net/download/SEI%20cmm.pdf.
  4. Burnstein, I. Practical Software Testing. Springer-Verlag, 2003.
  5. Chrissis, M. B., M. Konrad, and S. Shrum. CMMI: Guidelines for Process Integration and Product Improvement. Addison-Wesley, 2003.
  6. Hetzel, B. The Complete Guide to Software Testing. John Wiley and Sons, 1988.
  7. Weatherill, T. "In the Testing Maturity Model Maze." Journal of Software Testing Professionals Mar. 2001: 8-13.
  8. Koomen, T., and M. Pol. Test Process Improvement. Addison-Wesley, 1999.

Appendix A. TPAM Goals and CMM Maturity Levels

Table 1 below shows the correspondence between the TPAM goals and the CMM maturity levels. The first group of rows lists the process goals pertinent to CMM Level 2. If the assessment is planned for CMM Level 3, all TPAM goals should be included in the assessment scope.

TPAM KPA                         Goal  Maturity Level
Test Strategy Planning           G1    ML2
Test Strategy Planning           G2    ML2
Testware Design and Maintenance  G1    ML2
Testware Design and Maintenance  G3    ML2
Test Execution and Reporting     G1    ML2
Test Execution and Reporting     G3    ML2
Test Execution and Reporting     G5    ML2
Test Strategy Planning           G3    ML3
Testware Design and Maintenance  G2    ML3
Test Execution and Reporting     G2    ML3
Test Execution and Reporting     G4    ML3
Table 1. TPAM Goals and CMM Levels

KPA - Key Process Area
ML - CMM Maturity Level
G1,...GN - Test Process Goals

Appendix B. Mapping the Key Practices to Goals

A software process assessment based on the SEI CMM framework requires that the goals of each key process area be evaluated by analyzing the related key practices. The tables below provide information about the TPAM key process areas defined in this article. They present a recommended mapping between the TPAM goals and their related key practices grouped by the common process features.

Goal  Commitment  Ability  Activity                 Measurement  Verification
1     1, 2        1, 2, 3  1, 2, 3, 4, 5, 6, 9, 10  1            1, 2, 3
2     2           1, 2     2, 10                    1            1, 2, 3
3     1           1, 3     7, 8                     1            1, 2, 3
KPA - Test Strategy Planning

Goal  Commitment  Ability     Activity  Measurement  Verification
1     1           1, 2, 3, 4  1, 2      1            2, 3
2     1           1, 2, 3, 4  1, 3, 4   1            2, 3
3     1           1, 2, 3, 4  1, 3, 4   1            1, 2, 3
KPA - Testware Design and Maintenance

Goal  Commitment  Ability  Activity  Measurement  Verification
1     1           1, 3     1         1            1, 2, 3
2     1           1, 2, 3  2         1            1, 2, 3
3     1           1, 3     3, 4      1            1, 2, 3
4     1           1, 2, 3  4, 5, 6   1, 2         1, 2, 3
5     1           1, 2, 3  6, 7, 8   1            1, 2, 3
KPA - Test Execution and Reporting


About the Author
Yuri Chernak, Ph.D.

Yuri Chernak is the president and principal consultant of Valley Forge Consulting, Inc. As a consultant, Chernak has worked for a number of major financial firms in New York, helping senior management improve their software test process. Chernak is a member of the Institute of Electrical and Electronics Engineers (IEEE) Computer Society. He has been a speaker at several international conferences and has published papers on software testing in IEEE journals. Chernak has a doctorate in computer science.

E-mail: ychernak@yahoo.com



TPAM™ is a registered trademark of Valley Forge Consulting, Inc.
