ETV Quality Management Plan

Document Index

EPA Report No: EPA/600/R-03/021

December 2002



ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM
US Environmental Protection Agency
Environmental Technology Verification Program
QUALITY MANAGEMENT PLAN
December 2002

National Risk Management Research Laboratory
National Exposure Research Laboratory
National Homeland Security Research Center
Office of Research and Development
U.S. Environmental Protection Agency
Cincinnati, Ohio 45268

APPROVED BY

Hugh W. McKinnon, M.D., M.P.H., Director
National Risk Management Research Laboratory
Sally C. Gutierrez, Director
NRMRL Water Supply and Water Resources Division
Gary J. Foley, Ph.D., Director
National Exposure Research Laboratory
Frank T. Princiotta, Director
NRMRL Air Pollution Prevention and Control Division
E. Timothy Oppelt, Director
National Homeland Security Research Center
Linda S. Sheldon, Ph. D., Acting Director
NERL Human Exposure and Atmospheric Sciences Division
Teresa M. Harten, Director
Environmental Technology Verification Program
 

ACKNOWLEDGMENTS

The first draft of the ETV QMP was developed for the pilot period (1995–2000) by a team of writers consisting of the following quality assurance staff members from the U.S. Environmental Protection Agency Office of Research and Development's National Risk Management Research Laboratory and National Exposure Research Laboratory: Sam Hayes, Lora Johnson, Ann Vega, and Jeff Worthington. Subsequent revisions included significant input in the form of comments from members of the Environmental Technology Verification Team. Verification organizations also provided comments. The team of writers was joined by the following EPA staff in development of the final document: Nancy Adams, Penelope Hansen, Linda Porter, and Shirley Wasson. This revision, reflecting the conversion of the pilot projects to centers targeted to broader classes of technologies, was developed by Shirley Wasson, Teresa Harten, and Nancy Adams with input from the team.


TABLE OF CONTENTS


DOCUMENTS AND GENERAL TERMS

ABBREVIATIONS AND ACRONYMS

INTRODUCTION

PART A

MANAGEMENT SYSTEMS

1.0 MANAGEMENT AND ORGANIZATION

1.1 ETV quality policy

1.2 Organization structure

1.3 ETV customer identification and ETV customer needs/expectations/work objectives

1.4 Management negotiation with verification organizations on constraints

1.5 Resources

1.6 Authority to stop work for safety and quality considerations

2.0 QUALITY SYSTEM AND DESCRIPTION

2.1 Authorities and conformance to E4 quality standard

2.2 Quality system documents

2.3 Quality system scope

2.4 Quality expectation for products and services

2.5 Quality procedures documentation

2.6 Quality controls

2.7 Quality systems audits (QSAs)

3.0 PERSONNEL QUALIFICATION AND TRAINING

3.1 Personnel training and qualification procedures

3.2 Formal qualifications and certifications

3.3 Technical management and training

3.4 Retraining

3.5 Personnel job proficiency

4.0 ETV VERIFICATION ORGANIZATION SELECTION

4.1 Planning and control of selection process

4.2 Technical and quality requirements

4.3 Quality specification/conformance

4.4 Peer review of extramural agreements

4.5 Conformance of verification testing efforts

5.0 DOCUMENTS AND RECORDS

5.1 Scope

5.2 Preparation, review, approval, and distribution

5.3 Documents and records storage and obsolete documents and records

6.0 COMPUTER HARDWARE AND SOFTWARE

6.1 General procedures

6.2 Scope of ETV computer hardware/software procedures

6.3 Configuration testing

6.4 Measurement and testing equipment configurations

6.5 Change assessments - configurations, components, and requirements

6.6 ETV website roles and responsibilities

7.0 PLANNING

7.1 Systematic planning process

7.2 Planning document review

8.0 IMPLEMENTATION OF WORK PROCESSES

8.1 Implementation

8.2 Procedures

8.3 Oversight

9.0 ASSESSMENT AND RESPONSE

9.1 Numbers and types of assessments

9.2 Procedures

9.3 Personnel qualifications, responsibility, and authority

9.4 Response

10.0 QUALITY IMPROVEMENT

10.1 Annual review for quality improvement

10.2 Detecting and correcting quality system problems

10.3 Cause and effect relationship

10.4 Root cause

10.5 Quality improvement action

PART B
COLLECTION AND EVALUATION OF ENVIRONMENTAL DATA

1.0 PLANNING AND SCOPING

1.1 Systematic planning of the verification test

1.2 Systematic planning for verification testing

2.0 DESIGN OF TECHNOLOGY VERIFICATION TESTS

2.1 Design process

2.2 Generic verification protocols and test/QA plans: planning documents from the design process

3.0 IMPLEMENTATION OF PLANNED OPERATIONS

3.1 Implementation of planning

3.2 Services and items

3.3 Field and laboratory samples

3.4 Data and information management

4.0 ASSESSMENT AND RESPONSE

4.1 Assessment types

4.2 Assessment frequency

4.3 Response to assessment

5.0 ASSESSMENT AND VERIFICATION OF DATA USABILITY

5.1 Data verification and validation

5.2 Existing data

5.3 Reports reviewed

REFERENCES

APPENDIX A U.S. EPA Records Control Schedule

APPENDIX B Measures of Success

APPENDIX C Existing Data: Policy and Process

APPENDIX D Recommended Language for Solicitations


DOCUMENTS AND GENERAL TERMS



Annual progress report
The report developed annually to document implementation of the ETV program.

Audit of data quality
An examination of a set of data after it is collected and 100% verified by project personnel, consisting of tracing at least 10% of the data from original recording through transferring, calculating, summarizing, and reporting.
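
To make the proportions above concrete, here is a minimal sketch of the sampling step (hypothetical; neither E4 nor this QMP prescribes software for ADQs, and the record fields are assumptions made only for illustration):

```python
# Hypothetical illustration of an ADQ sampling step; the ETV program does not
# prescribe software for audits of data quality. The record fields
# ("raw_value", "reported_value") are assumed for this example only.
import math
import random

def select_adq_sample(records, fraction=0.10, seed=1):
    """Randomly select at least `fraction` of the records for tracing."""
    n = max(1, math.ceil(len(records) * fraction))
    return random.Random(seed).sample(records, n)

def traces_cleanly(record):
    """Trace one record from its original recording to its reported value."""
    return record["raw_value"] == record["reported_value"]

# Example: 200 records that project personnel have already 100% verified.
records = [{"id": i, "raw_value": 1.5 * i, "reported_value": 1.5 * i}
           for i in range(200)]
sample = select_adq_sample(records)  # 20 records, i.e., the 10% minimum
discrepancies = [r["id"] for r in sample if not traces_cleanly(r)]
print(f"Traced {len(sample)} of {len(records)} records; "
      f"discrepancies: {discrepancies or 'none'}")
```

In a real ADQ the trace would also cover the intermediate transfers, calculations, and summaries, not just the end-to-end comparison shown here.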

Directors of quality assurance
The quality assurance directors for the EPA ORD laboratories, the National Exposure Research Laboratory and the National Risk Management Research Laboratory.

DQO
Data quality objective. The DQO process provides a method for establishing DQOs for an individual technology verification test.

DQO process
A systematic planning process that clarifies test objectives and establishes a basis for the types, quality, and quantity of data required to support customers' decisions for use of a technology. Guidance may be found at www.epa.gov/quality/qa_docs.html.

E4
The ANSI/ASQC national consensus standard, which is the Agency standard and is applicable to extramural agreements. The standard is entitled ANSI/ASQC E4-1994, Specifications and Guidance for Quality Systems for Environmental Data Collection and Environmental Technology Programs.

EPA center manager
The EPA person designated by EPA line management to serve as the lead for an individual ETV center.

EPA center quality manager
The EPA quality assurance person designated by EPA line management to manage quality assurance efforts on behalf of the center manager.

EPA line management
The management structure to which each EPA center manager reports, i.e., branch chief, division director, laboratory director.

EPA review/audit reports
The "quality records" developed by EPA as a result of conducting assessments during ETV implementation.

ETV director
The EPA person designated by EPA ORD to lead the ETV team.

ETV extramural agreement
The contractual record developed by the EPA and signed by the verification organization.

ETV operation/implementation plan
A center plan for conducting the verification process (an optional document).

ETV team
EPA employees actively working on the ETV program; the ETV director, center managers, and the directors of quality assurance are core members.

ETV test objective
The stated objective(s) of each technology test. Verification organizations use the DQO process or systematic planning to establish test objectives and test measurement quality criteria.

ETV verification
A verification produced only by the USEPA ETV program.

ETV verification report
The report of the result of an individual technology test.

ETV verification statement
A summary statement, developed by the verification organization and approved by the EPA Laboratory Director, which reports individual technology performance.

ETV webmaster
The person designated by EPA line management with responsibility for establishing and maintaining the ETV website.

Evaluation contractor
The contractor selected to collect information on center performance.

Generic verification protocol
(Also known as guideline documents, generic test protocols, or simply protocols.) Protocols developed, modified, or selected to promote uniform testing by the verification organization for a single class of technologies. Adequate documentation of a robust protocol may allow the development of abbreviated individual test/QA plans that incorporate the generic verification protocol by reference. Protocols may retain draft status until verification testing is performed and then be finalized, building on the testing experience.

Laboratory director
The directors of EPA ORD laboratories, the National Exposure Research Laboratory and the National Risk Management Research Laboratory.

Office of Research and Development assistant administrator
The administrative lead person directing the EPA's Office of Research and Development.

Performance evaluation audit
A quantitative evaluation of a measurement system.

Quality management plan
The specific policies and procedures for managing quality related activities in the ETV program.

Quality system audit
The qualitative assessment of data collection operations and/or organization(s) to evaluate the adequacy of the prevailing quality management structure, policies, practices, and procedures for obtaining the type and quality of data needed.

Raw data
All data and information recorded in support of analytical and process measurements made during planning, testing, and assessing environmental technology, including support records such as computer printouts, instrument run charts, standards preparation records, field log records, technology operation logs, and monitoring records. The term also encompasses ETV test files (all records, including raw data) and the technical data and associated quality control data that support the data summarized and the conclusions made in each ETV verification report.

Records
All books, papers, maps, photographs, machine readable materials, or other documentary materials, regardless of physical form or characteristics, made or received by the EPA or a verification organization or their designated representative for the ETV program.

Stakeholder groups
Groups set up for each center consisting of representatives of any or all of the following verification customer groups: buyers and users of technology, developers and vendors, consulting engineers, the finance and export communities, government permitters, regulators, first responders, emergency response and disaster planners, and public interest groups.

Standard operating procedures
Procedures describing routine verification activities including sample collection, analytical testing, and associated verification processes.

Technical systems audit
A qualitative on-site evaluation of sampling and/or measurement systems.

Test measurement
Those critical measurements that must be made during the course of a technology test to evaluate achievement of the ETV test objective.

Test/QA plan
The plan developed by a verification organization for each individual test of a technology or technology class. Therefore, the test/QA plan may include more than one technology. The test/QA plan provides the experimental approach with clearly stated test objectives and associated quality objectives for the related measurements. The test/QA plan may incorporate or reference existing generic verification protocols or provide the basis for refining draft protocols.

Verification (ETV verification)
Establishing or proving the truth of the performance of a technology under specific, predetermined criteria or protocols and adequate data quality assurance procedures.

Verification organizations
The public and private sector organizations holding cooperative or interagency agreements or contracts to assist EPA in implementing the ETV program.

Verification organization manager
The person designated by the verification organization to manage the center and serve as the chief point of contact with the EPA.

Verification organization quality manager
The person designated by the verification organization to manage quality assurance for the center on behalf of the verification organization manager.

Verification organization quality management plan
The procedures for quality related activities developed and implemented by the verification organization to assure quality in the work processes and services developed for ETV. If the verification organization has a current quality system that accommodates ETV's needs, additional quality system elements do not need to be developed.

Verification organization review and audit reports
The "quality records" developed by the verification organization as a result of conducting assessments during ETV implementation.

Verification Strategy
The ETV Program Verification Strategy, published in February 1997, outlines the goals for the pilot period (1995-2000), the programmatic operating principles, pilot selection criteria, key definitions, budgets, and implementation activities that molded the ETV program, as well as the challenges that emerged and the decisions to be addressed in the future.


ABBREVIATIONS AND ACRONYMS


ACE any credible evidence
ADQ audit of data quality
ANSI American National Standards Institute
ASQ or ASQC American Society for Quality
CA cooperative agreements
CBD Commerce Business Daily
CMD Contracts Management Division
DEP data evaluation panel
DQO data quality objective
EPA Environmental Protection Agency
ETV environmental technology verification
FBO FedBizOpps
FRC Federal Records Center
FTE full time equivalent
GAD Grants Administration Division
GVP generic verification protocol
IAG interagency agreement
IQGs Information Quality Guidelines
ISO International Organization for Standardization
NERL National Exposure Research Laboratory
NRMP National Records Management Program
NRMRL National Risk Management Research Laboratory
OAQPS Office of Air Quality Planning and Standards
OMIS ORD Management Information System
ORD EPA's Office of Research and Development
OSHA Occupational Safety and Health Administration
PE performance evaluation
PERFORMS EPA's system of performance assessment
PO project officer
QA quality assurance
QAM quality assurance manager
QAPP quality assurance project plan
QC quality control
QMP quality management plan
QSA quality systems audit
SOP standard operating procedure
SOW statement of work
T/QA plan test/QA plan
TSA technical systems audit
USEPA United States Environmental Protection Agency
VO verification organization

INTRODUCTION



Background

The Environmental Technology Verification Program (ETV) was established by the Environmental Protection Agency (EPA) to evaluate the performance characteristics of innovative environmental technologies across all media and to report objective performance information to the permitters, buyers, and users of environmental technology. ETV evolved in response to the following mandates:

  • A 1995 Presidential directive to EPA in Bridge to a Sustainable Future, to "work with the private sector to establish a market-based verification process . . . which will be available nationally for all environmental technologies within three years."
  • Goals articulated in the Administration's Reinventing Government: A Performance Review, which directed EPA to begin a comprehensive environmental technology verification program no later than October 1995.
  • Congressional appropriations language contained in the FY96 and FY97 budgets, directing that the Agency fund technology verification activities at the $10 million level in each year.

To comply with these directives, EPA's Office of Research and Development (ORD) established a five-year pilot program to evaluate alternative operating parameters and determine the overall feasibility of a technology verification program. ETV began the five-year pilot period in October 1995. At the conclusion of the pilot period, the Agency prepared a Report to Congress containing an evaluation of the results of the pilot program and recommendations for its future operation.

Credible, high-quality performance information is one of the tenets of ETV. Therefore, the highest appropriate level of quality assurance is used throughout the program. The EPA's Office of Research and Development, under which ETV operates, has implemented an Agency-wide quality system to assure that activities conducted in EPA research laboratories and other facilities, or at facilities being operated on behalf of or in cooperation with the EPA, are supported by data of known and acceptable quality for their intended use. Each of the ORD laboratories involved in ETV, the National Risk Management Research Laboratory (NRMRL) and the National Exposure Research Laboratory (NERL), operates under a laboratory-specific quality management plan (QMP). The ETV QMP is consistent with the policies expressed in the individual laboratory QMPs and is intended to provide an overarching, uniform quality system for all aspects of the ETV program.


Program Description

Developers of innovative environmental technology report numerous impediments to commercialization. Among those most frequently mentioned is the lack of acceptance of vendor performance claims. The success of the pilot program shows that objective, independently acquired, high-quality performance data and operational information on new technologies significantly facilitate the use, permitting, financing, export, purchase, and general marketplace acceptance of such technologies. ETV provides these data and information to the customer groups that require them, accelerating the real-world implementation of improved technology. Improved technology more thoroughly, rapidly, and efficiently protects human health and the environment. It is important to stress that the product of ETV is high-quality data and information, not technology approval or endorsement. Although there is substantial EPA involvement in guiding and administering this program, ETV does not provide EPA endorsement or certification of commercial products.

At the conclusion of the pilot period the Agency internally reviewed the performance and operation of the program to assess its future direction and scope. The ETV Director recommended consolidation of the program into six technology centers:

  • Advanced Monitoring Systems Center (AMS)
  • Air Pollution Control Technology Center (APCT)
  • Greenhouse Gas Technology Center (GHG)
  • Drinking Water Systems Center (DWS)
  • Water Quality Protection Center (WQP)
  • Pollution Prevention, Recycling and Waste Treatment Center (P2,R,WT)

During 2000 and 2001 the first five centers above were established. The sixth center was not put in place due to a lack of adequate funding to support it. In addition, after the terrorist attacks of September 11, 2001, the role of ETV in verifying homeland security-related environmental technologies was brought to the fore. As vulnerabilities in the Nation's critical infrastructure were identified, two areas of particular concern, drinking water supply systems and the health and safety of the Nation's workforce in their places of employment, became a focus for EPA. ETV was called upon to support the Nation's homeland security efforts by adapting its testing and evaluation process to technologies for protecting and cleaning up drinking water systems and buildings. ORD recognized the tremendous testing and evaluation capability residing in the verification organizations that have operated, and continue to operate, each center; it was therefore determined to be in the best interest of the Nation to use the existing verification organizations, where possible and practical, to execute this mission. Three of the existing verification organizations, Battelle, Research Triangle Institute, and NSF International, were enlisted to support the homeland security-related technology testing and evaluation needs. A new center operated by Battelle, the Building Decontamination Technology Center (BDT), was established because the existing verification organization funding agreements did not include this type of technology verification within their scopes of work.

The homeland security efforts fall into two categories: water security and safe buildings. The water security efforts are being accomplished through the existing cooperative agreements with the AMS, DWS, and WQP centers. The safe buildings efforts are being accomplished through GSA contracts with two existing verification organizations, Battelle and Research Triangle Institute. Contracts through GSA were chosen in lieu of cooperative agreements because it was believed that, for this area, the Agency and the federal government would have to be directive in many elements of the effort: the stakeholders, the test methods, the selection of technologies to be tested, and possibly other aspects. This is in contrast to the water security effort, in which the ETV management process consistent with cooperative agreements is followed. None of the contract funding will be commingled with cooperative agreement funding by the technology verification organizations. The ETV project officers have been trained in the distinctions between managing cooperative agreements and contracts.


Operation of the Centers

The technology verification organizations are all not-for-profit entities that work with or for EPA through an extramural agreement (a cooperative agreement, an interagency agreement, or an existing General Services Administration (GSA) contract). The EPA and technology verification organization roles are identical to those established during the pilot period. Each agreement is overseen by an EPA project officer, who may also be the EPA center manager. EPA provides substantial oversight through an active quality assurance program. Each technology verification organization is contractually required to fully implement EPA QA requirements for planning, auditing, and documenting the testing and reporting activities. Qualified peer reviewers also review the technical aspects of the test plans and of the final reports.


Program and Quality Management Documents

Several documents define the overall operation of the Program. The first to be published (February 1997) was the Environmental Technology Verification Strategy. This document describes the goals, customer and key word definitions, basic operating principles, project selection criteria, and the programmatic and budgetary vision of the program. The Strategy is evaluated periodically to determine whether modification or amplification is needed. The second major program management document used by ETV to guide its operation is this document, the ETV Quality Management Plan (QMP).


The ETV Quality Management Plan

The ETV QMP uses the structure, policies, and standards established in the American National Standard Specification and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs (E4). That standard ". . . describes a basic set of mandatory specifications and non-mandatory guidelines by which a quality system for programs involving environmental data collection and environmental technology can be planned, implemented, and assessed." Based on the structure and standards of E4, the ETV QMP contains the definitions, procedures, processes, inter-organizational relationships, and outputs that assure the quality of both the data and the programmatic elements of ETV. Part A of the ETV QMP contains the specifications and guidelines that are applicable to common or routine quality management functions and activities necessary to support the ETV program. Part B contains the specifications and guidelines that apply to test-specific environmental technology testing activities involving the generation, collection, analysis, evaluation, and reporting of test data.

The ETV QMP is designed to play a major role in clearly delineating the roles and responsibilities of all the diverse and important participants. The ETV Program is organizationally complex. Within EPA, the Program is coordinated through ORD's ETV Team, consisting of staff from eight branches located in three divisions in two laboratories, NRMRL and NERL, and one ORD center, the National Homeland Security Research Center, including the quality assurance staff assigned to each organizational element. Numerous outside organizations are also involved: the extensive stakeholder process, the technology verification organizations that bear most of the quality assurance responsibilities, and the testing and consulting companies hired by technology verification organizations to conduct field and laboratory work. Finally, EPA program offices and regions are increasingly involved in outreach activities, as are other Federal agencies and states.

Each technology verification organization uses this document and its parent, the E4 standard, to create a center-specific quality management plan that assures that the testing and evaluation efforts carry the appropriate level of quality assurance to meet the needs of the users of the performance information. These QMPs were submitted to EPA for review and approval at the outset of operation and are reviewed annually in conjunction with the review of the ETV QMP. The annual reviews incorporate lessons learned from the experiences of the centers and feedback from the Program's customers, and accommodate any policy or programmatic changes.


PART A
MANAGEMENT SYSTEMS


Part A of the ETV Quality Management Plan contains the specifications and guidelines that are applicable to common or routine quality management functions and activities necessary to support the ETV program.

Part B of the ETV Quality Management Plan contains the specifications and guidelines that apply to test-specific environmental activities involving the generation, collection, analysis, evaluation, and reporting of test data.

Note: The small italicized statements following the section titles refer to requirements of ANSI/ASQC E4-1994.


1.0 MANAGEMENT AND ORGANIZATION

1.1 ETV quality policy

The Office of Research and Development shall establish and implement a quality policy to ensure that the Environmental Technology Verification (ETV) program produces the type and quality of program outputs needed and expected by ETV clients.

The EPA Office of Research and Development's (ORD) quality policy for the Environmental Technology Verification (ETV) program is established as follows:

The quality system for the overall ETV program seeks to be consistent with industry consensus standards. Each verification organization shall implement a valid and approved quality system. The Agency's required quality system for cooperative agreements and contracts is ANSI/ASQC E4. Each verification test will be performed according to planned and documented, pre-approved test/QA plans. All technical statements in ETV verification reports shall be supported by the appropriate data.

1.2 Organization structure

The relevant organizations, functional responsibilities, levels of accountability and authority, and lines of communication shall be formally defined in the quality system and approved by the EPA laboratory directors responsible for the quality of work performed by or in cooperation with each EPA laboratory.

The ETV organization chart graphically presents the program's lines of accountability, authority, and communication, and specifies the general functional responsibilities of the major organizational units.

1.2.1 Assistant administrator for ORD and the EPA administrator responsibilities:

  • provide overall Program direction
  • serve in a program leadership role with Congress, other agencies of the Executive Branch, and the general public

1.2.2 ORD laboratory directors responsibilities:

  • approve and implement annual program budgets and resource allocations
  • allocate laboratory personnel and other resources to accomplish ETV's goals, including appointment of the ETV director
  • approve all ETV verification statements
  • review and approve ETV Verification Strategy and ETV Quality Management Plan (QMP)
  • ensure that appropriate assessments (i.e., program quality system assessments; see Table 9.1) are implemented

1.2.3 Division directors and branch chiefs responsibilities:

  • allocate appropriate division and branch personnel and other necessary resources to support centers associated with the division / branch
  • appoint a quality assurance manager for each ETV center
  • provide oversight via administrative and technical review of center outputs and products prior to public release

1.2.4 ETV director responsibilities:

  • leads the ETV team by providing communication opportunities, e.g., periodic conference calls, meetings, and training
  • coordinates the overall ETV program, including design of multi-year strategies, operating principles, implementation activities, and annual budgets
  • communicates ETV team and program activities, progress, outputs, and recommendations to EPA, Congress, agencies in the Executive Branch, customer groups, and the general public
  • maintains an up-to-date ETV website containing materials relevant to the program and to each center
  • manages overall ETV program outreach activities to ensure that stakeholder and customer groups are knowledgeable about the existence and use of ETV generated data
  • collects data on operational parameters and program outputs to continuously evaluate the ETV program and make recommendations to management and the Congress on its present and future operation
  • reviews, approves, and assists in revision of ETV QMP
  • ensures ETV QMP is implemented in the ETV program

1.2.5 ETV team responsibilities:

  • establish mutually acceptable program-level strategies and protocols
  • participate in development of an overarching ETV outreach strategy
  • communicate center-specific progress, issues, difficulties, and lessons learned
  • meet to discuss program objectives, seek collegial guidance, and evaluate success
  • review ETV QMP

1.2.6 EPA center manager responsibilities:

  • oversee verification organizations
  • communicate requirements for and oversee the verification organization quality system
  • arrange for peer review of verification organization proposals and ETV verification reports
  • attend and/or conduct regular meetings with stakeholders
  • oversee production and approval process of ETV verification reports and ETV verification statements
  • assist with ETV outreach activities for the assigned ETV center
  • participate in ETV team activities
  • ensure that appropriate assessments (i.e., center-specific assessments; see Table 9.1) are implemented

1.2.7 Verification organization responsibilities:

  • establish, attend, and/or conduct meetings of stakeholders
  • maintain communication with EPA to assure mutual understanding and conformance with EPA quality procedures and expectations and ETV policies and procedures
  • manage the oversight and conduct of verification activities
  • assure that quality procedures are incorporated into all aspects of each ETV project
  • develop, conduct, and/or oversee test/QA plans in cooperation with technology vendors
  • solicit technology vendor proposals or vendor products
  • operate ETV activities within their documented and approved quality management plan
  • prepare ETV verification reports on technology tests
  • prepare a three-to-five page ETV verification statement at the completion of each technology verification
  • appoint a quality manager responsible for ensuring that the verification organization and its suppliers/contractors have quality systems in compliance with this QMP, and that the verification organization complies with its documented quality system

1.2.8 Stakeholder group responsibilities may include the following:

  • assist in development of generic verification protocols
  • assist in prioritizing the types of technologies to be verified
  • review project-specific procedures and selected ETV verification reports emerging from the ETV center
  • assist in the definition and conduct of outreach activities appropriate to the technology area and customer groups
  • serve as information conduits to the particular constituencies that each member represents

1.2.9 EPA directors of quality assurance responsibilities:

  • develop and implement the ETV quality system at the direction of the ETV director and in coordination with the ETV team
  • document the ETV quality system in the ETV QMP
  • review, and update, if necessary, the ETV QMP annually in cooperation with the ETV director
  • work with ETV center quality managers to ensure implementation of ETV QMP
  • provide current copies of the ETV QMP to the appropriate participants in the ETV program
  • communicate quality issues and information to the ETV team in a timely manner
  • conduct internal quality systems audits (QSAs)

1.2.10 EPA center quality manager responsibilities:

  • communicate quality system requirements, quality procedures, and quality issues to the assigned EPA center manager and verification organization
  • review and comment on the verification organization quality system description to verify conformance to the quality provisions of this document
  • perform QSAs of each center's quality system to verify conformance to the quality provisions of this document
  • perform document reviews (see Table 5.1 for specific documents and frequency)
  • perform technical systems audits (TSAs) and performance evaluations (PEs) of projects, as appropriate
  • provide assistance to center personnel in resolving quality assurance issues

The ETV website presents a current listing of the centers that are either underway or soon to be awarded, including tables of the EPA center managers and verification organization managers with their names, company affiliations, and phone numbers.


1.3 ETV customer identification and ETV customer needs / expectations / work objectives

The ETV director, center managers, and verification organizations are responsible for coordinating the identification of customers and communicating the needs of the internal and external customers to ensure that ETV work products satisfy their needs.

1.3.1 External customers (i.e., outside EPA) include, but are not limited to:

  • public and private sector buyers and users of technology
  • developers and vendors of technology
  • the consulting engineering community that recommends technologies to buyers
  • federal, state and local government permitting/regulatory, homeland defense, first responder, and disaster planning agencies
  • international marketers and the financial and insurer communities
  • Congress

In a general sense, needs and expectations of external customers include:

  • ETV verification reports and ETV verification statements supported by objective and reliable data, provided in a timely manner
  • a justifiable documented approach to selecting technologies for testing
  • a practical approach in testing which provides efficient, timely, well-documented, and cost-effective technology tests
  • full disclosure of all testing results, including those which do not verify the technology manufacturer's claims
  • user-friendly documents (e.g., easy to read and to implement)
  • technology operation consistent with statements in the ETV verification report

For each center, needs and expectations of external customers are defined and documented in the minutes of stakeholders meetings. The process to define these center-specific needs and expectations includes:

  • discussions between the EPA center managers, verification organizations and stakeholders
  • development by EPA center managers, verification organizations, and stakeholders of center verification project test objectives and data quality objectives prior to testing

1.3.2 Internal customers of the ETV program are those EPA staff responsible for execution of the ETV program in accordance with the expectations of Congress and the Administration. These customers include EPA and ORD senior managers who expect conformance with management and quality policies of the Agency.

Other EPA staff in the regions and headquarters benefit from the program in the following areas:

  • data of known and useful quality
  • expedited use of improved environmental and homeland security technologies
  • ETV testing accomplished on a wide variety of technologies
  • user-friendly documents (e.g., easy to read and to implement)
  • development of appropriate verification testing protocols


1.4 Management negotiation with verification organizations on constraints

When necessary, appropriate EPA management shall negotiate acceptable measures of quality and success when constraints of time, costs, or other problems affect the verification organization capability to fully satisfy customer needs and expectations.

When constraints of time, costs, or other problems significantly affect the verification organization's capability to fully satisfy the ETV quality system needs and expectations, or when problems are identified through the EPA QA function, the verification organization will notify the project officer and negotiations will proceed according to the agreement terms.


1.5 Resources

Appropriate EPA management shall ensure that adequate resources are provided to plan, implement, assess, and improve the ETV program and quality system.

The laboratory directors shall provide adequate resources to the ETV directors of quality assurance, EPA center managers and EPA center quality managers to enable them to plan, implement, assess, and improve the overall ETV program and quality system effectively.
Laboratory directors take the following actions to achieve the above policy:

  • provide full time equivalent (FTE) allotment of EPA center managers
  • provide FTE allotment of QA and other support personnel at each laboratory's geographical location
  • provide sufficient travel funds for each center for an appropriate level of oversight and external assessments. An appropriate level of oversight should be determined based on the needs of the center. Travel needs may include attendance at team training meetings, verification tests, vendor meetings, and stakeholder meetings. Travel for assessments is based on the requirements of Part A, Table 9.1.
  • provide for maintenance of communication lines between ORD laboratory directors, the ETV team, and the ETV director.


1.6 Authority to stop work for safety and quality considerations

The verification organization shall stop unsafe work and work of inadequate quality, or shall delegate the authority to do so to others.

The following procedures are necessary to stop unsafe work and work of inadequate quality:

  • The verification organizations shall ensure compliance with all federal, state, and local health and safety policies during the performance of the verification tests. This includes obtaining appropriate permits.
  • The verification organization quality system shall identify one or more individuals who may issue a stop work order in the event that unsafe work or work of inadequate quality is identified.
  • EPA center managers and EPA center quality managers shall contact the authorized individual(s) in the event that work of inadequate quality is discovered.
  • In extreme circumstances, the EPA center managers may ask GAD or CMD to intervene if the verification organization does not implement their approved quality management plan.


2.0 QUALITY SYSTEM AND DESCRIPTION


A quality system shall be planned, established, documented, implemented, and assessed as an integral part of an ETV management system for environmental technology verification programs defined by ETV quality policy.

Development and subsequent endorsement of this plan by the ETV director and EPA line management are evidence that the ETV quality system is planned, established, documented, implemented, and assessed as an integral part of an EPA ETV management system.


2.1 Authorities and conformance to E4 quality standard

The ETV quality system shall address applicable parts of E4 and shall include the organizational structure, policies and procedures, responsibilities, authorities, resources, and guidance documents.

The authorities for developing appropriate quality systems for ETV are the USEPA Order (Federal Register, CFR Parts 30 & 33, February 15, 1996) and the Interim Policy for Higher-level Contract Quality Requirements, effective March 20, 2001.

This plan complies with ANSI/ASQC E4-1994, Specifications and Guidance for Quality Systems for Environmental Data Collection and Environmental Technology Programs, the Agency standard. E4 is comparable to the International Organization for Standardization (ISO) 9000 standards series, as shown in the comparison table provided in Annex B-5 to E4. Another acceptable quality system model is ISO 17025, General Requirements for the Competence of Testing and Calibration Laboratories.

The ETV quality system addresses each applicable individual "specification" provided in the published quality standard, ANSI/ASQC E4-1994, using the policies and procedures in this plan, as appropriate.

Verification organizations develop quality system descriptions to be consistent with both ANSI/ASQC E4-1994 (and/or ISO 9001) and this document.


2.2 Quality system documents

The ETV quality system shall be described in a QMP that is reviewed and approved by the ETV director and EPA line management.

The ETV quality system is described in this Quality Management Plan.

  • The ETV team develops and implements the quality system.
  • The ETV director, ORD laboratory directors, and appropriate ORD division directors review and approve the ETV QMP and subsequent revisions to the plan, as policy for the ETV program.
  • Verification organization quality systems (which are consistent with ANSI / ASQC E4-1994) are described in a written ETV center quality management plan, and are reviewed and approved by verification organization management, the EPA center manager, and EPA center quality manager. Subsequent revisions are reviewed in a similar manner.


2.3 Quality system scope

The ETV quality system description shall identify in general terms those items, programs, or activities to which it applies.

This quality system description applies to the following:

  • the EPA ETV program
  • selection and oversight of verification organizations
  • review and approval of verification organization center quality management plans
  • ETV products (e.g., test / QA plans, reports, ETV verification statements)
  • planning, implementation, and assessment activities supporting ETV verification activities


2.4 Quality expectation for products and services

The ETV quality system shall include provisions to ensure that products or results of the environmental programs defined by the ETV program are of the type and quality needed and expected by ETV clients.

The preeminent products of the ETV program are the environmental technology verification reports and statements issued by EPA and the verification organization. Provisions to ensure that these products and other results of the ETV program are of the quality expected include:

  • Products are reviewed as described in part A, section 5.0.
  • QSAs and technical assessments are conducted as described in part A, section 9.0. Technical assessments may include field and laboratory audits, performance evaluation audits, and audits of data quality.


2.5 Quality procedures documentation

Following approval of the ETV QMP, management elements of the quality system shall be implemented as described.

Verification organizations must operate the ETV centers under a written and EPA-approved quality management plan that is based on E4 and/or the provisions of this plan.

  • Verification organizations provide evidence of compliance before verification activities begin, as required by the extramural agreement signed by the verification organization.

The EPA center manager is responsible for obtaining a copy of the verification organization quality management plan, as specified in the Solicitation, reviewing it, and forwarding the document to the EPA center quality manager for review and approval prior to planning technology tests.


2.6 Quality controls

The ETV quality system description shall define when and how controls are to be applied to specific technical or technology testing efforts and shall outline how these efforts are planned, implemented, and assessed.

2.6.1 ETV program controls include:

  • existing EPA policies and procedures for selection and administration of verification organization efforts in ETV
  • an approved ETV QMP
  • quality management, assurance, and control procedures as part of the extramural agreement

Specifically, the data produced by ETV centers are controlled such that verified data will be published in verification reports, regardless of the outcome of the testing.

2.6.2 Center-specific controls include:

  • generic verification protocols and specific test / QA plans developed and approved prior to testing
  • oversight by the EPA center quality managers of the implementation process and follow-up to any finding of nonconformance
  • technical operations assessment
  • specified quality control requirements in the extramural agreement

Center-specific procedures for planning, implementation, and assessment are described in the verification organization quality system. Procedures for planning, implementing, and assessing the overall ETV quality system are detailed in part A, sections 7.0, 8.0, and 9.0, and in part B.


2.7 Quality systems audits (QSAs)

At regular intervals (at least annually) the ETV quality system shall be reviewed and its description updated, if appropriate, to reflect changes in the organization as well as changes in ETV quality policy.

The EPA directors of quality assurance perform an internal QSA of the program, preferably in the first year after approval of the ETV QMP, in accordance with the process outlined in part A, section 9.0. The assessment report provides input into the update of the ETV quality system as defined in the ETV QMP. The EPA center quality managers perform a QSA of each center, preferably in the first year after approval of the center QMP.


3.0 PERSONNEL QUALIFICATION AND TRAINING

3.1 Personnel training and qualification procedures

Personnel performing work shall be trained and qualified based on appropriate requirements prior to the start of the work or activity.

3.1.1 EPA center managers are selected based on:

  • educational background and / or a degree that is directly relevant to the center technology area
  • work experience specific to the center technology area
  • experience in program management
  • participation in required training for project officer responsibilities on extramural agreements, as documented in training records

3.1.2 EPA center quality managers are selected based on:

  • educational background and / or a degree relevant to the technology tests and programs
  • work experience specific to QA of technology tests and programs
  • experience in quality management.

3.1.3 Verification organization personnel

Key participants working directly for or on behalf of the verification organization in support of the center and / or individual test operations are selected by the verification organization and evaluated by the EPA during the solicitation process. Solicitation evaluation criteria for key personnel will vary, but typically include a consideration of the following:

  • educational background and/or a degree(s) relevant to technical areas represented in the center
  • work experience related to the technology areas represented in the center
  • experience in quality management

The verification organization quality management plan will document training and qualification procedures for verification organization personnel.


3.2 Formal qualifications and certifications

The need to require formal qualification or certification of personnel performing certain specialized activities shall be evaluated and implemented where necessary.

ETV program management, quality management, and center management require no formal qualification or certification other than the following, where applicable:

  • EPA Project Officer Training and Extramural Agreement Training (or Work Assignment Manager training as appropriate)
  • Appropriate Occupational Safety and Health Administration (OSHA) courses

Formal qualification or certification of personnel performing specialized activities for each center or for specific test / QA plans is addressed on a center-specific or test / QA plan-specific basis. Verification organizations maintain records of the qualification or certification of such personnel.

NOTE: Requirements for formal qualifications or certification may be based on applicable federal, state, or local requirements associated with a particular test. Examples of possible certifications include but are not limited to drinking water plant operator certification, professional engineering registration, and certification of industrial hygienists.


3.3 Technical management and training

Appropriate technical and management training, which may include classroom and on-the-job, shall be performed and documented.

EPA line management is responsible for appropriate technical and management training for staff working on the ETV program. Such training is documented in each individual's training file. Verification organizations are responsible for personnel training and qualification procedures for each center or for specific test/QA plans. Verification organizations maintain the training records (available for review by EPA).

The ETV team is trained at meetings that occur at least once a year. At these meetings, the team develops policy and shares information and lessons learned. The directors of quality assurance provide training on the requirements of the ETV QMP during the periodic workshops organized by the ETV director.


3.4 Retraining

When job requirements change, the need for retraining to ensure continued satisfactory job proficiency shall be evaluated.

The need for retraining EPA ETV staff is evaluated on an annual basis by the appropriate line management.

Evaluating the need for and performing retraining of verification organization staff is the responsibility of the verification organization.


3.5 Personnel job proficiency

Evidence of personnel job proficiency shall be documented and maintained for the duration of the technology test or activity affected, or longer if required.

3.5.1 EPA center managers - The existing performance standards of the EPA center managers may already include tasks consistent with the following items. These items should be considered for specific identification in the performance standards:

  • active participation in the ETV team; communicating center issues, lessons learned, required reports, and appropriate assistance to members of the ETV team and management
  • developing solicitations and/or managing CAs / IAGs / contracts
  • facilitating stakeholders group activities
  • ensuring development of and contributing to generic verification protocols and test / QA plans
  • providing a leadership role to ensure technologies are selected consistent with the ETV Verification Strategy
  • serving as a communication link between EPA and the verification organization, in particular, providing information and documents to support the ETV website
  • reviewing draft and final ETV verification reports and other center documents
  • reporting to program management the completeness and validity of the ETV verification statement prior to report issuance
  • ensuring the timely delivery of complete and consistent ETV products and services

Evidence of personnel job proficiency is found in the human resources module in OMIS for tracking training, and in PERFORMS for satisfactory job performance.

NOTE: Evaluations are the responsibility of the appropriate supervisor and are not a record of the ETV program.

3.5.2 EPA center quality managers - The existing performance standards of the EPA center quality managers may already include tasks consistent with the following items. These items should be considered for specific identification in the performance standards:

  • verification organization center quality management plan review
  • quality system reviews
  • technical system audits, performance evaluation audits, and audits of data quality
  • generic verification protocols, test / QA plans, and verification reports and statements reviews
  • complete and timely center audit reports

NOTE: Evaluations are the responsibility of the appropriate supervisor and are not a record of the ETV program.

3.5.3 Verification organization staff - Verification organizations document and maintain records (such as annual performance reviews) of personnel job proficiency for work performed directly in support of the verification organization ETV activities.

NOTE: Evaluations are the responsibility of the verification organization and are not a record of the ETV program.


4.0 ETV VERIFICATION ORGANIZATION SELECTION
 

4.1 Planning and control of selection process

Funding of extramural agreements associated with the ETV program shall be planned and controlled to ensure that the quality of verification tests is known, documented, and meets technical requirements and acceptance criteria of the clients.

The ETV program is designed to investigate ways to facilitate the verification and use of environmental technology. The ETV program secures verification organizations through appropriate instruments governed by rules found in Title 31, Section 6303, of the US Code, and in EPA Order 5700.1.

Planning to select verification organizations requires:

  • assessing and prioritizing environmental technology categories for use in center projects (i.e., defining the scope of each center project in terms of technology areas to be tested)
  • establishing ANSI / ASQC E4-1994 as an applicable quality standard
  • issuing solicitations
  • selecting the appropriate verification organization based on their experience and proficiency
  • managing the selection process to ensure the quality of verification tests, which includes:
    • implementing controls stipulated in EPA policies and procedures for extramural agreements
    • establishing specific language in each solicitation requiring development and implementation of a quality system consistent with the ETV quality system and ANSI / ASQC E4-1994. Suggested language is given in Appendix D.
    • reviewing the applicant's proposed quality system to verify that it meets the solicitation requirements and provides for verification test quality that is known, documented, and meets technical requirements.


4.2 Technical and quality requirements

Extramural agreement solicitation documents shall contain information clearly describing the technical and quality requirements associated with the verification testing.

Technical and quality requirements expressed in the solicitation include technical evaluation criteria for technical skills and experience of staff members, and demonstrated experience in the development of quality systems relevant to ETV. The policy pertaining to extramural agreements requires that a verification organization develop and receive EPA approval for a quality management plan consistent with this document and E4 prior to conducting technical activities. If the verification organization intends to perform verifications by contracting or sub-contracting with other organizations, all of the controls incumbent upon the verification organization specified in Section 4.1 pass through to the contractor or sub-contractor.


4.3 Quality specification / conformance

Extramural agreement solicitation documents shall specify the ETV quality requirements for which the verification organization is responsible and how the verification organization's conformance to client requirements shall be verified.

ETV quality requirements for which the verification organization is responsible are specified in the Solicitation and in this ETV QMP. During verification organization selection, the applicant proposals and written responses to the requirements are reviewed for conformance to the Solicitation specifications. After a verification organization is selected, the EPA center quality manager reviews and the EPA center manager approves written quality system documents (e.g., center QMPs) for conformance to the EPA and ETV quality policies and procedures.


4.4 Peer review of extramural agreements

Extramural award documents shall be reviewed for accuracy and completeness by qualified personnel prior to award.

Peer review is an integral part of EPA's project planning, implementation, and assessment process. Solicitation packages are internally peer reviewed prior to their issuance. Responses to the solicitation undergo a peer review process which supports the award of the extramural agreement.


4.5 Conformance of verification testing efforts

Appropriate measures shall be established to ensure that the verification testing efforts satisfy all terms and conditions of the extramural agreement. Verification organizations shall have a demonstrated capability to meet all terms and conditions.

Once a verification organization has been selected, measures to ensure continued conformance to terms and conditions in the extramural agreement are implemented as described in part A, sections 8.0, 9.0, and 10.0.


5.0 DOCUMENTS AND RECORDS

5.1 Scope

Procedures shall be established, controlled, and maintained for identifying, preparing, reviewing, approving, revising, collecting, indexing, filing, storing, maintaining, retrieving, distributing, and disposing of pertinent quality documents and records. Such procedures shall be applicable to all forms of documents and records, including printed and electronic media. Measures shall be taken to ensure that users understand the documents to be used. Documents and records requiring control shall be identified.

A document is an instruction, specification, or plan containing information on how the ETV program functions, how specific tasks are to be performed, or how specific products or services are to be provided. Examples include the ETV QMP, the ETV Center QMPs, test / QA plans, and the ETV Strategy.

A record is a statement of data and facts pertaining to a specific event, process, or product, that provides objective evidence that an activity has occurred. Examples include verification statements and reports, raw and summary data tables, data notebooks, audit reports, and stakeholder meeting minutes.

Documents and records to which this policy applies include:

  • ETV Verification Strategy
  • ETV QMP (this document)
  • extramural agreement records
  • verification organization center quality management plans
  • minutes of stakeholder meetings (summary for the record)
  • generic verification protocols (how a given type of technology is verified)
  • test / QA plans (procedures for an individual test, including standard operating procedures (SOPs))
  • raw data (all written and electronic data generated when tests are conducted)
  • ETV verification reports (comprehensive reports on a technology verification project)
  • ETV verification statements (summary statement for an individual technology test)
  • annual ETV progress report
  • EPA reviews and audit reports
  • verification organization reviews and audit reports

Information in this section applies to both electronic and printed documents and records, as well as original documents and records developed on behalf of the ETV program that are required to demonstrate the quality of information and data provided in ETV verification reports.

TABLE 5.1 Documents and Records Management Scheme

ETV Verification Strategy
  Preparation / updating: ETV director
  Review: ETV team; VO / center managers; laboratory directors
  Approval: ORD Deputy Assistant Administrator
  Finals distributed to: ETV Webmaster

ETV Quality Management Plan
  Preparation / updating: ETV directors of quality assurance
  Review: ETV team; VO managers; EPA center quality managers
  Approval: ETV director; laboratory directors; division directors
  Finals distributed to: ETV Webmaster

CA / IAG / contract records
  Preparation / updating: EPA center manager; VO manager
  Review: EPA line managers; ETV director
  Approval: EPA line managers
  Finals distributed to: N/A

VO quality management plan
  Preparation / updating: VO manager; VO quality manager
  Review: EPA center quality manager
  Approval: EPA center manager; EPA center quality manager
  Finals distributed to: N/A

minutes of stakeholder meetings
  Preparation / updating: VO manager
  Review: EPA center manager; stakeholders
  Approval: N/A
  Finals distributed to: ETV Webmaster

generic verification protocol
  Preparation / updating: VO manager
  Review: EPA center quality manager; VO center quality manager; stakeholders
  Approval: ETV director; EPA center manager
  Finals distributed to: ETV Webmaster (draft and final versions)

test / QA plan (including SOPs)
  Preparation / updating: VO manager
  Review: VO center quality manager; EPA center quality manager
  Approval: EPA center manager; vendors; stakeholders
  Finals distributed to: ETV Webmaster; vendors

raw data
  Preparation / updating: VO manager
  Review: N/A
  Approval: N/A
  Finals distributed to: EPA can request copies

ETV verification report
  Preparation / updating: VO manager
  Review: EPA center quality manager; VO center quality manager; vendor
  Approval: EPA center manager
  Finals distributed to: ETV director

ETV verification statement
  Preparation / updating: VO manager
  Review: EPA center manager; EPA center quality manager; VO center quality manager; vendor
  Approval: ETV director; laboratory directors
  Finals distributed to: ETV Webmaster

annual ETV progress report
  Preparation / updating: evaluation contractor
  Review: ETV team; VO managers
  Approval: ETV director
  Finals distributed to: laboratory directors

EPA center reviews / audit reports
  Preparation / updating: EPA center quality manager
  Review: EPA center manager
  Approval: N/A
  Finals distributed to: laboratory directors; VO manager; VO quality manager

EPA program reviews / audit reports
  Preparation / updating: EPA directors of quality assurance
  Review: ETV director
  Approval: N/A
  Finals distributed to: laboratory directors; VO manager; VO quality manager

VO reviews / audit reports
  Preparation / updating: VO quality manager
  Review: VO manager
  Approval: N/A
  Finals distributed to: EPA center manager; EPA center quality manager

Note: entries in the approval column assume review by the approving official.
VO = verification organization
N/A = not applicable


5.2 Preparation, review, approval, and distribution

Sufficient documents and records shall be specified, prepared, reviewed, authenticated, and maintained to reflect the achievement of the required quality for completed work and/or to fulfill any statutory requirements. Documents used to perform work shall be identified and kept current for use by personnel performing the work. Documents, including revisions, shall be reviewed by qualified personnel for conformance with technical requirements and quality system requirements and approved for release by authorized personnel.

Table 5.1 lists the pertinent quality documents and records for ETV, the person(s) responsible for preparing and updating these documents and records, the reviewers, those given approval authority for each record type, and the distribution plan. In Table 5.1, where a procedure is not applicable (e.g., a document is not subject to approval), N/A is entered in the table. All reviewers and approving officials receive copies of the documents and records they review/approve; the Distribution column in Table 5.1 lists only those individuals who receive final copies, in addition to the reviewers and approving official. For revised documents, these same review, approval, and distribution pathways are followed. Unless otherwise noted, material placed on the ETV website is available for public inspection, comment, and use.


5.3 Documents and records storage and obsolete documents and records

Obsolete or superseded documents shall be identified and measures shall be taken to prevent their use, including removal from the work place and from the possession of users when practical. Maintenance of records shall include provisions for retention, protection, preservation, traceability, and retrievability. While in storage, records shall be protected from damage, loss, and deterioration. Retention times for records shall be determined based on extramural agreement and statutory requirements, or, if none are stated, as specified by the EPA director and EPA line management.

Obsolete records should be clearly marked as such. These records may be retained in the workplace for historical reference, or they may be removed to archival storage. ETV will follow ORD's Records Management Policy (see Appendix A), which addresses requirements for indexing, filing, maintaining, retrieving, and disposing of documents and records from all extramural financial agreements. The current minimum requirement is that all records be kept for seven years after the final payment on an extramural agreement.


6.0 COMPUTER HARDWARE AND SOFTWARE

6.1 General procedures

Computer software and computer hardware configurations used in the ETV program shall be installed / tested / used / maintained / controlled / documented to meet users' requirements and shall conform to this quality policy and applicable consensus standards and / or data management criteria.

At the program level, ETV does not expect to develop software. At the center level, if verification organizations intend to develop software to support their ETV process (or an individual test / QA plan), they should have procedures in place as specified here. If the verification organization uses only commercial software for office operations (e.g., word processing software, spreadsheet software), it is unlikely that they would need specific procedures for assessing software quality. Part A, sections 6.2 through 6.6, apply only to software and software / hardware configurations developed specifically for the ETV program.

The following are the ETV program procedures which ensure that each center controls the quality of all computer hardware/software configurations for the program.

  • The EPA center manager and the verification organization discuss and agree upon the computer hardware and software requirements of the center and/or specific test / QA plan.
  • Once decisions are finalized, the verification organization supplies evidence of meeting all requirements before data collection, reduction, or validation procedures begin.
  • For software developed for the ETV program, the verification organization tests all applications and configurations using a test data set, or by running a shakedown test of the system, to ensure that all applications / configurations are operating to specifications. The verification organization must show evidence of a system to maintain, control, and document such software and hardware configurations. This includes, but is not limited to: resources to correct any hardware / software failure with minimal downtime to the program; tracking of upgrades / revisions to software or configuration changes; documentation of software names, versions, and copyright dates; and complete documentation of the code, meaning the written code, with comments, structured in a modular form. (A minimal illustration of such a shakedown test follows this list.)
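
The following is a minimal, illustrative sketch of such a shakedown test in Python. The data-reduction function, the test data set, and the acceptance tolerance are all hypothetical assumptions, not program requirements; a center would substitute its own application and the expected results developed during planning.

    # Hypothetical shakedown test for center-developed data-reduction software.
    # The function, test data set, and tolerance below are illustrative only.

    def reduce_concentrations(raw_mv, slope, intercept):
        """Convert raw detector readings (mV) to concentrations (mg/L)."""
        return [slope * mv + intercept for mv in raw_mv]

    # Test data set with known expected results, developed during planning.
    TEST_RAW_MV = [0.0, 10.0, 25.0, 50.0]
    EXPECTED_MG_L = [0.10, 2.10, 5.10, 10.10]  # from an assumed calibration: slope 0.2, intercept 0.1
    TOLERANCE = 1e-6                           # acceptance criterion for the shakedown

    def run_shakedown():
        results = reduce_concentrations(TEST_RAW_MV, slope=0.2, intercept=0.1)
        failures = [(raw, got, want)
                    for raw, got, want in zip(TEST_RAW_MV, results, EXPECTED_MG_L)
                    if abs(got - want) > TOLERANCE]
        # Print the outcome so the test run is documented and traceable.
        print(f"shakedown: {len(TEST_RAW_MV) - len(failures)} of {len(TEST_RAW_MV)} cases passed")
        for raw, got, want in failures:
            print(f"  FAIL raw={raw} mV: got {got}, expected {want}")
        return not failures

    if __name__ == "__main__":
        run_shakedown()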


6.2 Scope of ETV computer hardware / software procedures

Computer software and computer hardware/software configurations covered by ETV's quality policy include, but are not limited to:

  • operation or process control of environmental technology systems (including automated data acquisition and laboratory instrumentation)
  • databases containing environmental data

Computer software and computer hardware/software configurations covered by this quality management plan include all agreed upon, center-specific applications or configurations. These include, but are not limited to:

  • evaluating and reducing environmental data
  • reporting environmental data
  • databases containing environmental data


6.3 Configuration testing

Computer hardware/software configurations shall be tested prior to actual use and the results shall be documented and maintained.

On a center level, the verification organization conducts tests of the computer hardware/software configuration using a standard set of testing conditions.

NOTE: The verification organization is required to have a system to document all testing of computer hardware / software configurations, as required by part A section 6.1. A test data set or a standard set of testing conditions should be developed on a center- or test / QA plan-specific basis. Maintenance testing should be easily trackable and retrievable.
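
One way a verification organization might keep configuration-test records trackable and retrievable is an append-only log keyed by date, software name, and version. This sketch assumes a simple CSV log and invented field names; the actual record format is for the center to define.

    # Illustrative append-only log of computer hardware / software configuration tests.
    import csv
    import datetime

    LOG_FILE = "config_test_log.csv"  # hypothetical location of the test log

    def log_config_test(software, version, conditions, passed, tester):
        """Append one configuration-test record so the result stays traceable."""
        with open(LOG_FILE, "a", newline="") as f:
            csv.writer(f).writerow([
                datetime.datetime.now().isoformat(timespec="seconds"),
                software, version, conditions,
                "PASS" if passed else "FAIL", tester,
            ])

    # Example: record a retest after a software upgrade (see part A section 6.4).
    log_config_test("data-logger", "2.1", "standard test data set A", True, "VO QA staff")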


6.4 Measurement and testing equipment configurations

Computer hardware/software configurations integral to measurement and testing equipment that are calibrated for a specific purpose do not require further testing unless:

  • the scope of the software usage changes OR
  • modifications are made to the hardware/software configuration.

On a center level, verification organizations perform the following procedures (as provided in the verification organization quality system).

Whenever computer hardware / software configurations integral to measurement and testing equipment are calibrated for a specific purpose, further testing is not normally performed unless the scope of the software usage changes or modifications are made to the hardware / software configuration.

In the event either of the above mentioned changes occurs, the verification organization retests the changes as described in part A sections 6.1 and 6.3. Retesting is documented to the same extent as the original application / configuration.


6.5 Change assessments - configurations, components, and requirements

Changes to hardware/software configurations, components, or program requirements shall be assessed to determine the impact of the change on the technical and quality objectives of the ETV program supported.

The verification organization is responsible for assessing the changes, determining the need for testing, and reporting the assessments to the EPA center manager.


6.6 ETV website roles and responsibilities

The ETV website shall be operated in such a way that it serves all ETV participants and customers through prompt and accurate posting of ETV information and documents.

The EPA center managers, or alternate(s) designated in writing by the center manager, are responsible for promptly sending the following information to the ETV webmaster:

  • general fact sheets and brochures
  • stakeholders lists (and updates)
  • meeting announcements and summaries
  • generic testing protocols (indicating draft or final)
  • test / QA plans (indicating draft or final)
  • FBO announcements
  • ETV verification statements
  • upcoming meetings / speeches / announcements.


7.0 PLANNING

7.1 Systematic planning process

A systematic planning process shall be established, implemented, controlled, and documented to:

  • identify the customer(s), and their needs and expectations
  • identify the technical and quality goals that meet the needs and expectations of the customer
  • translate the technical and quality goals into specifications that shall produce the desired result
  • consider any cost and schedule constraints within which technology test activities are required to be performed
  • identify acceptance criteria for the results or measures of performance by which the results shall be evaluated and customer satisfaction shall be determined.

7.1.1 The systematic planning process established for ETV is conducted as follows:

  • EPA establishes the number and type of ETV Centers necessary to comply with the Presidential mandate to cover all environmental technologies.
  • EPA lays out basic program operation parameters in two documents, the ETV Verification Strategy and ETV QMP.
  • Based upon the ETV Verification Strategy, the ETV director, in consultation with the ETV team and EPA line management, designs an annual budget.
  • Appropriate personnel are appointed from within the ORD laboratories to fill ETV positions. Selection and information on qualifications are presented in part A sections 3.1 and 3.2.
  • The division and branch management provides resources and planning to support the duties of EPA staff, such as training and travel. Technical training is discussed in part A sections 3.3 and 3.4.
  • Verification organizations are selected to manage the ETV Technology Centers in conformance with part A section 4.0.
  • EPA's requirements for the appropriate extramural agreement (e.g., cooperative agreement, interagency agreement, contract) are met.
  • After selection, the verification organization, in consultation with the EPA center manager, establishes stakeholder groups that contain representatives of customer groups of concern to that center's areas.
  • The EPA center manager and the verification organization develop plans for center verification tests and present them to the stakeholders for review and comment.
  • The EPA center manager, the EPA center quality manager, the verification organization, and the stakeholders group hold at least one joint meeting annually to:
    • identify, revise, and/or clarify the technical and quality goals of the work to be accomplished
    • translate the technical and quality goals into written specifications that will be used to produce the desired result
    • consider any cost and schedule constraints within which test activities are required to be performed
    • develop qualitative measures of performance by which the results will be accepted
    • determine testing priorities and evaluate customer satisfaction.
  • Minutes of each meeting are taken by the verification organization and distributed to participants for comment. Minutes of stakeholder meetings are incorporated into the records management scheme described in part A section 5.0.

7.1.2 Implementation of the systematic planning process.

Planning is accomplished through frequent meetings among participants and through posting initial planning documents and stakeholders meeting minutes on the ETV website. Procedures for planning at the center and test level are addressed in Part B. Procedures for implementing the planning process are detailed below:

  • Customer identification - ETV customers are identified in part A section 1.
  • Technical and quality goals identification - In addition to the goals identification which occurs at stakeholder meetings mentioned in Section 7.1.1, these are identified during planning meetings with senior management, conference calls with ETV participants, and meetings with EPA quality professionals and technical staff.
  • Technical and quality goal specifications - The ETV director works with the EPA center managers, EPA directors of quality assurance, and other quality professionals to translate technical and quality goals of the overall program into the ETV QMP.
  • Cost and schedule constraints - These are discussed during planning meetings with senior management and considered yearly for allocation to each center.
  • Measures of performance - The ETV director develops measures of performance for the program, which are evaluated by the ETV team and verification organizations throughout the course of the program. Appendix B contains the most current measures of performance.
  • Customer satisfaction evaluation - The program evaluates all centers on an annual basis. This evaluation includes customer satisfaction measures as appropriate.

7.1.3 Systematic planning process controls include:

  • development and implementation of written procedures (verification test protocols and test / QA plans)
  • required minutes of stakeholder group meetings
  • review of verification organization work efforts by the EPA center manager and EPA center quality manager

7.1.4 Systematic planning process documentation includes the ETV Verification Strategy, the ETV QMP, the verification organization QMPs, and test / QA plans.


7.2 Planning document review

All planning documentation shall be reviewed and approved for implementation by authorized personnel before the specific work commences. Such documentation includes but is not limited to test/QA plans and generic verification test protocols.

Planning document review is discussed in part A section 5.2.


8.0 IMPLEMENTATION OF WORK PROCESSES
 

8.1 Implementation

Work shall be performed according to approved planning and technical documents.

The planning for the implementation of the EPA management and quality work processes is contained in part A section 7.0. The individual ETV center work is performed according to planning documents written by the center. All technology verification work shall occur according to protocols and test / QA plans developed and agreed upon by EPA, the verification organization, and the vendor. The authors, reviewers, and approvers of these documents are specified in part A section 5.0, Table 5.1.

The approved protocols and test / QA plans shall be present on the site of testing, and the work shall be implemented in accordance with them. During the work phase, modifications to plans and procedures shall be documented, and the modifications shall be incorporated into the final protocols and test / QA plans. The authors, reviewers, and approvers of changes to these documents are the same as for the original documents and are specified in part A section 5.0, Table 5.1.

Verification organizations are responsible for implementing their work processes in accordance with their quality systems.

 
8.2 Procedures

Procedures shall be developed, documented, and implemented for appropriate routine, standardized, special, or critical operations. Operations needing procedures shall be identified. The form, content, and applicability shall be addressed, and the reviewers and approvers shall be specified.

Procedures for the overall operation of the ETV program are contained in the ETV Verification Strategy, the ETV QMP, and in other appropriate EPA policies (e.g., extramural agreement, records management). The individual ETV centers shall identify and document those operations in their centers requiring procedures as discussed in Part B. Procedures shall be written in a format that can be readily comprehended by the user and shall contain sufficient detail and clarity to ensure that results are achieved effectively. Appropriate operations documents, authors, reviewers, and approvers are specified in part A section 5.0, Table 5.1.


8.3 Oversight

Implementation of work shall be accomplished with a level of management oversight and inspection commensurate with the importance of the program and the intended use of the results, and shall include the routine measurement of performance against established technical and quality specifications.

EPA line management has responsibility for oversight of verification work processes as discussed in part A section 1.0. Verification organization oversight and responsibilities for the verification work processes are given in the individual center QMPs.


9.0 ASSESSMENT AND RESPONSE

9.1 Numbers and types of assessments

Assessments shall be planned, scheduled, and conducted to measure the effectiveness of the implemented quality management systems. Several types of assessments are available for this purpose. Management shall determine during the planning stage the appropriate types of assessment activities. Assessments shall include an evaluation to determine and verify whether technical requirements, not just procedural compliance, are being implemented effectively.

The assessments shown in Table 9.1 and their minimum frequencies are commensurate with the importance of the ETV program and the intended use of the verification results. Quality systems audits shall be used to measure the effectiveness of the implemented management systems and technical systems. Performance assessments shall be used to evaluate the performance of center technical operations. Data assessments shall assess reported data quality. Verification organizations perform self-assessments in accordance with the individual center management plans, and EPA performs self-assessments and independent assessments of verification organizations. The types of assessments are defined in part B section 4.0.


Table 9.1 Assessments

Program level - Quality Systems Audit
  Assessors: EPA directors of quality assurance
  Responders: ETV program management
  Basis for assessment: ETV QMP
  Minimum frequency: once; thereafter, as requested
  Reason for assessment: assess management practices for the ETV program
  Report reviewed by: laboratory directors; ETV director

Center level - Quality Systems Audits (self)
  Assessors: VO center quality managers
  Responders: VOs
  Basis for assessment: center QMP
  Minimum frequency: once; thereafter, as requested
  Reason for assessment: assess quality management practices of the VO for the ETV centers
  Report reviewed by: VO managers

Center level - Quality Systems Audits (independent)
  Assessors: EPA center quality managers
  Responders: VOs
  Basis for assessment: center QMP
  Minimum frequency: once; thereafter, as requested
  Reason for assessment: assess quality management practices of the VO for the ETV centers
  Report reviewed by: EPA directors of quality assurance; EPA center managers; ETV director

Center level - Technical Systems Audits (self)
  Assessors: VO center quality managers
  Responders: VOs; field testing organizations
  Basis for assessment: test / QA plans
  Minimum frequency: once per test
  Reason for assessment: assess technical quality of verification tests
  Report reviewed by: VO managers; EPA center quality managers

Center level - Technical Systems Audits (independent)
  Assessors: EPA center quality managers
  Responders: VOs; field testing organizations
  Basis for assessment: test / QA plans
  Minimum frequency: once per year, as applicable
  Reason for assessment: assess technical quality of verification tests
  Report reviewed by: EPA center managers

Center level - Performance Evaluation Audits (self)
  Assessors: VO quality managers
  Responders: VOs; field testing organizations
  Basis for assessment: test / QA plans
  Minimum frequency: each test, as applicable
  Reason for assessment: assess measurement performance
  Report reviewed by: VO managers; EPA center quality managers

Center level - Performance Evaluation Audits (independent)
  Assessors: EPA center quality managers
  Responders: VOs; field testing organizations
  Basis for assessment: test / QA plans
  Minimum frequency: each center, as applicable
  Reason for assessment: assess measurement performance
  Report reviewed by: EPA center managers

Center level - Audits of Data Quality (self)
  Assessors: VO quality managers
  Responders: VOs; field testing organizations
  Basis for assessment: raw data and summary data
  Minimum frequency: at least 10% of all verification data
  Reason for assessment: assess data calculations and reporting
  Report reviewed by: VO managers; EPA center quality managers

Center level - Audits of Data Quality (independent)
  Assessors: EPA center quality managers
  Responders: VOs; field testing organizations
  Basis for assessment: raw data and summary data
  Minimum frequency: for each center, as applicable
  Reason for assessment: assess data calculations and reporting
  Report reviewed by: EPA center managers

Note: General auditing guidelines: Because of the high visibility of ETV testing, systematic planning should provide for sufficient auditing to ensure the integrity of the data. The target minimums are a TSA on every test by the VO, and at least one TSA per year per center by EPA. Applicability means that if a test can be quantitatively audited, PEAs should be performed on each test by the VO and once per year per center by EPA. An ADQ is performed on 10% of test data ("10% of the test data" means a random selection of 10% of the data from all of the measured parameters) that has already been 100% verified by project personnel. In the case of continuously monitoring instruments operating over long periods of time, a representative amount of the data (10% is suggested) may be verified. In cases where the target minimums appear to be excessive, the professional judgment of the verification test planners will prevail. (See also part B section 4.2 for information regarding assessment frequency.)
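
As an illustration of the "10% of the test data" selection described above, the sketch below randomly samples 10% of the verified records from each measured parameter. The data structure and seed are invented for the example; recording the seed keeps the selection reproducible for the audit record.

    # Illustrative random selection of 10% of verified data for an ADQ.
    import random

    def select_adq_sample(records_by_parameter, fraction=0.10, seed=None):
        """Randomly select a fraction of the records from each measured
        parameter; rounds up so every parameter contributes at least one record."""
        rng = random.Random(seed)
        sample = {}
        for parameter, records in records_by_parameter.items():
            n = max(1, round(fraction * len(records)))
            sample[parameter] = rng.sample(records, n)
        return sample

    # Example with two hypothetical measured parameters.
    data = {"turbidity": list(range(200)), "pH": list(range(120))}
    audit_set = select_adq_sample(data, seed=42)
    print({p: len(v) for p, v in audit_set.items()})  # {'turbidity': 20, 'pH': 12}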


9.2 Procedures

Assessments shall be performed according to written and approved procedures, based on careful planning of the scope of the assessment and the information needed. Assessment results shall be documented and reported to management. Management shall review the assessments.

Assessments shall be planned according to the scope of the assessment and the information needed. Suitable written procedures for planning and conducting audits shall be contained in the operating manuals of EPA quality teams, the operating and quality manuals of the verification organizations, and EPA guidance documents (EPA G-7). Assessments are based on interviews, on the physical examination of objective evidence, on results of analysis of blind samples, and on the examination of the documentation of past performance. The basis for technical assessments of ETV verification tests is the test / QA plan. Results are documented in audit reports and reviewed by appropriate management as described in Table 9.1. Auditing shall occur when verification testing at the center is at a stage where auditing is feasible.
 

9.3 Personnel qualifications, responsibility, and authority

Personnel conducting assessments shall have the appropriate technical or management skills to perform the assigned assessment. Management shall determine and document the level of competence, experience, and training necessary to ensure the capability of personnel conducting assessments. The responsibilities and authorities of personnel conducting assessments shall be clearly defined and documented, particularly in regard to authority to suspend or stop work in progress upon detection and identification of an immediate adverse condition affecting the quality of results or the health and safety of personnel.

EPA or verification organization management determines and documents the level of competence, experience, and training of their respective audit personnel during hiring and periodic performance reviews. Qualified audit personnel, as listed in Table 9.1, have access to the appropriate management personnel and documents required to perform their audit duties. They are organizationally independent of the program or center they are auditing. They have the responsibility and authority to:

  • identify and document problems that affect quality of verification results
  • propose recommendations for resolving problems that affect quality of verification work processes or results
  • independently confirm implementation and effectiveness of solutions

If auditors identify a severe problem affecting verification quality, EPA center managers have the authority to request of the verification organization manager that work be stopped until the problem is addressed. If auditors identify a problem where the health and safety of personnel are in danger, they have the responsibility to bring it to the immediate attention of appropriate EPA management, verification organization management, and onsite testing personnel.


9.4 Response

Responses to adverse conclusions from the findings and recommendations of assessments shall be made in a timely manner. Conditions needing corrective action shall be identified and the appropriate response made promptly. Follow-up action shall be taken and documented to confirm the implementation and effectiveness of the response action.

When the recommendations and conclusions from the findings of assessments are adverse, a response from the auditee detailing the corrective action shall be expected within 10 working days of receiving the audit report. Auditors shall follow up with appropriate documentation to confirm the implementation and effectiveness of the response.


10.0 QUALITY IMPROVEMENT

10.1 Annual review for quality improvement

A quality improvement process shall be established and implemented to continuously develop and improve the ETV Quality System.

The ETV director and EPA directors of quality assurance review the Quality Management Plan annually and recommend improvements to the plan.

The EPA directors of quality assurance recommend and negotiate quality improvements with the ETV team during the annual meeting and through the ETV website.


10.2 Detecting and correcting quality system problems

Procedures shall be established and implemented to prevent as well as detect and correct problems that adversely affect quality during all phases of technical and management activities.

EPA center managers and EPA center quality managers report problems in any of the following areas to EPA line management and to the EPA directors of quality assurance:

  • adequacy of the ETV quality system
  • consistency of the quality system
  • implementation of the quality system
  • correction of quality system procedures
  • completeness of documented information
  • quality of data
  • quality of planning documents
  • implementation of the work process

EPA line managers respond promptly to address correction of the quality problem.


10.3 Cause and effect relationship

When problems are found to be significant, the relationship between cause and effect and the root cause shall be determined.

The following are general procedures. Specific procedures are found in the individual verification organization written quality systems. When problems are significant, the quality manager determines and documents the relationship between cause and effect, and when possible, determines and documents the root cause of the problem. The quality manager provides this information to the appropriate center managers so corrective action can be authorized and implemented.

A significant problem is any problem requiring:

  • a testing protocol change OR
  • a management system change OR
  • a quality system change (either internal or external to EPA, but still within the ETV program).

NOTE: The verification organization quality managers, in accordance with their quality systems, continually review and assess their projects for conformance with their quality documents. At the program level, assessment reports from the individual projects are monitored and evaluated by the EPA directors of quality assurance for trends or recurring problems that indicate significant problems affecting the ETV program as a whole. Any such situation is immediately communicated to the ETV director. The ETV director shares the information and any corrective actions with the EPA center managers.


10.4 Root cause

The root cause should be determined before permanent preventative measures are planned and implemented.

To guard against implementing ineffective changes, EPA personnel ensure when possible that root causes are determined before preventative measures are planned and implemented.


10.5 Quality improvement action

Appropriate actions shall be planned, documented, and implemented in response to findings in a timely manner.

In the event that a significant problem is identified that requires a structural change to the ETV program, the ETV director will initiate discussions with the appropriate EPA line management to correct the deficiency.

 

PART B
COLLECTION AND EVALUATION OF ENVIRONMENTAL DATA


Part A of the ETV Quality Management Plan contains the specifications and guidelines that are applicable to common or routine quality management functions and activities necessary to support the ETV program.

Part B of the ETV Quality Management Plan contains the specifications and guidelines that apply to test-specific environmental activities involving the generation, collection, analysis, evaluation, and reporting of test data.


1.0 PLANNING AND SCOPING


The work of the ETV program at the project level is to verify the performance of commercial-ready technologies. As discussed in part A section 7.0, the planning process begins with the Statement of Work (SOW) contained in the solicitation. The successful applicant becomes the verification organization for the center.


1.1 Systematic planning of the verification test

All work involving the generation, acquisition, and use of environmental data shall be planned and documented. The type and quality of environmental data needed for their intended use shall be identified and documented using a systematic planning process. The test-specific planning must involve the key users and customers of the data. EPA center managers should guide planning activities and ensure that participants are informed of and understand completely the requirements of each test.

The programmatic planning for verification of commercial-ready technologies is discussed in part A section 7.1.1. This section continues the discussion of systematic planning at the project level.

Verification organizations, working with the EPA center managers, begin a systematic process to plan the individual verification tests. Systematic planning may be accomplished through the data quality objectives process (EPA G-4). The planners perform the following actions:

  • refine the scope of their respective technology areas
  • determine interest in verification from the manufacturers of commercial-ready technologies within the defined scope of the technology areas
  • convene stakeholder groups, containing representatives of verification customer groups, which provide input during the planning process
  • mediate and facilitate the selection of focus areas
  • prepare generic verification protocols which are developed to promote uniform testing for a given type of technology
  • coordinate the review and revision of the protocols (see the review and approval scheme in part A section 5.0), keeping in mind both customer and EPA objectives for verification as defined in the ETV Strategy
  • solicit vendor agreements to participate in verification of their products based on the generic protocol (some iteration of the two previous points frequently occurs here as the vendors review and request revision of portions of the generic protocols)
  • prepare test / QA plans for the acquisition of data to verify the performance of the vendors' technologies

The protocols and test / QA plans describe the experimental approach, with clearly stated test objectives and associated quality objectives for the related measurements.


1.2 Systematic planning for verification testing

  • Organizations that participate in the test shall participate in the planning.
  • The scope and objectives of the verification testing and the desired action or result from the work shall be defined.
  • The data to be collected to achieve verification shall be identified, and the QA and QC requirements to establish the quality of the data shall be defined.
  • Verification tests shall undergo a design process.
  • Verification tests shall be documented.
  • Equipment, operators, and skill levels required for the verifications shall be identified.
  • Any constraints (e.g., time and budget) shall be identified.
  • Conditions that will suspend work shall be identified.
  • Assessment tools shall be determined.
  • Methods and procedures for storing, retrieving, analyzing, and reporting the data shall be identified.
  • Methods and procedures for minimizing, characterizing, and disposal of hazardous waste generated during the test shall be identified.
    1.2.1 Planning personnel

    The verification organization shall coordinate test planning among the participating organizations including EPA, the stakeholders, the vendors, and any testing organizations and laboratories participating in the test. The verification organization, with the concurrence and oversight of the EPA center manager, shall identify the planning roles of the various players, and shall conduct planning activities by shared communication via teleconference, video conference, and in-person meetings, as appropriate, and within the constraints of the budget.

    1.2.2 Purpose, scope and objectives

    The purpose of this testing is to verify the performance of commercial-ready technologies. Another objective is to develop an efficient method for testing commercial-ready technologies. Many of the centers accomplish this objective by preparing generic verification protocols whereby the performance of similar technologies can be verified in the future using the same protocol. The characteristics of individual technologies and the specifics of individual tests are described in the test / QA plan. For some tests the technologies are sufficiently similar that more than one product in the same technology area is tested under the same test / QA plan. Depending on the technology and the test, technologies may be tested on multiple occasions. The testing experience may be used to refine the generic verification protocol (GVP).

    1.2.3 Data to be collected and design of experiment

    During planning of the technology verification test, the process, environmental, laboratory, response, and QA data to be collected are identified. Also identified are testing organizations, test personnel, skill levels, methods, procedures, and equipment unique to each verification test. Planning is integrated into design as discussed in part B section 2.0.

    1.2.4 Documentation and reporting

    Records generated during the verification tests are listed in part A section 5.0. Records consist of both paper and electronic records. Electronic methods for storing, retrieving, analyzing, and reporting the data are generally commercially available programs for word processing, spreadsheet, or database processing, or commercial software developed especially for data collection and processing on a specific instrument or piece of equipment. Centers may also develop software / hardware configurations, as appropriate, in their technology verification tests. The use of computer hardware and software is discussed in part A section 6.0. Paper records such as field notebooks, bench sheets, field data sheets, custody sheets, and instrument printouts are part of the raw data test record and kept with the study records.

    1.2.5 Assessments

    The assessment tools and minimum frequencies of assessments for the verification tests are identified in part A section 9.0. The definitions of the assessment tools and suggested frequencies are given in part B section 4.0.

    1.2.6 Constraints, suspension of work, waste minimization and disposal

    Verification organizations work under the constraints of time and resources communicated to them by the ETV director and the EPA center manager. When constraints are determined by the verification organization to affect quality, the resolution of the problem proceeds as described in part A section 1.5. Circumstances under which work can be suspended are discussed in part A section 1.6. If waste is generated as part of the verification testing, the verification organization seeks to minimize the amount, and disposes of it in accordance with applicable local, state, and federal laws.


2.0 DESIGN OF TECHNOLOGY VERIFICATION TESTS
 

2.1 Design process

The design shall incorporate those activities pertaining to verification of performance identified during the planning process, establish test specifications, and identify appropriate controls. The design shall include:

  • Selection of field sampling or testing equipment, and its operational parameters, as appropriate
  • Selection of field sampling or testing methods, as appropriate
  • Sample types, numbers, quantities, handling, packaging, shipping, and custody, if applicable
  • Sampling locations, storage, and holding times, if applicable
  • Selection of analytical methods, quality measures of performance, analysis providers, if applicable
  • Requirements for calibration standards, and performance evaluation samples, as appropriate
  • Requirements for field and/or laboratory QA/QC activities
  • Requirements for qualifications of testing, sampling and/or analysis personnel
  • Protection of health and safety of test personnel and the public
  • Readiness reviews prior to data collection
  • Assessments required including technical and performance audits, audits of data quality, and assessments of data use limitations
  • Data reporting requirements
  • Methods for validating and verifying the data
  • Requirements for data security, archival, and retention
  • Integration of time and schedule constraints
  • Procedures for minimization or disposal of wastes generated during verification activities

2.1.1 Designers

The design of an ETV verification test is provided by a team that includes stakeholders, EPA staff, verification organization staff, and vendors. The output of the design process is a test / QA plan that details the planned tests and documents the rationale, assumptions, and personnel involved. EPA provides guidance for writing test / QA plans (EPA G-5). For some classes of technologies the output may be a generic verification protocol (GVP).

2.1.2 Objectives

The goal of an ETV verification test is the production of high-quality testing data for use by a decision maker in determining the appropriateness of the technology for the intended use. In designing technology performance verification operations, designers use a modification of the EPA's data quality objectives (DQO) process (EPA G-4) for those ETV verification tests for which quantitative goals can be specified. A modification is required because the DQO process is geared toward making a decision. The ETV program does not participate in making ranking decisions regarding the technologies. It provides unbiased reporting of performance through testing. The objective of the design process is to identify and harmonize all components necessary to conduct a successful test.

2.1.3 Design process and components

The planning process considers selection of test parameters, availability of test equipment, availability of testing personnel, optimal test procedures, and the necessary and sufficient data quality indicators for test measurements. The verification test design takes into account constraints of time, scheduling, and resources.

The product of the design process is a test/QA plan.

  • The plan documents the process and assumptions used for planning, as well as those persons responsible for the planning.
  • The plan specifies the field and laboratory tests to be conducted, the baseline parameters, the number of replicate tests, and the controls.
  • Field and laboratory equipment and optimal operating parameters are specified.
  • If the testing involves samples, the plan specifies sampling methods, sample types, numbers, quantities, handling, packaging, shipping, and custody. Also specified are sample locations, storage conditions, and holding times.
  • Analysis methods, quantitative measures of performance, calibration standards, calibration check standards, and performance evaluation samples, as appropriate, and as identified in the planning process, are incorporated into the design.
  • Methods and procedures are included to ensure the test produces data of known and acceptable quality.
  • The design incorporates any other field or laboratory QA/QC activities identified by planners.
  • The design specifies the requirements for qualifications of technical staff responsible for obtaining, analyzing, and evaluating the data. Protection of the health and safety of testing personnel and the public is incorporated into the design.
  • Procedures for the minimization and disposal of wastes generated are designed into the verification activities.

2.1.4 Assessments

Assessments incorporated into the design include self-assessments (internal audits) by the verification organization and independent assessments by EPA. The assessments identified in the planning process are incorporated into the design. The type and minimum number of assessments are identified in part A section 9.0. A suggested schedule of assessments is given in part B section 4.0.

2.1.5 Validating, reporting, securing, and archiving data

Data are verified by the data collectors and independently validated by technical assessors as indicated under Audits of Data Quality in part A, Table 9.1. Data are reported in ETV verification reports and ETV verification statements. Data records are stored as discussed in part A section 5.0 and in Appendix A.


2.2 Generic verification protocols and test/QA plans: planning documents from the design process


Planning documents from the design process include generic verification protocols and test/QA plans.

Writing planning documents is generally a lengthy process involving iterations of review and revision. Authors should be knowledgeable about the activity and the equipment described in the planning documents. Two types of planning documents have been identified as the core documentation needed for operation of an ETV center: the generic verification protocol and the test / QA plan. The generic verification protocol is meant to promote uniform testing for a single center and, therefore, is considered a more general document. The test / QA plan contains the specific information needed to conduct a verification test.

2.2.1  Generic verification protocols provide the necessary framework for development of the more detailed test/QA plan. The specific content and level of detail given in generic verification protocols may vary between centers. For some centers, the generic verification protocol may be so detailed that the test / QA plan may require very little additional information. Given the variable nature of the generic verification protocol, no specific format has been proposed.

The issues that may be addressed in the generic verification protocol are the following:

  • General description of the center
  • Responsibilities of all involved organizations
  • Experimental design
  • Equipment capabilities and description
  • Description and use of field test sites
  • Description and use of laboratory test sites
  • QA / QC
  • Data handling
  • Requirements for other documents
  • Health and safety
  • References

The QA / QC section of the generic verification protocol typically describes the activities that verify the quality and consistency of the work and provides data quality descriptors, such as accuracy, precision, representativeness, completeness, comparability, and detection limit, as appropriate. Preparation and use of appropriate QA procedures, such as QC samples, blanks, split and spiked samples, and performance evaluation (PE) samples, to verify performance of the technology being tested can be described. The frequency of calibrations and QC checks, and the rationale for them, can be described. Procedures for reporting QC data and results can be given. The protocol can also specify who is responsible for each QA activity and who has the responsibility for identifying and taking corrective action. However, if these items vary between tests within a given center, the more appropriate document in which to describe them may be the test / QA plan.
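
As a worked illustration of the quantitative descriptors named above, the sketch below computes accuracy as percent recovery of a spiked sample, precision as the relative standard deviation of replicate results, and completeness as the percentage of planned measurements judged valid. The formulas are the conventional definitions; the numbers are invented.

    # Illustrative calculation of three common data quality descriptors.
    import statistics

    def percent_recovery(unspiked, spiked, true_spike):
        """Accuracy: percent recovery of a known spike added to a sample."""
        return 100.0 * (spiked - unspiked) / true_spike

    def relative_std_dev(replicates):
        """Precision: relative standard deviation (%) of replicate results."""
        return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

    def completeness(valid, planned):
        """Completeness: percentage of planned measurements judged valid."""
        return 100.0 * valid / planned

    print(round(percent_recovery(unspiked=2.0, spiked=6.9, true_spike=5.0), 1))  # 98.0
    print(round(relative_std_dev([5.1, 5.0, 5.2]), 1))                           # 2.0
    print(completeness(valid=47, planned=50))                                    # 94.0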

The protocol may cite documents or procedures that explain, extend, and/or enhance the protocol such as related procedures, the published literature, or methods manuals. The specific location of any reference not readily available from a full citation in the reference section should be given (as in a facility-specific standard operating procedure) or attached to the protocol.

2.2.2  Test/QA plans contain the following elements as given in Guidance on Quality Assurance Project Plans (QAPPs), G-5. (In the ETV QMP, a test/QA plan is identical to a QAPP.) Not all elements listed are appropriate to every test. The test/QA plan will note and explain those elements that are not applicable. EPA takes a graded approach to the level of detail expected in a test/QA plan. For highly visible programs such as ETV, a higher level is expected.

Group A: Project Management

This group of QAPP elements covers the general areas of project management, project history and objectives, and roles and responsibilities of the participants. The following nine elements ensure that the project's goals are clearly stated, that all participants understand the goals and the approach to be used, and that project planning is documented:

  • A1 Title and Approval Sheet
  • A2 Table of Contents and Document Control Format
  • A3 Distribution List
  • A4 Project/Task Organization and Schedule
  • A5 Problem Definition/Background
  • A6 Project/Task Description
  • A7 Quality Objectives and Criteria for Measurement Data
  • A8 Special Training Requirements/Certification
  • A9 Documentation and Records

Group B: Measurement/Data Acquisition

This group of QAPP elements covers all of the aspects of measurement system design and implementation, ensuring that appropriate methods for sampling, analysis, data handling, and QC are employed and will be thoroughly documented:

  • B1 Sampling Process Design (Experimental Design)
  • B2 Sampling Methods Requirements
  • B3 Sample Handling and Custody Requirements
  • B4 Analytical Methods Requirements
  • B5 Quality Control Requirements
  • B6 Instrument/Equipment Testing, Inspection, and Maintenance Requirements
  • B7 Instrument Calibration and Frequency
  • B8 Inspection/Acceptance Requirements for Supplies and Consumables
  • B9 Data Acquisition Requirements (Non-Direct Measurements)
  • B10 Data Management

Group C: Assessment/Oversight

The purpose of assessment is to ensure that the QAPP is implemented as prescribed. This group of QAPP elements addresses the activities for assessing the effectiveness of the implementation of the project and the associated QA/QC activities:

  • C1 Assessments and Response Actions
  • C2 Reports to Management

Group D: Data Validation and Usability

Implementation of Group D elements ensures that the individual data elements conform to the specified criteria, thus enabling reconciliation with the project's objectives. This group of elements covers the QA activities that occur after the data collection phase of the project has been completed:

  • D1 Data Review, Validation, and Verification Requirements
  • D2 Validation and Verification Methods
  • D3 Reconciliation with Data Quality Objectives

The generic verification protocol, if one exists, may be incorporated by reference.

If another level of detail is required for describing test activities, for example, the operation of an instrument, a standard operating procedure may be written and attached to the test / QA plan. The following topics, from EPA QA/G-6, Guidance for Development of Standard Operating Procedures (SOPs), may be included (or a reference provided) in the standard operating procedure:

  • Title Page
  • Table of Contents
  • Procedures - The following topics may be appropriate for inclusion in technical SOPs; not all will apply to every procedure or work process detailed:
    • Scope & Applicability (describing the purpose of the process or procedure and any organizational or regulatory requirements),
    • Summary of Method (briefly summarizing the procedure),
    • Definitions (identifying any acronyms, abbreviations, or specialized terms used),
    • Health & Safety Warnings (indicating operations that could result in personal injury or loss of life and explaining what will happen if the procedure is not followed or is followed incorrectly; listed here and at the critical steps in the procedure),
    • Cautions (indicating activities that could result in equipment damage, degradation of sample, or possible invalidation of results; listed here and at the critical steps in the procedure),
    • Interferences (describing any component of the process that may interfere with the accuracy of the final product),
    • Personnel Qualifications (denoting the minimal experience the SOP follower should have to complete the task satisfactorily, and citing any applicable requirements, like certification or "inherently governmental function"),
    • Equipment and Supplies (listing and specifying, where necessary, equipment, materials, reagents, chemical standards, and biological specimens),
    • Procedure (identifying, in order, all pertinent steps and the materials needed to accomplish the procedure, such as:
      • Instrument or Method Calibration and Standardization
      • Sample Collection
      • Sample Handling and Preservation
      • Sample Preparation and Analysis (such as extraction, digestion, analysis, identification, and counting procedures)
      • Troubleshooting
      • Data Acquisition, Calculations & Data Reduction Requirements (such as listing any mathematical steps to be followed)),
    • Computer Hardware & Software (used to store field sampling records, manipulate analytical results, and/or report data), and
    • Data and Records Management (e.g., identifying any forms to be used, reports to be written, and data and record storage information).
  • Quality Control and Quality Assurance Section - QC activities are designed to allow self-verification of the quality and consistency of the work. Describe here the preparation of appropriate QC procedures (self-checks, such as calibrations, recounting, reidentification) and QC material (such as blanks - rinsate, trip, field, or method; replicates; splits; spikes; and performance evaluation samples) that are required to demonstrate successful performance of the method. Specific criteria for each should be included. Describe the frequency of required calibration and QC checks and discuss the rationale for decisions. Describe the limits/criteria for QC data/results and actions required when QC data exceed QC limits or appear in the warning zone. Describe the procedures for reporting QC data and results.
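The control-limit logic described in the last item can be made concrete with a short script. The following is a minimal sketch, not part of any ETV SOP: it assumes the common Shewhart convention of 3-sigma control limits and 2-sigma warning limits, with the center line and sigma estimated from prior QC results (all values are hypothetical).

    # Classify a QC result against hypothetical control and warning limits.
    from statistics import mean, stdev

    def classify_qc_result(value, center, sigma):
        """Apply Shewhart-style limits: 3-sigma control, 2-sigma warning."""
        deviation = abs(value - center)
        if deviation > 3 * sigma:
            return "OUT OF CONTROL - stop, investigate, document corrective action"
        if deviation > 2 * sigma:
            return "WARNING ZONE - flag the result and watch for trends"
        return "IN CONTROL"

    # Example: a calibration-check standard with prior QC results.
    history = [49.8, 50.3, 49.9, 50.1, 50.2, 49.7, 50.0]
    center, sigma = mean(history), stdev(history)
    for result in [50.1, 50.5, 51.2]:
        print(result, "->", classify_qc_result(result, center, sigma))

An actual SOP would take its limits, criteria, and corrective actions from the approved test/QA plan rather than computing them ad hoc.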


3.0 IMPLEMENTATION OF PLANNED OPERATIONS

3.1 Implementation of planning

Environmental data operations shall be implemented according to the approved planning documents. Deviations shall be documented and reported to and evaluated by management. Approved changes shall be made and distributed to test personnel to replace previous versions of the documents.

Technology performance verifications are implemented according to the generic verification protocols and test/QA plans prepared during planning. During implementation, changes are incorporated, reviewed, and approved according to the scheme discussed in part A section 5.0. Test personnel have access to the approved planning documents, approved changes to planning documents, and all referenced documents. The final protocols are posted on the ETV web page for future use in similar technology verifications.

All implementation activities are documented. Suitable documents are bound notebooks, field and laboratory data sheets, spreadsheets, computer records, and output from instruments (both electronic and hard copy). All documentation is developed as described in the planning documents. All implementation activities are traceable to the planning documents and to test personnel.


3.2 Services and items

Only qualified and accepted services and items shall be used in the performance verification operations. Acceptance shall be identified on the items themselves and/or in documents traceable to the items. Tools, gauges, instruments, and other sampling, measuring, and testing equipment used for activities affecting quality shall be controlled as required and, at specified intervals, calibrated to maintain accuracy within specified limits. Documentation of calibration shall be maintained and shall be traceable to the equipment. Periodic preventive and corrective maintenance of equipment shall be performed, and equipment shall be recalibrated prior to use.

ETV program services are delivered by the verification organizations. The verification organizations are accepted via the solicitation, proposal, and extramural agreement process as discussed in part A section 4.0.

Qualified and accepted services and items used in testing are provided for in the verification organization quality systems. The center quality management plan contains provisions for acceptance of services and items and for documentation of acceptance. Control of equipment, calibration to maintain accuracy within specified limits, maintenance, and documentation are the responsibility of the verification organization. The verification organization verifies that the tools, gauges, instruments, and any other sampling, measuring, and testing equipment used for activities affecting quality are controlled as required by the planning documents and calibrated at specified intervals to maintain accuracy within specified limits. Equipment found to be out of specification is not used without documented repair and reassessment of performance. All maintained and repaired equipment is recalibrated as necessary before it is used for measurement work.
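As one illustration of the calibration control described above, the short sketch below checks whether an instrument is within its calibration interval before it is released for use. The record fields, identifiers, and dates are hypothetical; real records and intervals come from the verification organization's quality system.

    # Pre-use calibration check (hypothetical records and field names).
    from datetime import date, timedelta

    instruments = [
        {"id": "FLOW-01", "last_calibrated": date(2002, 9, 15), "interval_days": 90},
        {"id": "PH-07",   "last_calibrated": date(2002, 3, 1),  "interval_days": 180},
    ]

    def calibration_current(record, on_date):
        """True if on_date falls within the instrument's calibration interval."""
        due = record["last_calibrated"] + timedelta(days=record["interval_days"])
        return on_date <= due

    today = date(2002, 12, 1)
    for rec in instruments:
        status = "OK for use" if calibration_current(rec, today) else "DO NOT USE - recalibrate"
        print(rec["id"], status)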

Oversight is the responsibility of EPA and is conducted through review and acceptance of the verification organization quality system documents and the center quality management plan, and through independent audits. All of the requirements for quality of goods and services imposed on verification organizations pass through to their subcontractors.


3.3 Field and laboratory samples

Handling, storage, cleaning, packaging, shipping, and preservation of field and laboratory samples shall be performed according to required specifications, protocols, or procedures to prevent damage, loss, deterioration, artifacts, or interference. Sample chain of custody shall be tracked and documented.

If samples for analysis are taken in the field, they are to be handled according to procedures in the test/QA plan. The oversight responsibility of EPA is to determine that the approved quality systems and verification plans contain adequate procedures for handling, storage, cleaning, packaging, shipping, and preservation of field and laboratory samples to prevent damage, loss, deterioration, artifacts, or interference. The verification organization provides adequate chain of custody procedures, as required.


3.4 Data and information management

Data or information management, including transmittal, storage, validation, assessment, processing, and retrieval, shall be performed in accordance with the approved instructions, methods, and procedures. ETV program records and the procedures for handling them are listed in part A section 5.0.


4.0 ASSESSMENT AND RESPONSE

4.1 Assessment types

4.1.1 Quality Systems Audits

A quality systems audit (QSA) is an on-site review of the implementation of a verification organization quality system as documented in the ETV Center's approved QMP. This review is used to verify the existence of, and evaluate the adequacy of, the internal quality system. A QSA may be a self-assessment or an independent assessment. Since quality systems audits most effectively identify problems when they are conducted early, they should be performed in the year following approval of the QMP. See part A section 9.0 for required frequency. Guidance is available for conducting QSAs (EPA QA/G-2).

4.1.2 Technical Systems Audits

A technical systems audit (TSA) is a qualitative on-site evaluation of sampling and/or measurement systems. The objective of the TSA is to assess and document the acceptability of all facilities, maintenance, calibration procedures, reporting requirements, sampling and analytical activities, and quality control procedures. An approved test/QA plan provides the basis for the TSA. Self TSAs are conducted by the verification organization, and independent TSAs are conducted by EPA as required by part A section 9.0. Assistance for the TSA may be available to EPA ETV center quality managers from QA support contractors. TSAs are most useful when conducted early in the life cycle of a project, when corrective actions (if necessary) can be performed to minimize any loss of data. Guidance is available for conducting TSAs (EPA QA/G-7).

4.1.3 Audits of Data Quality

An audit of data quality (ADQ) is an examination of the data after they have been collected and 100% verified by project personnel. Assessing whether the Data Quality Indicator (DQI) goals specified in the test/QA plan were met requires a detailed review of the recording, transferring, calculating, summarizing, and reporting of the data. ADQs are conducted as required by part A section 9.0. Self ADQs are conducted by the verification organization, and independent ADQs are conducted by EPA. Assistance for the ADQ may be available to EPA ETV center quality managers from QA support contractors. Guidance is available for conducting ADQs (EPA QA/G-9).

4.1.4 Performance Evaluation Audits

A performance evaluation audit (PEA) is a quantitative evaluation of a measurement system. Although each measurement in a test program could be subjected to a performance evaluation, the critical measurements (designated in the test/QA plan) are more commonly evaluated. An evaluation of a measurement system usually involves the measurement or analysis of a reference material of known value or composition. The value or composition of reference materials must be certified or verified prior to use, and the certification or verification must be adequately documented. Ideally, the identity of the reference material is disguised so that the operator or analyst will treat the material no differently than a test program sample. PEAs are conducted as required in part A section 9.0.
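For a quantitative measurement, the core of a PEA is an arithmetic comparison of the blind result with the certified reference value. The sketch below computes percent recovery against a hypothetical 90-110% acceptance window; actual criteria and reference values are set in the test/QA plan, and all numbers here are illustrative.

    # Percent recovery for a PEA sample (all values hypothetical).
    def percent_recovery(measured, certified):
        return 100.0 * measured / certified

    certified_value = 25.0   # certified value of the reference material
    measured_value = 23.4    # blind result reported by the analyst
    recovery = percent_recovery(measured_value, certified_value)
    acceptable = 90.0 <= recovery <= 110.0
    print(f"Recovery: {recovery:.1f}% -> {'pass' if acceptable else 'fail'}")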


4.2 Assessment frequency

Activities performed during technology verification performance operations that affect the quality of the data shall be assessed regularly, and the findings reported to management to ensure that the requirements stated in the generic verification protocols and the test/QA plans are being implemented as prescribed.

Because of the high visibility of ETV testing, systematic planning should provide sufficient auditing to ensure the integrity of the data. The types and minimum frequency of assessments for the ETV programs are listed in part A section 9.0. The target minimum types and numbers of assessments for verification tests are the following:

  • quality systems audit - self-assessments as provided in the center quality management plan, and one independent assessment by EPA;
  • technical systems audits - self-assessments for each test as provided for in center QMPs and test/QA plans, and independent assessments by EPA, a minimum of one per year per center as applicable. Applicability means that the verification testing occurring at the center is at a stage where auditing is feasible;
  • performance evaluation audits - self-assessments as applicable for each test as provided in the test/QA plan, and independent assessments by EPA, as appropriate and applicable. Applicability means that the verification has a quantitative measurement parameter capable of being audited;
  • audits of data quality - self-assessments of at least 10% of all the verification data from each test, and independent assessments by EPA, as applicable. "10% of all the verification data" means a random selection of 10% of the data from all of the measured parameters (see the illustrative sketch below).

In cases where the target minimums appear excessive to the verification test planners, their professional judgment will prevail. Additional assessments may be included in individual test/QA plans. Assessments by the verification organization occur on a continuing and stable basis, as provided in Table 9.1. EPA center quality managers receive and review self-assessment reports and subcontractor assessment reports provided by verification organizations.
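The sketch below illustrates one way to draw the 10% ADQ sample: records are grouped by measured parameter, and a reproducible random 10% is taken from each group so that all parameters are represented. The record layout, parameter names, and seed are hypothetical; this is an illustration of the selection rule, not a prescribed procedure.

    # Random 10% selection for an audit of data quality, stratified by parameter.
    import random

    def select_adq_sample(records, fraction=0.10, seed=42):
        rng = random.Random(seed)  # fixed seed makes the selection reproducible/auditable
        by_parameter = {}
        for rec in records:
            by_parameter.setdefault(rec["parameter"], []).append(rec)
        sample = []
        for recs in by_parameter.values():
            n = max(1, round(fraction * len(recs)))  # at least one record per parameter
            sample.extend(rng.sample(recs, n))
        return sample

    # Hypothetical verification data: 30 records for each of three parameters.
    records = [{"parameter": p, "value": i}
               for p in ("turbidity", "pH", "flow") for i in range(30)]
    print(len(select_adq_sample(records)), "of", len(records), "records selected")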


4.3 Response to assessment


Appropriate corrective actions shall be taken, and their adequacy verified and documented, in response to the findings of the assessments. Data found to have been obtained from non-conforming equipment shall be evaluated to determine the impact on data quality. The impact and the action taken shall be documented.

Assessments are conducted according to procedures contained in the verification organization quality systems or the quality procedures available to EPA personnel, as discussed in part A, section 9.0. Findings are provided in audit reports. Responses to adverse findings are required within 10 working days of receiving the audit report. Follow-up by the auditors and documentation of response are required.


5.0 ASSESSMENT AND VERIFICATION OF DATA USABILITY

5.1 Data verification and validation

Data obtained during verification tests shall be assessed, verified, and qualified according to their intended use (as verification performance data). Any limitations on this intended use shall be expressed (quantitatively to the extent practicable) and shall be documented in the ETV verification report.

Data are verified by the data collector. Data verification procedures are specified in the centers' quality management systems. Audits of data quality are used to validate data at the frequency cited in Table 9.1 and are documented in the data audit report. The goal of an audit of data quality is to determine the usability of test results for reporting technology performance, as defined during the design process. Validated data are reported in the ETV verification reports and ETV verification statements, along with any limitations on the data and recommendations for limitations on data usability. All validated data arising from testing under the ETV program are disclosed in verification reports, even if the technology did not perform to the expectations of the technology provider.


5.2 Existing data 

Any data obtained from sources that did not use a quality system equivalent to the E4 Standard shall be assessed according to approved and documented procedures.

Existing data may be used for planning, subject to the individual rules set up by each center. Data collected outside an ETV test and used for verification are subject to rigorous scrutiny according to the procedure in Appendix C.


5.3 Reports reviewed


ETV verification reports containing data and reporting the results of technology verification performance shall be reviewed independently (i.e., by persons other than those who produced the data or the reports) to confirm that the data or results are presented correctly. These reports shall be approved by management prior to release, publication, or distribution.

The procedure for ETV verification report and ETV verification statement review and approval is given in part A section 5.0. ETV verification reports are peer-reviewed through the EPA ORD peer review process. ETV verification statements are signed by the respective EPA laboratory directors and the verification organization representative.


REFERENCES


American Society for Quality Control. American National Standard Specifications and Guidelines for Quality Systems for Environmental Data Collection and Environmental Technology Programs, ANSI/ASQC E4-1994. American Society for Quality, 1994.

Environmental Technology Verification Policy Compendium. Unpublished work.

Environmental Technology Verification Program Quality and Management Plan for the Pilot Period (1995-2000), EPA/600/R-98/064. Cincinnati, OH: U.S. Environmental Protection Agency, 1998.

Environmental Technology Verification Program Verification Strategy, EPA/600/K-93/003. Washington, DC: U.S. Environmental Protection Agency, 1997.

Guidance for the Data Quality Objectives Process, EPA QA/G-4, EPA/600/R-96/055. Washington, DC: U.S. Environmental Protection Agency, 2000.

Guidance for Quality Assurance Project Plans, EPA QA/G-5. Washington, DC: U.S. Environmental Protection Agency, 1998.

Guidance for the Preparation of Standard Operating Procedures (SOPs) for Quality Related Documents, EPA QA/G-6, EPA/240/B-01/004. Washington, DC: U.S. Environmental Protection Agency, 2001.

Guidance on Technical Audits and Related Assessments, EPA QA/G-7. Washington, DC: U.S. Environmental Protection Agency, 2000.

Guidance for Data Quality Assessment, EPA QA/G-9, EPA/600/R-96/084. Washington, DC: U.S. Environmental Protection Agency, 2000.

Integrated Information and Quality Management Plan (HQMP) for the National Exposure Research Laboratory, DCN NERL HQMP No. 1, Spring 2002.

Quality Management Plan for the National Risk Management Research Laboratory (NRMRL), DCN NRMRLQA 001, rev. 1, NRMRL QMP.

Simes, G. F. Preparation Aids for the Development of Category II Quality Assurance Project Plans, EPA/600/8-91/004. Cincinnati, OH: U.S. Environmental Protection Agency, 1991.


APPENDIX A


U.S. EPA RECORDS CONTROL SCHEDULE

Applicable records schedules include the following:

EPA Series No.   Title
003              Grants and Other Program Support Agreements
006              Program Management Files (Agency-wide All Programs)
185              Quality Assurance Project Plans (Agency-wide All Programs)
202              Contract Management Records (Agency-wide All Programs except Superfund Site Specific)
258              Final Deliverables and Reports (Agency-wide All Programs)

Consult the National Records Management Program (NRMP) website (EPA-NRMP) for the most recent information on EPA records management.


APPENDIX B


What Constitutes Success for ETV?

Timing

  • No more than one year for verification organization selection.
  • No more than one year after verification organization selection for completing the organizational phase (i.e., stakeholder selection, technology prioritization, initial protocol development, stakeholder approval of protocols).
  • For each technology verification event, no more than twelve months (six months for Safe Buildings) between vendor agreement and draft final report (excluding the duration of the test).
  • No more than two months for EPA approval and one month for publication.

Cost of Operation, Testing, Participation

  • Program-wide funding support includes 30% support from sources other than EPA ORD by 2004.

Customer Satisfaction

  • Significant number of States accept ETV data for permitting.
  • Significant number of consulting engineers use ETV data for making technology recommendations.
  • Greater than 70% of vendors surveyed report a positive experience with ETV.
  • Vendors return to test additional technologies under ETV.
  • Applications for testing exceed ETV capacity.

Effects

  • Vendor sales data; technology use data.


APPENDIX C


ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM
EXISTING DATA: POLICY AND PROCESS

Background

The Environmental Technology Verification program was established by the USEPA for the purpose of verifying the performance of commercial-ready technologies for their ability to monitor, prevent, control, or clean up pollution. Verification is accomplished by the evaluation of objectively-collected, quality-assured data which are provided to potential purchasers and permitters as an independent and credible assessment of the performance of a technology. Data are collected and evaluated by independent third party verification organizations chosen from the public sector (such as states), the private sector (such as non-profit research institutions), federal laboratories, and others. EPA provides oversight of the verification organization to assure the credibility of the process and data, and retains authority for the verification process and decision.

The ETV program seeks to identify optimal methods to verify environmental technologies without compromising quality. Stakeholder groups, consisting of representatives of major verification customer groups, advise and assist EPA and the verification organizations in this effort. One consistent and urgent request has been that existing data, i.e., data collected prior to the ETV program, be used for ETV verification. This suggestion is reinforced by the programs of individual states, as well as those of other countries, that routinely consider previously-collected data in the verification of vendor claims for a technology. The purpose of this document is to establish a guideline whereby the ETV program may use these "historical," "existing," or "secondary" data to increase and enhance the scope of individual center projects.


POLICY

Currently, under the ETV program, the verification organization and the technology developers typically plan and execute tests which provide the objective and quality-assured data by which the environmental technologies are evaluated. Existing data are used to support test/QA plan development. Measurements and data are collected in a demonstration of the technology by the developer, under the direction of the verification organization, and overseen by EPA. Reports are peer-reviewed and Verification Statements are issued. In this closely monitored scenario, the origin and quality of the data upon which the verification statement rests are generally known and documented, and therefore the possibility for verification decision error is minimized. The consequences of a serious verification decision error can include verification of fraudulent claims, litigation, and loss of credibility for the ETV program, the verification organizations, and EPA.

Compelling arguments exist for considering the use of certain qualified existing data to replace some or all of the verification testing for a given technology. Some technologies are time-consuming and expensive to evaluate. Due to resource constraints, demonstrations can, at best, show the performance of the technology under only limited conditions. A test may provide only one small performance snapshot in time, as opposed to several years of performance data collected by the developer or its customers under a full range of conditions. Limited resources may require that testing focus on only one component of a technology rather than its full range of capability. Before reaching the commercially viable stage of development, these technologies may have been tested numerous times with acceptably reproducible results.

Judicial precedent provides an argument for the defensible use of existing data. In Daubert v. Merrell Dow Pharmaceuticals, Inc., the Supreme Court in 1993 adopted a new standard for the admissibility of scientific evidence. The Court held that Federal Rule of Evidence 702 requires that, when presented with proposed scientific testimony, the district court must make a preliminary assessment of whether the reasoning or methodology underlying the testimony is scientifically valid, and therefore reliable. The Court declined to adopt a definitive checklist or test, but noted several factors a court should consider. Those factors include:

  1. does the theory or technique involve testable hypotheses;
  2. has the theory or technique been subject to peer review and publication;
  3. are there known or potential error rates and are there standards controlling the technique's operation; and
  4. is the method or technique generally accepted in the scientific community?

The court must also consider the relevance or fit of the proposed testimony by determining if the reasoning and methodology can properly be applied to the facts at issue.

The Clean Air Act Credible Evidence Revisions (see Federal Register, Vol. 62, No. 36, February 24, 1997) provide precedent within the Agency for defensible consideration of existing data for verification use. These revisions clarify that data from methods which are not EPA Standard Reference Methods can be used in enforcement actions and for compliance certification. Conversely, emission sources will be able to use any credible evidence (ACE) in contesting allegations of noncompliance in enforcement actions. As the rule states, it "exemplifies EPA's common sense approach to environmental protection, which encourages smarter, cheaper and more flexible means of achieving environmental goals without compromising the fundamental health and environmental protections provided by federal environmental laws." It follows that if EPA can use ACE for enforcement actions, it can be considered for verification.

Other precedent within the Agency exists at the Office of Air Quality Planning and Standards (OAQPS). OAQPS uses secondary data, defined as data that are utilized for a purpose other than that for which they were initially collected, in its regulatory efforts. In order to effectively focus its quality assurance (QA) efforts within the constraints of available resources, OAQPS concentrates its consideration of secondary data according to category of project. The QA activities associated with evaluating secondary data are conducted to assure that the data will be adequate and sufficient for their planned secondary use.

Recognizing therefore that it is neither prudent nor cost-effective to ignore existing data, the ETV program establishes by this document a consistent process to evaluate these data for the extent of their credibility and usability in the verification decision. Data to be considered for use to replace verification testing undergo a rigorous process of evaluation using stringent criteria. The following guidelines are used to qualify existing data for verification purposes (detailed procedures follow in the "process" section of this document):

  • Data are evaluated by qualified reviewers following the data evaluation process established in the "process" section of this document.
  • The documentation of the candidate data is sufficient to allow the reviewers to assess the quality of the data set and its usability for verification.
  • The data are evaluated to determine that they meet the same minimum quality acceptance criteria as data collected in a comparable ETV center demonstration.
  • All of the data used for a verification must have been objectively collected, independently of the vendor.
  • Only data collected under a well-defined, documented quality system will be considered. Such data sets should contain all the elements required to withstand peer review, and thus be usable for verification.

Recognizing that useful data exist which will not qualify for verification under these guidelines, and responding to customer needs, individual centers may establish individual evaluation criteria by which existing data may be considered. These data may not be used directly for verification, but may be used, for example, to support planning or to augment verification testing. No ETV program-wide guidelines are necessary for the use of existing data for purposes other than for verification.


PROCESS

Identifying and Qualifying the Data

The vendor proposes the data to be evaluated. EPA and the verification organization shall (with input from the stakeholder group, as applicable) identify for the vendor the procedures and acceptance criteria used in the Center verification tests to evaluate technology performance. These procedures and criteria are the same as those used for other technologies evaluated by the verification organization. The data acceptance requirements are developed by EPA, the verification organization, and interested stakeholders, and are not specific to the existing data. The vendor and verification organization perform the initial evaluation.

The vendor shall provide the verification organization with the detailed protocols and test/QA plans used to develop the existing data. The vendor shall identify those data that it believes will meet the acceptance criteria, qualify those data, and submit them along with detailed evidence that the data meet the requirements. The evidence shall be submitted to EPA and the verification organization in a report. The report shall show how the data verify the performance of the technology, identify data that were excluded, explain how and why they were excluded, and address other requirements specific to the center project. The vendor shall be prepared to provide all of the raw data.

The verification organization shall review the vendor's planning documents to determine whether they meet the requirements of those being used by the verification organization for evaluation tests of other technologies. At a minimum, the existing data protocols and test/QA plans shall require the same level of QA/QC, replicate tests, data treatment, and reporting as that required by the verification organization in its technology verification tests. The verification organization shall conduct a detailed review of the vendor's data report to determine whether the data can be used to evaluate the performance of the technology. The verification organization must have access to the raw data and work through a reasonable random sample (a suggested 10 percent of the data). A recommended method for evaluating the data is to trace a random selection of data points from the raw data set to the final report; a sketch of such a check follows.
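The following minimal sketch of the traceability check assumes that raw and reported values are keyed by a common data-point identifier; the data structures, values, and tolerance are hypothetical. It flags any sampled data point whose reported value is missing or does not match the raw record.

    # Trace a random sample of raw data points into the reported values.
    import random

    raw_data = {101: 4.82, 102: 5.10, 103: 4.95, 104: 5.03, 105: 4.88}
    reported = {101: 4.82, 102: 5.10, 103: 4.95, 104: 5.30, 105: 4.88}  # 104 disagrees

    def trace_sample(raw, report, fraction=0.10, tolerance=1e-6, seed=7):
        rng = random.Random(seed)
        n = max(1, round(fraction * len(raw)))
        findings = []
        for key in rng.sample(sorted(raw), n):  # random selection of data points
            if key not in report or abs(raw[key] - report[key]) > tolerance:
                findings.append(key)
        return findings

    discrepancies = trace_sample(raw_data, reported, fraction=0.4)
    print("Discrepant data points:", discrepancies or "none")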

Minimum General Acceptance Criteria

  • The technology is based on sound scientific and engineering principles.
  • The conditions under which the data were collected are clearly defined and were appropriate for the demonstration of the capabilities of the technology.
  • The data are quality assured. For example, where appropriate, the documentation provides a measure of the bias and precision of the measurements. Where needed, minimum detection limits have been determined and reported. Where applicable, the measurement range of the technology is given. A narrative statement will include a discussion of how well the data represent the capabilities of the technology in its intended environmental application. (A worked example of the bias and precision measures follows this list.)
  • Sufficient data are available to support the performance verification of the technology. Sufficiency of the data will be determined by the EPA and verification organization reviewers.
  • The data may not have been produced by the vendor. Vendor-generated data may be reviewed as part of the evaluation process because they are a rich source of knowledge about the technology. Only data collected objectively and independently of the vendor, however, may be used to replace verification testing.
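The bias and precision measures referred to in the quality-assurance criterion above can be computed directly from replicate measurements of a reference material. The following is a small worked example with hypothetical values, intended only to illustrate the arithmetic.

    # Bias and precision (relative standard deviation) from replicates
    # of a reference material with a known value (values hypothetical).
    from statistics import mean, stdev

    reference_value = 100.0
    replicates = [98.7, 101.2, 99.5, 100.8, 99.9]

    bias = mean(replicates) - reference_value                      # systematic error
    percent_bias = 100.0 * bias / reference_value
    precision_rsd = 100.0 * stdev(replicates) / mean(replicates)   # random error

    print(f"Bias: {bias:+.2f} ({percent_bias:+.2f}%)")
    print(f"Precision (RSD): {precision_rsd:.2f}%")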

Specific Acceptance Criteria

In addition to the general acceptance criteria, the specific center stakeholders may impose specific acceptance criteria which must be as stringent as the acceptance criteria for the data collected during verification testing.

Convening the Data Evaluation Panel

If the verification organization determines that the report does not present data of sufficient quality and quantity to evaluate the performance of the technology, the vendor is notified and no further action is required. If the verification organization determines that the vendor's report does provide data of sufficient quality and quantity, then a data evaluation panel (DEP) is appointed. The verification organization enlists the services of three qualified reviewers to serve on the DEP. The DEP will generally consist of one person from EPA, one person from the verification organization, and one person who is an outside expert in the technology being evaluated. The DEP must contain members who are credible, experienced, knowledgeable, and qualified in the technical areas critical to the technology being evaluated. The members of the DEP must be objective and have no real or perceived conflict of interest with the commercial developer of the technology they are evaluating. DEP members must be independent; they cannot have been involved in the collection of the data being evaluated. When the submitted data are proprietary, confidentiality agreements are provided.

Evaluation of the Data by the DEP

The DEP reviews and agrees on the data acceptance criteria and determines their applicability to the data being evaluated. The evaluation shall follow the procedures and criteria developed by the verification organization and EPA for other technology verifications conducted by the center.

The verification organization provides a written summary of its review to the DEP. The DEP reviews and evaluates the data using the agreed-upon acceptance criteria. The DEP determines that the data were gathered following appropriate test protocols similar to the protocol used for verification testing, and ensures that the data were gathered following written test/QA plans developed using a similar protocol. Planning must have included specific test objectives, experimental design, criteria for data quality, QA/QC procedures followed and reported, number of samples or frequency of sampling, and sampling and analytical procedures. The DEP must determine that the data quality meets or exceeds the minimum data quality requirements of the verification testing conducted by the center.

The quality and usability of the existing data shall be evaluated against clearly defined data quality requirements based on the data quality requirements of the ETV center verification testing. The data shall be sufficient to evaluate the performance of the technology.

Recommendations for Acceptance of Data for Use in a Performance Verification Role

The DEP shall prepare a report on its findings. At a minimum the report must address the following:

  • Were the data collected by following the protocol and test/QA plan provided by the vendor?
  • Do the data meet the minimum QA/QC requirements of the ETV center verification tests?
  • Do the data adequately support the performance verification of the technology? Are there enough data, and are the data of sufficient quality for the verification organization, the ETV program, and EPA to place their reputations on the line?

The DEP report provides a written statement of the performance of the technology as provided by the data, a statement of how well the data meet the acceptance criteria, and a data acceptance recommendation.

Review and Acceptance of Recommendation by Verification Organization and EPA

The EPA center manager and the EPA ETV director review the report, determine whether to accept the data acceptance recommendation, and, if it is accepted, allow the verification to go forward using the existing data. An Environmental Technology Verification Report with an accompanying Verification Statement will be prepared, reviewed, approved, and signed.


APPENDIX D


RECOMMENDED LANGUAGE FOR SOLICITATION OF VERIFICATION ORGANIZATIONS

Verification organizations in the ETV program are solicited via cooperative agreements, interagency agreements, or contracts. Appropriate language must be incorporated into the solicitation and/or the award/agreement documentation by Grants Administration or Contracts Management. The following language and supporting documentation are recommended for inclusion in the solicitation for the verification organization, whether competitive or non-competitive.

Quality Assurance Requirements

The awardee shall comply with the following:

Before award, the proposal shall include a copy of the offeror's quality management plan describing the quality system that provides the framework for planning, implementing, and assessing work performed to carry out the required quality assurance and quality control activities.

After award, the awardee must submit a Quality Management Plan (QMP) prepared in accordance with the EPA Requirements for Quality Management Plans (QA/R-2) and the requirements described in the latest version of the Environmental Technology Verification Program Quality Management Plan (ETV QMP). The center QMP must be approved by the EPA center manager and the EPA center quality manager before testing begins.

 
