This web site was copied prior to January 20, 2005. It is now a Federal record managed by the National Archives and Records Administration. External links, forms, and search boxes may not function within this collection.

Executive Summary

Final Minutes from Meeting

Methods and Data Comparability Board

May 11th through the 13th, 1999

Cincinnati, OH



A total of 48 participants attended all or part of the meeting, either by phone or in person. Federal agencies represented included EPA, USGS, ACOE, and DOE. States and interstate agencies represented included Virginia, the Delaware River Basin Commission (New Jersey), Wisconsin, Arizona, and New York. Other monitoring interests represented were the American Society for Testing and Materials (ASTM), Standard Methods, the Association of Metropolitan Sewerage Agencies (AMSA), East Bay Municipal Utility District, Eastman Chemical, the Chemical Manufacturers Association (CMA), Montgomery Watson, Merck, Hach, Tetra Tech, Argonne National Laboratory, VG Elemental, the Waste Policy Institute (WPI), the Environmental Technical Commercialization Center (ETC2), DynCorp, the Association of Public Health Laboratories (APHL), the National Institute of Standards and Technology (NIST), and Kent State University.



Outreach Workgroup Meeting

            The Board’s public web site was discussed. The Board will provide additional comments on the proposed web site and slide show during May. The web site will be made public in June, and the site uniform resource locator (URL) will be advertised beginning in July.


Funding and CRADAs Discussion

            USGS and USEPA remain very supportive of the Board’s efforts; however, sufficient funding to support all planned Workgroup efforts is not currently available. Additional sources of funding must be located. Pre-proposals for Workgroup funding should be prepared to generate potential funding interest from other interested entities, possibly through CRADAs or grants.


Accreditation Workgroup Meeting

            An outline was developed for a position paper on accreditation of federal laboratories. The workgroup plans to have a draft position paper done by October. Sections of the position paper were assigned to Workgroup members for preparation. The Workgroup will draft a letter to EPA, for Board approval, regarding EPA’s planned approach to providing performance evaluation standards.


Biology Workgroup Meeting

            The revised biological methods NEMI database structure document was approved. The database structure is based on NEMI, tailored specifically for toxicology methods. The workgroup discussed applying the structure and search strategies to microbiology, immunoassay, and invertebrate methods by the end of July.


                                         Minutes from Methods Board Meeting

                                              May 11th – 13th, 1999

                  Andrew W. Breidenbach Environmental Research Center

                                                   Cincinnati, OH





Katherine Alben - NY Dept. Health/SUNY

Harold Ardourel – USGS, WRD

Bob Berger - EBMUD/AMSA

Herb Brass - EPA

Bob Carlson - Kent State

Jerry Diamond - Tetra Tech

Andy Eaton – Mont. Watson/Std. Methods

Rick Dunn - Hach

Larry Keith – WPI / ACS

Charlie Peters – USGS, WRD

Ed Santoro - DRBC

Merle Shockey – USGS, WRD –by phone

Marion Thompson - EPA

Jim King - DynCorp

Larry Fradkin – EPA

Gene Rice – EPA

Cliff Annis – Merck

Bernie Malo – ASTM                          

Rob Henry – VG Elemental

Tony Scandoro – DOE-Argonne

Michalann Harthill –USGS, BRD

Ann Strong - USACOE

Marianne Lynch – EPA – by phone

Dick Reding - ??

Khoune ??– EPA ATP

Lynn Bradley – APHL – by phone       

Glenn Patterson – USGS, WRD – by phone    

John Klein – USGS, WRD

Barbara Erickson – AZ Dept. Health

Donna Francy – USGS, WRD

Mike Miller – WDNR –by phone

Charles Patton – USGS, WRD

Charles Feldman – EPA

Tom Behymer - EPA

Matt Brin – ETC2

Deborah Hammit – ETC2 

Bob Bordner – EPA

Greg Karl – EPA

Chuck Spooner – EPA – by phone

Chuck Job - EPA

Tom Maloney – USGS,WRD – by phone

Dennis McChesney –EPA

Elaine Streets – DOE – Argonne

David Craig - EPA

Richard Ayers – VA DEQ

Ed Glick - EPA

John Rumble – NIST –by phone

Roger Stewart – VA DEQ – by phone




Outreach Workgroup Meeting


Public Web Site 

Demonstrated and discussed the current format and content. Discussed questionnaire results:

                     Add a counter and a "what's new" hyperlink

                     Support for a guest book; response mechanisms need to be worked out

                     Complete Board meeting minutes will be available on the web site

                     Products need Board approval prior to going on the web site

                     Board members: names and affiliations only, not phone numbers and addresses

                     Include a caveat for all workplans and products: proposed products, schedules, and content could change, and Board positions are works in progress

                     Remove all budget information from workgroup workplans

                     At least semi-annually, workgroup chairs should send a public relations update to Charlie for the web page


Publicizing the web page – Bob Berger contacted several organizations. We need to send them the web page for review; they could then link to the Board and publicize our activities. The Board web page can be prominently featured on the Office of Water, USGS, ASTM, and other Board member organization web sites.


                     Continue working with Council to coordinate web sites and outreach efforts

                     Obtain spots in appropriate organization newsletters – hardcopy and email

                     Future presentations should include web site information


Responses to Guest book

                     Charlie will either answer guestbook entries or send to the appropriate person on the Board for response

                     Each workgroup will have a guestbook; there will be no general guestbook on the homepage, to encourage more specific comments and questions

                     If too many requests are received, it may be necessary to get contractor support to deal with responses.


Content of Web site:  We need to add information about the next Board meeting and request ideas for discussion at next meeting. Feature a question and answer section (FAQs) based on guestbook entries.


Next steps:  Charlie will send an email to all Board participants for final comments on the public web page and slide show, including the web addresses in the email. The password-protected site will be maintained to conduct Board internal business.




Funding and CRADAs


USGS (Bob Hirsch, Lew Wade, and Tim Miller) is very supportive of the Council and Board efforts. It appears that funds are available for State Board delegate travel costs in 1999. The June Council meeting will be the last one this FY. The USGS has been able to provide staff involvement through the continued participation of Charlie Peters, Harold Ardourel, Merle Shockey, Charlie Patton, and Michalann Harthill. USGS is committed in the charter to pay travel for one non-federal delegate to Board meetings; it is currently funding five or six non-federal delegates. The USGS expects level funding for FY00. Funding of workgroup activities is the next budgetary hurdle. Other organizations need to contribute more. Hopefully, the upcoming ACWI meeting will help the situation.


On the EPA side, Chuck Spooner has approximately $400K for Council and Board support.  Herb also reported that his office has $125K available.  Elizabeth Fellows and Jim Hanlon (OW) are very supportive.  However, there is not enough money to run both the Council and Board as hoped or projected. USGS and EPA are presently committing approximately $2 million in salaries, travel, etc. Other organizations need to contribute.


There are opportunities for funding from AWWRF and WERF if the technical topics are appropriate.


Those organizations having new methods or involved in method development may be interested in PBMS pilots. We need to conceptualize a pilot and then approach organizations. WERF meets in July; AWWRF meets in late August. Funds would not be available for a year after that. It would be good to have pilot pre-proposals in place by the end of June. We need the basic elements of what the proposed research project would look like and the questions it would answer. DOE is also very interested in PBMS. There is a June meeting of the National Analytical Managers in DOE, to whom Mary Verwolf will be speaking about the Board. It may be appropriate to bring up pilot pre-proposals there.


There may also be opportunities to institute or shepherd pilot studies already in progress or proposed (e.g., Wisconsin, Border States - Mexico). Drinking water industries could be partnered with pulp and paper (singular chemicals, disinfection, commercial methods) or with manufacturers to get funding on pilot projects. We need to get very specific about what we need to do, i.e., the studies needed and the funding needs.




CRADAs - Larry Fradkin


A CRADA (Cooperative Research and Development Agreement) allows federal labs to enter into agreements with the private and university sectors. These parties can provide supplies, labor, etc. However, the federal government can't give money directly to private companies. A CRADA can provide private money to fund feds; CRADAs are about the only way that feds can take private money. CRADAs are not new; they must make sense to all sides. Feds can get royalties if involved with a successful patent. One advantage of CRADAs is that they are exempt from contracting laws; for example, they can be noncompetitive. Indirect or direct costs can be included so that overhead is not visible, and they can include in-kind services. The Board wouldn't have a CRADA itself, but the Board could formulate a CRADA through a federal agency. A CRADA may allow the Board to avoid some of the bureaucratic and technical hurdles in dealing with AWARF and some other funding organizations. A schematic of a proposed CRADA could be as follows:


USGS --- EPA -> Contractor -- Universities -- States



Research Consortium –Companies, Research Foundations, Others (private only)


The Environmental Technical Commercialization Center (ETC2) (Cleveland, Ohio), funded by EPA, helps formulate CRADAs, develop research consortiums, negotiate licenses, etc. They do the legwork for setting up a CRADA, and the services are free. ETC2 could put ideas on the web site to try to make this happen. We need to determine specifically what we want to achieve. Each workgroup should develop a plan.


What about the cost of managing a CRADA? Oversight and project management in other CRADAs is covered on the federal side. There is no cost to the private group.


For Federal-State partnering, there are better mechanisms than CRADAs - interagency agreements, for example.


Some concern was expressed about having one private firm provide the money to do some work in a CRADA. Perhaps we can get several companies to fund Board efforts through a consortium.


Should the Board endorse the idea of pursuing CRADAs? 


CRADAs are not rigid; they can be renegotiated and amended easily, and these agreements are easy to enter and exit. CRADAs take about two months to develop. We should get Rick Lieberman to describe how the AWARF CRADA was managed. Marty Allen was also involved with that CRADA.


Accreditation Workgroup Meeting


Attendees – Barbara Erickson, Harold Ardourel, Andy Eaton, Tom Maloney (by phone), and Charlie Peters.


Workplan:  The workgroup needs to revise its workplan to emphasize that only federal lab accreditation is being addressed, not state or private lab accreditation. The workgroup may address those aspects in a future position paper. We need to add a new objective covering state and private labs and field certification.


Position paper:  An outline was prepared for a Board position paper stating the need for accreditation of Federal water quality monitoring labs. Federal labs need national certification to gain acceptance by states for work done for regulatory purposes; Federal labs don't want to be required to get certified by each individual State. For accreditation beyond drinking water, it may be advisable to pursue accrediting the lab's system rather than individual methods.


Goals for national certification:  There is a need to assure comparability of data and state acceptance of data (a QA/QC blessing for the data). The position paper will address water only. The paper needs to describe how DQOs differ for ambient versus compliance monitoring. Compliance DQOs are set in stone; ambient DQOs may be more or less stringent (one may be looking at lower levels, but the MQOs may not be as stringent). It is critical that data meet state data quality requirements, sufficient for sharing and applicable to both ambient and compliance monitoring.


There are several types of accreditation. ISO standards are more general than most, which has advantages; a lab needs to show it can meet DQOs. NELAC is very specific regarding how many times you run a sample, on which instrument, etc. Bioterrorism and food safety programs use the ISO Guide 25 guidelines (CDC is involved with this; the revised standard is ISO/IEC 17025), and the HL7 format must be used for bioterrorism and food safety. The position paper will indicate that different standards and different programs were considered. NELAC is the most recognized accrediting body presently. GLP is recognized, but not for environmental programs. The Navy does its own accreditation but is considering joining NELAC. NELAC is not PBMS-based; ISO is PBMS-based, but not to a great level of detail. AALA focuses on compliance monitoring labs - a checksheet monitoring system. Only 3 or 4 states use AALA.


We need to start the position paper this FY. The Canadian lab certification board (CALCAN) should be added to the list of programs. Tom Maloney will help research the ISO standards, NVLAP, and CALCAN. Barbara will research NELAC and NELAP. Robin Nissan will cover the Navy. Tom McAninch will summarize ISO web page information. Bart Simmons will cover E4 and NELAC/NELAP. Harold will look at ASTM guidelines and standards. We need to list the other write-ups needed and ask for volunteers to complete those parts. Part II - water only, and ambient versus baseline. Part III - add Bart's comments to EPA and add other federal agencies. Part IV - add ASTM. Part V - add forensic lab accreditation (EPA). Part VI - the actual position statement.


We need ACWI support for whatever system the position paper recommends. Five slides will be prepared for the ACWI presentation: 1) goals of the position paper, 2) list of federal data collection entities, 3) what standards are currently being used by federal labs, 4) what accreditation programs exist, and 5) prepare the position paper and ask for support.


The workgroup discussed State drinking water laboratory licensure programs and various related concerns. States need to have EPA approval and must run performance evaluation standards. Standards can be supplied by NIST; some states (CA, WI, NY) also produce these themselves. Standards do not exist for all matrices: performance standards generally use water as the matrix, not a variety of matrices. There is no money in State environmental agencies to pay for this.


EPA no longer has money to fund state lab accreditation through NELAP. If not EPA, who will do this? There is concern that EPA's NELAC support is slowly eroding. The EPA office handling NELAC is changing (moving from the Office of R&D), and EMMC will also be moved into this new office. There is no chief yet, and there is concern that there will not be support in the new office. NELAP saw this coming and so reduced the standards somewhat before this fell apart; 11 states will have been accredited, but to a lower/quicker standard.


There are still a lot of questions concerning accreditation training, liability, and costs to States. Who will accredit the state lab accreditors?


Jerry Parr and Bill Moore are preparing a one-day NELAP certification training course on how to get and maintain lab certification, based on the latest NELAC rules. It will be presented at the WTQA conference on July 22nd. Jerry will report on the training at the next meeting.


The position paper needs to add CDC, FWS, FS, NOAA, BLM, and others to its federal lab section. NELAC is a FACA group. AALA and NIST currently do third-party accreditation. There are no recognized standards for national accreditation. NELAC should provide oversight for PE samples; a standard was written, but who will be paid to implement it? Harold presented slides for Herb's presentation to ACWI, highlighting the position paper under development recommending support for national accreditation of federal water-testing labs. The workgroup will concentrate for now on federal labs, since this is more feasible.


Biology Workgroup Meeting


Attendees - Gene Rice, Michalann Harthill, Bob Berger, Donna Francy, Ed Santoro, Dennis McChesney, Katherine Alben, Jerry Diamond, Mike Miller.


The biological methods section of the NEMI structure-and-use document was revised; it describes what types of biological methods will eventually be incorporated into NEMI, the types of searches we'd like to see for those methods, and the types of organizations we envision using the database. This revised information was presented and generally approved. The weasel was chosen as the workgroup mascot.


Jerry provided a draft database structure patterned after NEMI and some of EPA's EMMI fields, using a couple of toxicity test methods and a couple of macroinvertebrate collection methods. Presently, the field called "method specific requirements" (which is not an official NEMI field) attempts to sum up the salient features of a method that distinguish it from other methods of its type and matrix. This field is essentially Chris Ingersoll's table for tox methods. The workgroup needs to think about whether this structure is okay for now or whether we really do need separate fields for each specific method requirement. If possible, it would be nice to have as many similar data fields across biological methods as possible; the information we put in a particular data field will differ depending on the type of method, but the field still applies.

One thing to consider is that the search engine can search on text within a field and pick out, for example, those tox methods that require 25°C or those macroinvertebrate collection methods that use a kick net in wadeable streams. These don't have to be separate fields for searching purposes. However, as we were led to understand by the NEMI group, for query purposes an item or requirement needs to be in its own field if we think it will be a major item upon which methods will be sorted. We still need more input here from folks like Jim King from DynCorp, who is helping with the EMMI database structure. Clearly, some fields, such as precision or sensitivity, will take work to dig out for a given method, and many methods may have nothing to report, at least easily. That may be okay, as it separates methods that underwent more 'validation' from other methods.
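The field-design tradeoff discussed above can be sketched in miniature. The following is a hypothetical illustration only: the table and field names are invented for this sketch, not NEMI's actual schema. It shows that a catch-all "method specific requirements" text field can still be searched with a text match, while an attribute used for sorting or range filtering (such as test temperature) works better as its own field.

```python
import sqlite3

# Hypothetical schema: one catch-all text field plus one dedicated,
# sortable field. Field and table names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE methods (
        method_id TEXT PRIMARY KEY,
        method_type TEXT,
        method_specific_requirements TEXT,  -- catch-all free-text field
        test_temp_c REAL                    -- dedicated, sortable field
    )
""")
conn.executemany(
    "INSERT INTO methods VALUES (?, ?, ?, ?)",
    [
        ("tox-001", "toxicity", "48-h acute test at 25 C, static renewal", 25.0),
        ("tox-002", "toxicity", "7-d chronic test at 20 C", 20.0),
        ("inv-001", "macroinvertebrate", "kick net in wadeable streams", None),
    ],
)

# Text search within the generic field works for finding methods...
rows = conn.execute(
    "SELECT method_id FROM methods WHERE method_specific_requirements LIKE ?",
    ("%kick net%",),
).fetchall()

# ...but only a dedicated field supports sorting and range queries.
warm = conn.execute(
    "SELECT method_id FROM methods WHERE test_temp_c >= 25 ORDER BY test_temp_c"
).fetchall()
print(rows, warm)
```

This is the distinction the NEMI group described: text search suffices for retrieval, but any item methods will routinely be sorted on deserves its own field.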


As a result of the May 19 workgroup meeting, the workgroup decided to do the following tasks over the next 3 weeks or so:


Microbiological Methods - Donna and Gene

Dennis will send Donna the fields they use for microbiology. Donna Francy and Gene Rice will take the total coliform and crypto methods and expand the database fields for those methods. They will look at the NEMI fields as well as the draft fields done by Jerry Diamond as they develop a spreadsheet for these particular methods.


Macroinvertebrate Methods

Mike and Tetra Tech will focus on 1 or 2 habitats and work up a spreadsheet of method characteristics. Mike and Jerry will discuss the feasibility of using the fields Jerry devised, patterned after the chemistry NEMI fields.



Katherine updated her tables and will focus on atrazine; she needs to look at recent literature and get methods.



Chris Ingersoll and Jerry will take a look at perhaps one type of toxicity test method and also do the same exercise.


For all of the above methods, if the information for a certain field is not known but is probably available, that will be noted in the respective table. If a given field doesn't apply to a type of method, that will be indicated too. Draft tables are due to Jerry for distribution by June 18.


The workgroup agreed that it should concentrate on populating method database fields for now. Eventually, we will all join forces and design an 'expert system' that would guide a user into NEMI from many potential lines of questioning (e.g., what's the right type of method if I want to answer such and such a question).


Goals of Biological Database

We would like to provide comprehensive information about methods so users can decide which ones to use. How does one decide which method to use? Should owners of methods participate in the extraction and review effort, perhaps help with financing, and be responsible for NEMI updates?


Program Objectives


Tier 1: “Expert System” level; why someone would want to use biological methods; monitoring goals and biological methods


Surveillance vs. compliance, trend analysis, process control, land use decisions (BMPs), rapid vs. detailed characterization, human vs. ecological.


Tier 2:  Method Characterization


                     strategy of method, QA/QC required; quality of data

                     screening versus quantitative

                     media type

                     level of organization

                     field versus lab

                     terrestrial versus aquatic

                     relative cost/effort/expertise


Tier 3:  nuts and bolts of each method.


Where do we address method comparability?

                     partly in quality of data

                     method characteristics (Tier 2) and specifics of methods (Tier 3)


Attachment B summarizes the three Tiers or levels of the database discussed.


NEMI Workgroup


Attendees: Larry Keith, Ann Strong, Jim King, Marion Thompson, Bernie Malo, John Klein, Cliff Annis, Elaine Streets, Tony Scandoro, Herb Brass, Charlie Peters, Chuck Spooner, Merle Shockey


Tony Scandoro (Argonne National Laboratory) - Database Critique

Tony indicated that it is probably best to buy a known browser search engine for the internet rather than building one. Oracle has such a search engine. Chuck Spooner is involved with a similar issue in STORET and could help out. Tony noted that the NEMI database is not so huge, nor are the transactions so many; however, the data do need to be reliable. Oracle would be a good way to go, and USGS can help out.


--         It is going to take money regardless of which database one uses.

--         Chuck — can EMMI be used rather than Oracle?


Jim King:  The EMMI internet version doesn't have much of the functionality required by the Board. Also, USGS has some funding to set up a pilot NEMI using Oracle once a database is populated. EMMI could be close to the needs of NEMI if linked to the full methods and made searchable.


Larry noted that we have a large variety of users, not only chemists.  EMMI is not necessarily adaptable to non-chemists. 


Chuck questioned whether EMMI will keep going with its own web version if the Board produces a NEMI? 


Marion answered possibly yes.  EPA is concerned that methods are updated and current.  Perhaps EMMI may have a subset of the database that doesn’t have all Board fields.  She also noted that nonwater methods are also in EMMI.


All agreed that the goal is to meet NEMI requirements and serve the EMMI user community.


--         It would be logical for the Board and EMMI to join forces and have one product on water methods

--         Board’s methods or subset of EMMI?


Assumptions for NEMI

(1) Port data from EMMI and EMSD to USGS Oracle after cleanup.

(2) Chemistry and Biological schematics

                     25 fields for chemistry methods plus associated metadata fields (units of measure, etc.).

                     number of new fields for biological methods not yet determined


What needs to be done to clean up data for ORACLE?


1.	'Normalize' metadata in EMMI

2.	Add metadata 'provenance' from sources (e.g., one-lab validation, multi-lab validation)

3.	CAS numbers and synonyms are not a problem

4.	After #1 and #2, port data to USGS

5.	Add more method comparability/PBMS information? For example: interferences, reference sources, preservation requirements, sample preparation

6.	Add field analytical methods from EMSD

7.	Add Help
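Step 1, "normalizing" the metadata, might look something like the following sketch: mapping the inconsistent unit and provenance strings found in a legacy methods database onto canonical values before porting. The vocabularies and record layout here are invented for illustration; they are not EMMI's actual contents.

```python
# Hypothetical normalization tables; real cleanup would derive these
# from an inventory of the actual values present in the source database.
CANONICAL_UNITS = {"ug/l": "µg/L", "ug/L": "µg/L", "mg/l": "mg/L", "MG/L": "mg/L"}
CANONICAL_PROVENANCE = {
    "single lab": "one-lab validation",
    "one lab": "one-lab validation",
    "multilab": "multi-lab validation",
    "multi-lab": "multi-lab validation",
}

def normalize(record: dict) -> dict:
    """Return a copy of a method record with canonical unit and provenance."""
    out = dict(record)
    unit = record.get("unit", "").strip()
    out["unit"] = CANONICAL_UNITS.get(unit, unit)       # pass through unknowns
    prov = record.get("provenance", "").strip().lower()
    out["provenance"] = CANONICAL_PROVENANCE.get(prov, prov)
    return out

raw = {"method_id": "EPA-353.2", "unit": "ug/l", "provenance": "Multilab"}
clean = normalize(raw)
print(clean)
```

Unknown values pass through unchanged rather than being dropped, so nothing is lost before the port; they can be flagged for manual review.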


Next Steps


8.	Develop rules/options for the database structure

9.	EMSD and EMMI — fit data to the database structure and split up data (EMSD) or combine data (EMMI)

10.	Develop algorithms and generate consistent metadata for 350 water methods

11.	Identify critical fields that need to be filled when absent in EMMI, and fill them


Resource Discussion


Cliff: Resources need to be addressed for this project, since it is central to the Board's mission.


Chuck: The Board's proposal should identify certain assumptions regarding EMMI being underfunded relative to the Board's needs.


Herb: Getting EMMI up on the internet would cost approximately $125K, and this database won't meet a substantial portion of the Board's needs. The Board should revisit with Matt Brin the possible use of a CRADA to help with developing NEMI.


EMMI was designed for the regulatory (compliance) community. What's the difference in cost between updating and putting EMMI on-line with or without the Board's input? USGS and EPA should not be the only funding organizations; this is an interagency effort. NEMI will be presented to ACWI next week. There is interagency agreement on the goal and objectives for NEMI. We need to keep going and not let a temporary lack of resources stop progress.


Tony — there could be two "entrances" to the same database — one for EMMI users and one for NEMI users. The workgroup needs to quantify the real budget and get appropriate resources.


Chuck — The Board is preparing a new institutional arrangement to put NEMI on another server (not EMMI).  There may be a policy choice regarding funding being unsupportive of EMMI.


Larry Keith — The Board has come to terms with what changes need to be made to EMMI to make it useful to non-chemists. There is still no commitment of funding to do what the Board recommended.


John Klein — USGS has money available but has moved it this year. NAWQA has an intense interest.

Use both EMMI and EMSD in NEMI. EPA estimates $45K to develop rules/options for the database structure. We need to develop algorithms and generate consistent metadata. Limit the pilot to 350 water methods; the estimated cost is $30K.


For radioactive methods, the detection limit is method-specific (time duration/half-life). The detection limit field may therefore need to be modified relative to other chemical methods. Is it possible to obtain DOE funding to include these methods?


Funding should come from beyond USGS and EPA such as other feds or CRADAs, etc.


Proceed with NEMI as planned. With funding, create the NEMI interface that builds off of EMMI and EMSD. Don't shortchange how NEMI is done, or it won't be used. We need a detailed proposal that states what is needed and what will be provided for the funding.


Other Workgroup Interface


Biology - Jerry gave a summary of the biological database structure under discussion.  All agreed that there should be more emphasis on Tiers 2 and 3 (Attachment B) for now; i.e., more emphasis on method characteristics and specifics and not as much on Tier 1 which is an “expert system” level.  That will come later.


Nutrients – Only about 3 of 40 EMMI methods had any precision data. Is that representative of all methods? There are probably about 10 fields with no data for most nutrient methods. The workgroup identified 2 or 3 other fields to add.


Water Quality Data Elements Workgroup Meeting


Attendees: Chuck Job, Lynn Bradley (by phone), Glenn Patterson (by phone), John Rumble (by phone), Charlie Peters, Bob Berger, Charlie Patton, Ed Santoro, Dennis McChesney, Andy Eaton, Barbara Erickson, Michalann Harthill, Roger Stewart (by phone), Tom Maloney (by phone)


Review of March 15th Minutes

The first conference call/meeting of the committee was held March 15, 1999, from 1:00 to 3:00 pm EST.


The agenda was: introductions, purpose of the committee, discussion of the materials attached to the email, approach to developing a recommended set of data elements for water quality data reporting, and next steps.


A summary of the March minutes (Attachment C) will be included when obtained from Chuck Job.


Organizational Considerations


Workgroup members will communicate with each other via email through the internal website email archive. The archive email address is: ; type "WQDE Workgroup" in the subject line to allow the archival file system to organize the Workgroup emails.


To view WQDE mail in this archive :


1) Go to the internal Board website - URL -

2) When queried, type in id = methods, password = nocarp99

3) In the Directory, click on the email archive

4) Find emails with a subject line of "NCOD" or "WQDE Workgroup", click on them, and read them

5) If you reply to an email, the replies will be filed together via a thread index in the archive


There is also a WQDE Workgroup home page on both the internal Board website and the public Board website. The public website WQDE page includes a guestbook that can also be used to exchange information or to obtain comments from interested non-Workgroup members.


The Workgroup will operate by consensus; however, unanimous agreement is not required.


We should attempt to get Alden Henderson, the Board member from the CDC, involved in this Workgroup. We should also try to get some State fish and game people and a USDA and a NOAA individual on the Workgroup, and should invite State Environmental Commissioners to participate.


It will take about 1 to 1.5 years to develop the list and definitions and hold the requisite public meetings.


Review Meeting Materials


-         List of WQDE groups – includes 13 suggested elements

-         AZ DEQ data elements tables (roughly comparable to data element groups) – from Mario Castenada

-         Des Moines EMPACT project data elements and permissible values – 141 elements

-         Environmental Data Registry data element definitions format (linked to HL7 format)

-         HL7 format – from Barbara Erickson

-         Environmental Council of States – developing a data elements list (State Environmental Commissioners)

-         Recommended NCOD data elements – includes 47 data elements and relates them to the Appendix M data elements from the ITFM report (available on the internal website)

-         USGS NWIS data elements and definitions


Discussion about “What data element groups should be considered to apply to a recommended core water quality data element set?”


-         Need to be sure we connect the water data elements to human health data as well. There is currently a problem linking the health locations geospatially.

-         These data elements should be viable for compliance and ambient data sets.

-         We need a commonly acceptable set of data elements and definitions for a “core” or “mimimum” or “critical” set.

-         Appendix M and proposed NCOD lists are on the website

-         Must have consistency between data element names and definitions.

-         Charlie Peters will get a NAWQA data elements list

-         We should also get the NEMI, Chesapeake Bay, SIDWIS, and DOE data elements list

-         Previous minimum set of data elements required was 5 or 6. STORETX minimum is much larger

-         We should start with a large (complete) date set and then winnow the list

-         Some data elements apply to samples and some to methods; we need to indicate which apply to each.

-         Will use the “bounded” draft list of 13 elements that was adapted from the EPA Environmental Data Registry as the pilot list and add and subtract from it.

-         Need to add reason for sample collection under the sample info element. Also need to add constituent concentration. Need to add time, duration, and sample analysis date to the date element.

-         Chuck will send an electronic copy of the 13-element proposed set to Charlie to put on the website.

-         Roger Stewart will provide the CIMS and STORET data element module.

-         Perhaps should eliminate Chemical Substance as an element in the list.

-         Glenn Patterson will crosswalk USGS data elements with the Environmental Data Registry list (available at

-         Bob Berger will get Canada’s system for the Workgroup


Next Steps

-         Develop the criteria that define a “core,” “minimum,” or “critical” set of data elements – conference call near the end of June or early July (target date 6/23, 11am EDT)

-         Develop definitions for each data element and common names

-         Continue to compile data elements lists
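The distinction raised above between sample-level and method-level data elements can be sketched as a simple record structure. This is purely illustrative: the field names and example elements below are hypothetical placeholders, not the Workgroup's actual 13-element draft list.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a "core" data element record. Field names are
# illustrative, not the Board's schema.
@dataclass
class DataElement:
    name: str                 # common name agreed across agencies
    definition: str           # the shared definition text
    applies_to: str           # "sample" or "method", per the discussion above
    permissible_values: list = field(default_factory=list)

core_set = [
    DataElement("Sample Collection Date", "Date the sample was collected", "sample"),
    DataElement("Analytical Method ID", "Identifier of the analysis method", "method"),
]

# Tagging each element lets a user pull out just the sample-level subset:
sample_elements = [e.name for e in core_set if e.applies_to == "sample"]
```

A structure like this would also make the planned crosswalks (USGS, NAWQA, NCOD, Environmental Data Registry) straightforward, since each list maps onto the same record fields.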


Meetings with EPA Staff


            Individuals attended various meetings with EPA Staff. The topics were: MDLs with Dick Reding and Paul Britton, OGWDW Methods Development with Dave Munch, ORD Methods Development with Tom Behymer, and ORD Microbiology methods with Gerry Stelma.


Nutrient Workgroup Meeting


Attendees: Rick Dunn, Charlie Patton, Ed Santoro, Katherine Alben, Bob Carlson, Charlie Peters, Michalann Harthill, Merle Shockey -by phone

Suggested the nutria or a guinea pig as the workgroup mascot.


Action since last meeting

1) Talked to Battelle concerning CRADAs - Workgroup should think about specifics and prepare a proposal to pursue this.

2) Is Tetra Tech money allocated?  They are tasked to do an exploratory literature review.

3) Work with EPA to produce a methods manual/white paper overview.  This will need funding.


Discussion of Approach

NEMI fields of information were provided as a handout and discussed. EMMI Nutrients methods were provided for review as a template.  We need to look through the list and determine which of these methods are most appropriate.


Charlie Patton provided a handout suggesting that a database to compare methods will not be feasible because metadata are often unavailable for these methods. This is what prevents sharing of nutrient data now. Instead, he suggested 4 products: 1) suggest minimum QC requirements for nutrient analyses; 2) establish reference methods for common nutrients; 3) develop replacement methods for antiquated methods; 4) promote filtration and chilling as an alternate preservation approach.


Who will be the users of NEMI? Are nutrient methods important to NEMI?  We need to identify user levels - volunteer groups, those who choose a lab, those running the labs.


Other issues:  Are we interested in helping people with historic data?  We need to include a field for hazardous toxic concerns/waste byproducts.   Should we include any new method that is in the published literature?  A field perhaps should be added that describes the litigation history of the method. Three responses are possible for a new field - historic, approved for drinking water/NPDES (i.e., ambient or compliance) or developmental. There are variations in compliance methods between states.


The Workgroup observed that there were no differences between 80% of the methods examined. Einstein said to “keep things as simple as possible, but not simpler.” How do we do this without irritating folks whose methods we trash? We need to keep the data fields quantitative and the facts will speak for themselves. People will choose screening tools when appropriate.


The analytical nutrient strategy handout includes a list of issues.


EMMI template review for nitrate


The Workgroup reviewed the brief method summary for ASTM method number D3867 using NEMI data element fields.  The narrative of the method itself should provide any information for the data elements that we know – i.e., various references about interferences, method detection limits, etc.


1) method source - year, reference list-literature review (Charlie Patton can provide), last modification date, organization or agency - ASTM

2) method number - ASTM, or EPA, or USGS etc. - D3867a

3) brief title - nitrate/nitrite automated cadmium reduction

4) official title - nitrate/nitrite in water by automated cadmium reduction.

5) chemical name - nitrate/nitrite

6) synonym - 2+3, etc

7) CAS registry number -

8) applicable matrices - what is in this reference -water, waste water,  saltwater. narrative for other matrices?

9) brief method summary - Charlie Patton has a canned summary

10) MDL and how determined - none given - if not mentioned specifically in the method, suggest using Appendix B of 40 CFR Part 136 for the proper concentration range.

11) concentration used for method detection limit - none given - would be user chosen, see 10 above

12) applicable concentration range - 0.05 to 1 mg/L, again as written in the method

13) interferences - based on literature available - suspended sediment

14) instrumentation, including detectors and columns, etc. - as in the method - photometer/spectrophotometer, embellished - continuous flow analyzer

15) precision - should be discussed relative to time - within run, between day; also dependent upon concentration. 24 hr or within run - none given. Precision has been shown to be 1% between days and 0.3% within run, for example.

16) accuracy - none given. Check against a third-party certified reference material stock; cite the reference material and the percent accuracy shown for that concentration.

17) sampling methods - none given. In narrative include bottle type, grab or composite, etc.

18) sample preservation - given as 2 ml concentrated sulfuric acid per liter or 24 hrs at < 4°C.  No narrative - acidify to pH less than 2

19) maximum holding time - none given. CFR says 28 days

20) sample prep - remove turbidity by filter or oil and grease by liquid extraction and adjust pH to between 6 and 8 if needed. mention prep for other matrices?

21) quality control requirements “meta data” - none given. narrative - good analytical practice would require duplicates, blanks, spikes, and reference sample with each analytical run.

22) reference source - none given. See #16 above - standard reference material.

23) ruggedness - none given. See Cliff Annis and Ann Strong's definition; hopefully this will be an a, b, c, etc. scale - the ability to vary various parameters without sacrificing method performance.

24) relative cost - none given. On a four dollar sign scale - start-up costs and/or reagent costs, etc.

25) ASTM address and phone number

26) lab sampling - shake bottle, etc.
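Item 10 above points to Appendix B of 40 CFR Part 136 when a method gives no MDL. That procedure computes the MDL from at least seven replicate analyses of a low-level spike as MDL = t × s, where t is the one-sided 99th-percentile Student's t for n−1 degrees of freedom. A minimal sketch follows; the replicate concentrations are invented for illustration, not data from the ASTM method.

```python
import statistics

# One-sided 99th-percentile Student's t values by degrees of freedom,
# as used in the Appendix B MDL procedure.
T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821}

def mdl(replicates):
    """MDL = t(n-1, 0.99) * standard deviation of >= 7 replicate spikes."""
    df = len(replicates) - 1
    s = statistics.stdev(replicates)
    return T_99[df] * s

# Hypothetical nitrate replicates (mg/L) spiked near the expected limit.
spikes_mg_per_l = [0.051, 0.048, 0.053, 0.049, 0.047, 0.052, 0.050]
limit = mdl(spikes_mg_per_l)  # roughly 0.0068 mg/L for these values
```

Appendix B also expects the spike concentration to be within a few multiples of the resulting MDL, which is the "proper concentration range" caveat in items 10 and 11.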


Next steps


-     Prepare two tables - a first cut between similar methods, then more specific tables that include a number of these equivalent methods.

-     Review current EMMI methods and decide whether to keep in or put others in

-     Have Tetra Tech do a first cut of narratives and then have workgroup review

-     Tetra Tech literature review and outline for a manual

-     Determine what other techniques are not included - through the Board and Council



-         What money is available for nutrients through Tetra Tech?

-         What are plans for writing narratives?

-         Who will be writing the conversions from EMMI to NEMI?


PBMS Workgroup Meeting


Attendees: Andy Eaton, Larry Keith, Barbara Erickson, Marion Thompson, Bernie Malo, Rob Henry, John Klein, Cliff Annis, Tony Scadora, Elaine Streets, Herb Brass, Ann Strong, Jerry Diamond, Dennis McChesney, Dick Reding, Marianne Lynch – by phone, David Friedman – by phone


General discussion

Version 5.2 Dog is being submitted to ACWI for endorsement.

The Board needs to do a PBMS pilot but such a study is probably expensive.  John Klein proposed a nutrient pilot which could be popular with ACWI and others. USGS and other organizations could be used who are already sampling and conducting analyses.


Cliff sent an email to ACS to find out about difficult matrices and other possible analytes.  Total nitrogen is a good example, since we need to examine different digestion techniques and sample preservation methods.


CRADA possibilities: get Matt and/or Debra from ETCC to help early in the process. There is also a good opportunity for WERF at their July meeting. Need to get in a preliminary proposal. No one had a problem conceptually with using a CRADA mechanism.


EPA Workshops – Marianne Lynch/David Friedman


David described EMMC workshops on PBMS training. Several workshops are scheduled across the U.S. These are open meetings to educate people. The last one was held in Philadelphia. 3-4 modules are in development and are in the outline phase. This is a 3-day training. David is coordinating the workshops. Details are not yet worked out on what’s being included. ERG is the contractor. Comments and input from the Board are welcome. EPA is hoping to have training modules fleshed out by the end of July, with training beginning the latter half of this year. Training for lab auditors will probably be a module but will be one of the last ones to be developed.


PBMS is defined in terms of MQOs and DQOs, and some issues remain unresolved. A reference method defines performance objectives as part of the method. At the Philadelphia workshop, two speakers talked about reference method approaches and the boundaries of PBMS - method-dependent parameters are not PBMS. Approximately 30 people attended. EPA will have different speakers from the region depending on where the workshop is held. Office of Water, compliance, and legal staff are meeting to work out details of the presentation. Overheads from the workshops will be on the web.


Upcoming ACWI Meeting


Herb will present information on the PBMS paper and approach and ask for support.  He will also discuss federal lab accreditation issues developed by the lab accreditation group.  Herb will present bullets summarizing key points of the paper.


ACS PBMS Pilot Study


Larry reported that the pilot changed objectives along the way as they couldn’t evaluate DQOs as hoped.  There was not enough funding for sufficient samples to obtain statistical confidence intervals.  Therefore, the pilot is addressing MQOs.  They used Youden pairs of samples instead of triplicates as planned.  They found that some labs were not able to participate because they can’t take instruments off-line and already have instruments committed for samples.  Samples are being shipped out now or soon.  Analytes include herbicides, metals, semivolatiles, and volatiles  in water.  Matrices include treated municipal wastewater, groundwater - petroleum contaminated, soil - “easy” (high sand/loam), and soil - challenging (some clay).  The total budget is ~ $220K.  Average cost is $20K per lab for water and $25K per lab for soil.  A total of 4 labs are involved.
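The Youden-pair design Larry describes replaces triplicates with two closely similar samples per lab; the spread of the per-lab differences estimates within-lab (random) error, while agreement of the sums reflects between-lab bias. A hedged sketch of the within-lab estimate follows; the lab results are invented for illustration, not pilot data.

```python
import math

def youden_within_lab_sd(results):
    """Within-lab standard deviation from Youden pairs.

    results: one (a, b) tuple per lab, where a and b are that lab's
    measurements of the two paired samples. The variance of the
    differences, halved, estimates the random-error variance.
    """
    diffs = [a - b for a, b in results]
    mean_d = sum(diffs) / len(diffs)
    return math.sqrt(
        sum((d - mean_d) ** 2 for d in diffs) / (2 * (len(diffs) - 1))
    )

# Hypothetical results from four labs (e.g., mg/L of an analyte).
lab_results = [(10.2, 9.9), (10.5, 10.1), (9.8, 9.7), (10.1, 9.8)]
sd = youden_within_lab_sd(lab_results)
```

With only four labs, as in the pilot, the estimate is coarse, which is consistent with Larry's note that the funding did not allow statistical confidence intervals and the pilot therefore addresses MQOs rather than DQOs.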


Barbara noted that it would be good to have samples or sites for which “reference” methods don’t exist but that need site-specific methods (PBMS). This could make PBMS more acceptable in a court of law.


Khouane (EPA ATP Program) offered to find data for validation protocols. Some chemical methods do not have electronic data – they exist as raw data in study reports. However, the docket for rulemaking is electronic. For example, cyanide methods: compare the reference method and MQO approach. Khouane will email the docket to Charlie for distribution to the workgroup. Khouane will also find budget information: why the instrument manufacturer developed a PBMS method and what happened. He can perhaps pull out some examples from the lab end as well.


Andy will distribute letters from NEIC and EPA to the rest of the workgroup.  The workgroup should also review Office of Water data before going further on pilot details.  We still need to look at funding for a PBMS Pilot.

Bernie observed that ASTM has found that inter-laboratory site validation studies are very costly.


A PBMS pilot needs to examine different aspects of PBMS.


--         Need to explore how a CRADA might work. Possible sources include CMA, instrument vendors, and standard reference material vendors. 

Need to see the economic benefit of being involved.  We believe this could be demonstrated through lower costs, future revenues, and greater data availability.  In-kind contributions could be important.


--         Explore methods and parameters that have been modified for site-specific reasons, such as matrix, or used in other pilots.  This could be a good opportunity for a CRADA too.


John:  Perhaps it would be a good idea to do a multi-lab field study, with a national view on a simpler parameter such as a nutrient.  This may be very sellable to the Department of Agriculture, USGS, and others.  If include spiking, samples will need to be sent to a central lab for spiking and then the subsamples sent out to labs for analysis.


Total nitrogen measurements are constrained to Kjeldahl measurements rather than the simpler total nitrogen persulfate method (alkaline persulfate), which minimizes the waste produced. NPDES doesn’t recognize this method. A prerequisite for a pilot is that the lab runs a lot of samples by an approved method so that sample results for that side would already be available. There is a beginning model from ITFM for a nutrient pilot. Jerry will look at that to determine whether it is appropriate for a pilot. NWQL does many samples as it is, so the additional samples would be relatively minimal. CRADA players would be federal labs with good data sets, the private sector, nonprofits, states, etc. The DuPont CRADA model may be a good place to start. It would be good to get a pilot started by the fall. What kind of funding might be needed? No one was opposed to pursuing the use of a CRADA. Jerry will pull out the ITFM nutrient pilot study and circulate it for comment.


The workgroup reached some consensus for John’s idea using nutrients.  CMA might be a potential partner as well.  The workgroup needs to look at the USGS CRADA to see how to do a similar idea for a PBMS pilot.  


National Enforcement Investigation Center (NEIC) comments on PBMS


NEIC is very opposed to PBMS


--         Concerned about liability and notification of regulatory authority before collecting data.


--         Admissibility of scientific information into a court case requires peer review, and PBMS may not be peer-reviewed. But the definition of peer review is vague. The current prescriptive system is more easily enforceable.


Lab Certification Presentation –Ed Glick

40 CFR 141.28 requires certification for labs that make measurements in drinking water. Only EPA has a certification program, which was initiated in 1943 and developed its current form in 1978. An information manual on this program is on the internet (revised in 1997).


Requirements for prescriptive or promulgated methods are that a lab must pass a PE sample (annual) and on-site audits. Audits include review of the QAPP and data audits. This is a cooperative program by states, EPA, and EPA regions. EPA audits regional programs, which audit states, which audit labs. States do most of the certification work. Certification is by analyte and by type of method: microbiology, chemistry, radiochemistry, VOCs.


Four types of certification are possible: full, provisional, none, and interim (not a negative category - certification granted until EPA gets its act together, such as for new contaminants, no PE sample available, or timely on-site audits not feasible)


            --         Performance testing samples rather than PE samples.  These should be more available in the future.

            --         Minimum education and training for auditors so they can be proficient

            --         Reciprocity and 3rd party auditors - EPA hires auditors since not done internally


PBMS is consistent with the drinking water certification program. PE samples would still be used, but the process would be a lot tougher for auditors because they will now need SOPs for the lab, a QAPP in place, and the information examined prior to the audit. PE samples need to be verified when using a new method. If there’s no PE sample, can a lab use a PBMS method? EPA will publish which parameters will be regulated in the future so as to keep providers up-to-date and so PE samples can be developed. Training for certification officers will need to be done for PBMS and written into the rule. Once signed, EPA will need to fund training and certification somehow.


The Board may take action to recommend side-by-side PE data generation to evaluate the quality of contractor-produced PE samples in EPA programs. A letter may be useful, but to whom? Larry: the ACS policy issue panel would be a good recipient of the letter. The Board could also visit with congressional staff to help get an ear for its recommendations or issues.


Ed noted that vendors of PE samples will be NIST-certified.  NIST was supposed to analyze a certain percentage of samples they produced but they are no longer doing this.  Instead, NIST will rely on lab data generated.  Ed noted that while PE samples are only a part of the certification program, they are a big part.

Barbara made a motion to write a letter recommending scientific examination of the current externalization program and use of side-by-side PE sample testing. Ed seconded the motion.


Discussion:  Need to list out scientific issues/concerns in the letter.  Board voted and unanimously agreed.  Harold’s Certification workgroup will draft a letter for review by the Board.  We need to examine the information available before writing the letter.


Board Business


Charlie distributed minutes of conference calls and last meeting and action items.




Delegates were appointed for 2 and 3 year terms. Alternate term length was not determined. Second year will be completed in January 00.  Two-year delegates are – Cliff Annis, Andy Eaton, Bob Carlson, Bob Berger, Mike Miller, Ann Strong, Keith McLaughlin, Katherine Alben, Barbara Erickson, Merle Shockey.  We need to ask alternates and delegates if they want to be reappointed.  All delegates and alternates should evaluate their status and interest.  For the August meeting, specific proposals should be prepared and discussed.


Upcoming meetings


Steering committee meeting - 6/23 1pm EDT


Full board conference call - 7/28 - 1 pm EDT


NWQMC - conference will be held week of 4/24/00 in Austin, TX. 


ACWI meeting in 2 weeks. Board presentations to ask for PBMS issue paper support, pilot help, accreditation position paper support, announce that the fact sheet is out, provide copies of workplan, describe a little about other activities.


NWQM Council meeting in 3 weeks – Board will have 45 minutes for a presentation - brief update and emphasize funding?


NELAC – about a month away - Have 20 minutes at the opening session to provide a Board overview. Revise the current PowerPoint slide show and attach it to an email for Board comments prior to the meeting. Needs to be somewhat basic. Use headings from the workplan. Make sure the website URL is available for them.


WTQA conference in several weeks - Andy presenting on PBMS from Board perspective




-      NWQMC has a new logo

-         There is travel money in the Dupont CRADA to provide specific individuals with money to travel to the August meeting if we have the CRADA as a significant agenda item.

-         Liked formats for separate workgroups. Biology group making progress. 

-         Happy the reservoir dog has been put to sleep.

-         More was accomplished with this meeting format.  Encouraged by what has been getting done between meetings and the good cooperation.

-         PBMS progress and consensus on NEMI was impressive.

-         Funding is major concern.  Continue to focus on value added work so folks will utilize our work.

-         D19 ASTM committee meeting coming up June 17th in Louisville – on website.

-         Tough to hear on conference lines. Options for videoconferencing but too expensive. Maybe we could use video conferencing for critical issues?

-         Thanks to Herb for hosting the meeting.

-         Solid foundation to workgroups - need to find the funding or it will be in vain.

-         Hold monthly conference calls for both the full Board and the Steering Committee.




Attachment A.


Interagency Methods and Data Comparability Board


May 11th – 13th, 1999 Meeting Agenda


Andrew W. Breidenbach Environmental Research Center


26 W. Martin Luther King Drive, Cincinnati, Ohio



Meeting Goals:


·         Closure on NEMI database format and approach

·        Approve public Web site

·         Discuss ACWI, NWQMC, NELAC presentation content

·         Prepare Federal Lab Accreditation position paper annotated outline

·         Further develop Biology, Nutrient, and Data Elements Workgroup plans

·        Determine best approach to responding to position paper reviews

·        Develop funding possibilities to pursue

·        Discuss Delegate and Alternate Rotations



Conference Line Call-In Number

513-569-7897 (access code next to specific meeting session)



May 11th, Tuesday


8:30 – 8:45        Welcome/Introductions, discuss agenda - Room 130/138, 6529#


8:45 – 9:45        Outreach Workgroup Meeting – Room 130/138, 6529#

                                    Review and Approve Public web site and Board slide show


9:45 – 10:30       Board funding discussion – Room 130/138, 6529#


10:30 – 11:30     CRADA discussion – Room 130/138, 6529#


11:30 – 13:00     Lunch


13:00 – 17:00     Accreditation Workgroup Meeting- Room 636, 0385#

                                    Marianne Lynch – EPA NELAC funding (by phone)


13:00 – 17:00     Biology Workgroup Meeting– Room 130, 6529#

                                    Wisconsin Pilot and Biology PBMS papers discussed


13:00 – 17:00     NEMI Database Format Subgroup Meeting – Room 138

                                    Subgroup decision on database format approach


17:00                Adjourn


19:00 –              Baseball Game if weather cooperates (tickets purchased at game time)


May 12th, Wednesday


08:30 – 10:30     Water Quality Data Elements Workgroup Meeting– Room 130, 6529#


08:30 – 12:00     NEMI Workgroup Meeting– Room 138, 0385#


Meetings with EPA staff – 08:30 – 12:00


08:30 – 09:30    MDL Discussion – Room G-51

Dick Reding and Paul Britton


09:30 – 10:30    OGWDW Methods Development – lab 184

Dave Munch


10:30 – 12:00    ORD Chemistry Methods Development  - Room 564

Tom Behymer and staff


ORD Microbiology Methods Development – lab 366

Gerry Stelma and staff


12:00 – 13:00     Lunch


13:00 – 17:00     Nutrients Workgroup Meeting – Room 130, 6529#

                                    Rick Dunn on Nutrient Field Kits


13:00 – 17:00     PBMS Workgroup Meeting – Room 138, 0385#

 Marianne Lynch – EPA PBMS Training (by phone)


1700                 Adjourn



May 13th, Thursday


08:30 – 09:30     Lab Certification Presentation – Room 130/138, 6529#

                                    Ed Glick, Carol Madding, and Pat Hurr


09:30 – 11:00     Workgroup report backs – Room 130/138, 6529#


11:00 – 12:00     Board Business  - Room 130/138, 6529#

                                    January Minutes Approval, Action Items Review, Roster,

Steering Committee, Workplans and Resources,

Content of upcoming ACWI, NWQMC, and NELAC presentations


12:00 – 13:00     Lunch (order food in?)


13:00 – 14:00     Board Business (cont.) – Room 130/138, 6529#


14:00 – 15:00     Roundtable – Room 130/138, 6529#


1500                 Adjourn




Numbers where you can be reached :


5/11 - 513-569-7938 or 569-7925

5/12 – 513-569-7925 -  Susan Hagedorn

5/13 – 513-569-7938 – Lillian Holmes

5/11th to 13th – Herb’s voicemail – 513-569-7936










Attachment D.


Methods Board and Workgroup Action Items 5/13/99


Board Activities

High priority

·        Develop a more detailed workplan for the board that includes workgroup plans and financial issues. - Charlie, Herb, Merle, Jerry

A revised draft integrated workplan is on the website for review.


·        Finalize October meeting minutes. Prepare January minutes, executive summary, and action items list for review – Herb, Charlie, Jerry

Final October minutes are on the website. A draft of January executive summary, meeting minutes, and action list are also on the website for review.


·        Consider means of obtaining additional resources from other agencies and organizations. Provide Dept. of Commerce (Dave G), World Bank (Larry K), TVA (Herb - ITFM member), Rural Water Assn. (??), Colorado Lab (Barbara), and State possibilities (Herb - NM Conference attendees) names for possible board involvement. Check on developing funding agreements with USGS, EPA, COE, DOE, etc. – Board

Stan Morton (DOE) and Robin Nissan (US Navy) have joined the Board.

Herb and Charlie talked with Larry Fradkin and Tony Ingebritsen about CRADAs.


·        Review Board roster and determine if changes to delegates and alternates are appropriate – Steering Committee and Board

There is a table on the website for review.


·        Address NCOD data elements issue – Board

A Workgroup has been formed with Chuck Job as the chair.


·        Review Board priorities and develop a strategy to address – Board

A strategy discussion has been prepared and is on the website.


Medium priority


·        Wisconsin Pilot study presentation to Board. -Charlie

Probably will be presented at May Board Meeting. Mike Miller will present a part of this.


·        Provide review for LTMDL papers – Board


·        Get an electronic copy of Lee Manning’s STORETX presentation for the website – Charlie


·        Provide Board with rough schedule of DuPont CRADA reports for review. - Charlie, Bill Battaglin


Low priority


·        Select and appoint a Board delegate to represent American Indian Nations.-Merle, Herb, Charlie

Oneida and Menominee Nations contacted; still need to contact Great Lakes Tribal Council, Red Lake Tribe, Indian Health Service (Ellen Meyer), etc. Delegate profiles will be sent and a delegate selected.


·        Put Alden Henderson (CDC) in contact with Ron Hoffer (EPA) about EPA/CDC connection in microbiology. – Herb


·        Provide a link to the volunteer monitoring website - Charlie


PBMS Workgroup

High priority


·        Revise final PBMS paper and send to ACWI for approval – Andy, Jerry


·        Provide information on EPA's PBMS training strategy. – Jan, Lew

Andy and Lew will contact Jan Jablonski (EPA) to get a training presentation at May meeting.


·        Test chemical PBMS approach (perhaps use current ACS round robin study as the pilot) –Workgroup


·        Read EPA’s PowerPoint PBMS implementation plan on the website – Workgroup


·        Provide DQO and EPA ICR links on website – Charlie, Herb

DQO links have been made.



Medium priority


·        Subgroup will develop definitions for PBMS. - Larry, Merle, Jan


·        Attend January 14th NELAC meeting, represent the Board’s position, and better define the ELAB position and differences – Lew, Merle, Herb, and Andy.

Merle, Harold attended and made presentations.

·        Put summary of v4.0 to v5.1 changes and comparison of ELAB and Board position papers on the website – Charlie, Jerry



Low priority


·        Make biology PBMS presentation to Board. - Jerry in May


·        Contact ES & T and prepare position paper for publication – Andy, Herb, Jerry




NEMI Workgroup


High priority.


·        Summarize NEMI design specs for IVV review – sub group


·        Provide Larry with list of prioritized top 50 analytical methods by agency. - Charlie (USGS field), Stan (DOE), Giles (State of Kentucky- field), Katherine (NY State) any others with lists should also send them. Lists should include: Method name and source; whether it is a lab, field or preparation method; the type of analyte - chemical, toxicological, VOC, etc.


·        Prioritize methods to include in database/compendium. – Workgroup


Medium priority


·        Obtain a copy of the EMMI meeting handout for the website – Charlie, Marion


·        Obtain a copy of EMMI workgroup on precision and accuracy strawman – Marion



Low priority


·        Meet with STORET nomenclature workgroup and report to NEMI - Chuck Spooner







Outreach Workgroup


High priority


·        Prepare a questionnaire to solicit Board and NWQMC comments on the public website. Update the external website based on the questionnaire for review at the May meeting. – Bob, Charlie


·        Revise slide show. - Jerry


Medium priority


·        Prepare draft of updated Fact sheet - Charlie and Jerry


·        Prepare Wisconsin Pilot Water Quality Fact Sheet - Charlie and Jerry


·        Research means of getting interested individuals to website - Charlie


Low Priority


·        Develop and Prepare a Board Information packet - Charlie and Jerry.


·        Develop and produce a mobile display - Jerry



Laboratory and Field Accreditation Workgroup


High priority


·        Prepare an annotated outline of a position paper on lab accreditation- Harold


·        Invite Jean Mourrain to May meeting to discuss position paper draft.- Harold


Medium priority


·        Get the LTMDL power point presentation to NELAC. – Gary


Low priority


·        Provide a website link to ITFM report accreditation appendix. - Charlie




Nutrient Workgroup 


High priority


·        Provide information on nutrient kit methods– Dave Gustafson


·        Prepare a letter to Board and NWQMC requesting nutrient methods (provide format instructions)  – Bob


·        Prepare a letter to Cantilli volunteering Board help – Bob



Low priority


·        Provide a reference from marine chemistry for website - Charlie Peters from Charlie Patton


·        Talk to EPA nutrient criteria team and report to Workgroup – Bob


·        Look at Wisconsin pilot effort reports (on website) – Workgroup


·        Obtain a copy of the ion selective electrode presentation for the website – Charlie, Jack


·        Conduct a literature review and summarize – Dave G., Charlie Patton, Jerry, Bob



Biological Methods Workgroup 


High Priority


·        Develop list of Microbiological and Field contacts and select workgroup members

Gene Rice (EPA), Chris Yoder (OHEPA) and Donna Francy (USGS) have agreed. Bob Carlson will check with Laura Leff (KSU), Jerry Diamond will talk to Chris Faulkner, Chris Ingersoll will contact ASTM, and Charlie Peters will talk to NAWQA.


·        Obtain a copy of the DOE chemical methods compendium to review its organizational structure - Charlie and Stan Morton


·        Develop specific short term workgroup objectives – Workgroup


·        Fill out Katherine’s table for each subgroup – sub group chairs


Medium priority


·        Add the ITFM Biology technical appendix, NAWQA interagency workgroup report, Jerry’s NABS report, Donna’s table, and Katherine’s PowerPoint presentation to the website – Charlie


·        Write up ideas on workgroup organization and purpose and send to Jerry to collate into a strawman – Donna, Bob





Water Quality Data Elements Workgroup


High Priority


·        Form Workgroup – Chuck


·        Prepare Workplan – Chuck, Charlie


Water Quality Data Elements Committee — Conference Call Summary (Aug. 31, 1999)


The Committee (workgroup) has developed a draft set of generic data elements.  A draft set of selection criteria was also circulated.  These materials have been developed with an eye to several existing federal/state data systems.  A list of several of these data systems is shown below.  The bold-faced abbreviations or acronyms are used in the subsequent summary.



                                                     Select Data Systems or Initiatives


These have been used to define current working lists of data elements and criteria or could be used for comparison purposes to refine notions of a “core” list of elements and criteria.



USGS National Water Information System II replaces NWIS I and the WATSTORE system.  A major source of flow and other hydrology data.  See the list included below from Glenn Patterson. Also see the overview at <>.  There is a WEB-based version of NWIS (NWIS-W) at this url <‑w/US/>.  USGS Contact: Alan Lumb



EPA STORET

Focus is on "modernized STORET" as opposed to "legacy" STORET (similar to the USGS distinction between NWIS and WATSTORE).  EPA is preparing a data elements summary list.  Good background information at the following url <>



ITFM Final Reports

Useful information in the final reports, especially in several appendices (e.g., Appendix M).  On the WEB at the following url <>.



Minimum Data Elements for Groundwater.  Contact: C. Job.


National Drinking Water Contaminant Occurrence Database (NCOD)

Required under SDWA Reauthorization.  Overview at the following url <>.  Contact: C. Job


Unregulated Contaminant (Monitoring) Rule (UCM or UCMR)

SDWA also requires revisions of the Unregulated Contaminant Monitoring Rule (UCMR) — information at the following url <>. Contact: C. Job.



The CALFED Bay‑Delta Program, a cooperative effort among state and federal agencies and California’s environmental, urban and agricultural communities, was initiated in 1995 to address environmental and water management problems for the Bay‑Delta system.  Overview at the following url <>.  EPA Region 9 for contact.


Section 319 National Monitoring Program (NPS 319)

The Section 319 National Monitoring Program projects comprise a small subset of NPS pollution control projects funded under Section 319 of the Clean Water Act as amended in 1987. The goal of the program is to support 20 to 30 watershed projects nationwide that meet a minimum set of project planning, implementation, monitoring, and evaluation requirements designed to lead to successful documentation of project effectiveness with respect to water quality protection or improvement. Overview at the following url <>


Permit Compliance System (PCS)

EPA OECA.  Regulatory enforcement oriented.  “Monitoring Data” components abstracted in several other databases and in BASINS.


EMAP — contact: Bob Shepanek, ORD


Chesapeake Bay Program.  Contact: Ricky Banner/C. Spooner. A list (data element dictionary) available on some WEB page — but Spooner will check to see if this is their most current list.


Other large regional/state systems? E.g., from TVA or efforts in South Florida?


NEMI (National Environmental Monitoring Initiative or Index)

National Environmental Monitoring Initiative aims to integrate and coordinate environmental monitoring and related research through government and private‑sector collaboration, in order to enhance the utility of existing networks and programs.  Development of the National Environmental Monitoring Initiative is under the leadership of the Committee on the Environment and Natural Resources, in the White House Office of Science & Technology Policy.  An overview of 34-35 federal systems is available. <>



A set of proposed selection criteria was circulated prior to the conference call as a follow-up from the last meeting on June 23, 1999. This draft list is given below:


                                  Selection Criteria for "Core" Water Quality Data Elements


The Water Quality Data Elements Committee developed proposed selection criteria for "core" water quality data elements on June 23, 1999, as follows:


The purpose of the selection criteria is to allow comparison with other data sets at the level of the sample test result by:


(1)        Providing the answer to or creating the possibility to answer the basic questions of:

a.    What is being measured? (Perhaps overlaps with f below?)

b.    What is the constituent's concentration?

c.    Where was the sample taken?

d.    When was the sample taken?

e.    What is the type of water source? (e.g., waterbody type; surface water versus groundwater)

f.    Is there co-occurrence with other chemical, physical or microbiological parameters?

g.   Why was the sample collected?

h.   How was the sample obtained?

i.   What is the level of confidence in the reported results for the range of methods used, including, at a minimum, QA/QC data? (And how this would apply to derived measures or metrics?)


(2)        Identifying the originating organization (i.e., allowing the possibility of request for additional data) (or changes in previous data entries)



Some points were then raised ....


Some were concerned that the Committee should define its purpose strictly as identifying a set of CORE elements (especially for data elements). If the list were too large (for instance, some draft materials would produce a list of about 63 data elements), this would just put people off and make it harder to get buy-in. Others thought this was a reasonable idea as an ultimate goal, but at this time it was perhaps better not to worry too much about immediately pruning down the list(s). The draft list(s) should be compared against a set of actual major data systems to see how to pick a cut-off point.  Some felt that certain existing lists/systems were already fairly close to a set of core elements (e.g., maybe NCOD?).


It was noted that lists of elements/criteria should make it easy to tell if a site is a "clean" site (e.g., a relatively undisturbed "reference" site).  The aim also seems to be to include both "natural" waters and treated water.



A set of “minimum data elements” was prepared by the USGS based on their NWIS data system.  This data element summary is included below ...



# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #

Glenn G. Patterson, Hydrologist               E‑mail:

U. S. Geological Survey, WRD                Voice:  (703) 648‑6876

412 National Center                    Fax:    (703) 648‑5722

Reston, VA 20192                      www:

# # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # # #






(July 1, 1999)




1. Site name


Definition:         Name of sampling site

Examples:         Pine Creek near Hillsboro; Mill Valley Municipal Well no. 3;

Davis Spring near Simpsonville, Sam Turner domestic well


2. Latitude


3. Longitude


4. Lat‑Long accuracy


Definition:  Numeric code to designate the degree of accuracy of the latitude-longitude values



5. Site type


Definition:         The location type for the site

Examples:         Stream, well, spring, distribution system, lake, ocean,



6. Primary use of site


Definition:         Primary reason for the site's existence, or primary use of

the water

Examples:         Contamination monitoring well, public supply well, streamflow



7. Raw or treated


Definition:         Designation as to whether the sampled water is raw or treated





8. Depth to top of uppermost open interval


Definition:         Distance, in feet, from top of casing to top of the uppermost screen or open interval


9. Depth to bottom of lowermost open interval


Definition:         Distance, in feet, from top of casing to bottom of lowermost

screen or open interval


10. Elevation of top of casing


11. Elevation accuracy





1. Sampling entity identification


Definition:  Name of the organization responsible for obtaining the sample


Example:  U.S. Geological Survey


2. Begin date


Definition:         Date withdrawal of water for this sample commenced


3. End date


Definition:         Date withdrawal of water for this sample ended


4. Sampling depth


Definition:         Depth, in feet below top of casing or land surface, to the point at which the sample was withdrawn, or designation for a depth-integrated sample



5. Reason for sampling


Definition:         Reason for taking the sample

Examples:         Contaminant monitoring, State ambient network, compliance

monitoring, trip blank


6. Sampling method


Definition:         Method used to obtain the sample

Examples:         Automatic sampler, grab sample, 24‑hour composite, isokinetic






1. Filtered or unfiltered


2. Parameter name/code


3. Parameter  value and units


4. Data‑quality indicator


Definition:         Code to designate accuracy of the parameter value


5. Date of analysis


6. Name of lab


7. Method code


8. Minimum detection level



The Committee then went element-by-element through a table of proposed data elements.  Comments on some of the items are indicated in yellow highlighting.  In the last column of the table, Committee members filled in applicable SELECTION CRITERIA.  For instance, an entry such as "1c" would refer to the criterion (see list above): "Where was the sample taken?"  Examples of actual data systems were also added in this column. The original table contained 47 data element "rows."  Additional data elements drawn from other lists or materials were discussed, which would bring the total number of data elements up to around 63.  A few of these "extra" items are noted at the end of the table.






Discussion Draft  1/22/99

 Recommended Data Elements


Water Quality Monitoring Results



* ITFM Appendix M

Recommended Data Elements




Check-off Column

(refer to attached criteria)




Data Element




Related Reference Name


Write in applicable criteria




Sampling Station/Facility Identification Number


The code used to identify each sampling station/facility.  The code begins with the standard two-character postal State abbreviation; the remaining seven (?) characters are unique to each sampling station/facility.  The same identification number must be used consistently throughout the history of monitoring to represent the sampling station/facility.


Site name, Site number







Sampling Station/Facility Type


The location type represented by the sample.  The valid choices are:

(a)  Finished/treated drinking water

       (i)  Finished Water from treatment system

       (ii)  Entry Point to the distribution system after treatment


       (iii)  Within the Distribution System

       (iv)  End of the Distribution line with longest

               residence time

       (v)  Household/drinking water tap

       (vi) Unknown

       (vii) Other

(b) Ambient/Raw/untreated water

      (i) Dedicated monitoring station

(A) Stream

(B) Lake

(C) Wetland

(D) Reservoir

(E) Ocean-coastal

(F) Ocean-open

(G) Well

(H) Spring

(I) Precipitation

     (ii) Temporary monitoring station

(A) Stream

(B) Lake

(C) Wetland

(D) Reservoir

(E) Ocean-coastal

(F) Ocean-open

(G) Well

(H) Spring

(I) Precipitation

(c) Biological

     (i) Point

     (ii) Transect


Site type











Reason for Sample Collection



(a) Safe Drinking Water Act - Regulated Contaminant Compliance

(b) Safe Drinking Water Act - Unregulated Contaminant

(c) Clean Water Act

     (i) Routine Compliance

     (ii) Pollution Event

     (iii) Storm Event

(d) Comprehensive Environmental Response, Compensation, and Liability Act

       (i) Reconnaissance

       (ii) Routine Compliance

       (iii) Pollution Event

(e) Resource Conservation and Recovery Act

       (i) Reconnaissance

       (ii) Routine Compliance

       (iii) Pollution Event

(f) Federal Insecticide, Fungicide and Rodenticide Act

       (i) Reconnaissance

       (ii) Routine Compliance

       (iii) Pollution Event

(g) Food Quality Protection Act


(h) National Water Quality Assessment (US Geological Survey)

       (i) Reconnaissance

       (ii)Pollution event

       (iii) Storm Event

(i) State Water Quality Assessment (Include pick list of States)

       (i) Reconnaissance

       (ii)Pollution event

       (iii) Storm Event

(j) Research

(k) Volunteer

(l) Other




















Water Source Type

(Perhaps collapse into #2 ???)


The source type represented by the sample.  The valid choices are:

(a)  Surface water.

(b)  Ground water.

(c) Precipitation/Atmospheric


Water body type







Water Body/Aquifer Name


Name of the lake, stream, river, estuary, aquifer or other water feature related to the physical site.


Water Body Name/

Aquifer Name








River Reach Code


Code representing a section of a river or stream defined by the components of the River Reach File 3 (RF3) file.


USEPA River Reach Code







Sample medium code*


Alphanumeric code that designates the environmental material about which results are reported from either direct observation or collected samples; for example, water, tissue, and/or sediment.


Sample medium code







Substrate Code*


Code that represents the material to which sessile organisms are attached.


Substrate Code






Sample Identification number

(Perhaps could be “created” by combining other data elements ??)


A unique identifier assigned by the sampler (or sampling organization) for each sample.


Sample Number








Sample Collection Date


Date sample was collected, reported as two digit month, two digit day and four digit year.


Collection end date










Sample Collection Method


The method used to collect the sample.


Sample Collection Method









Sample Collection Depth


The depth at which a sample was collected from a well or other source of water.


Sample Depth






Collection Depth Unit of Measure


The Unit of Measure (UOM) for the depth at which a sample was collected from a well or other source of water.








Number of Samples Composited


Indicates the number of samples combined to produce the composite sample.









Sample Type


The type of sample collected.  Permitted values include:

(a)  Reference Sample

(b)  Field Sample (standard sample)

(c)  Confirmation Sample

(d)  Field Blank

(e)  Equipment Blank

(f)  Split Sample

(g)  Replicate Sample

(h)  Spiked Sample


QC Sample type






Taxonomic Key*


Alphanumeric designation for the unique, official scientific name of a biological organism and its position in the taxonomic nomenclature hierarchy.


Taxonomic Key






Biological part code*


Alphanumeric code that designates the identification of the specific anatomical part of an organism that is being measured; for example, liver, heart, cell wall, or whole organism.


Biological part code






Latitude of Sampling Station/Facility


The location of each source intake, well or wellfield centroid, and treatment plant associated with a sample expressed as decimal degrees








Longitude of Sampling Station/Facility


The location of each source intake, well or wellfield centroid, and treatment plant associated with a sample expressed as decimal degrees
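Since latitude and longitude are both to be expressed as decimal degrees, station locations recorded as degrees/minutes/seconds would need conversion. A minimal sketch of that conversion; the function name, signature, and sign convention here are illustrative assumptions, not taken from the minutes:

```python
def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert degrees/minutes/seconds to signed decimal degrees.
    Southern and western hemisphere values are reported as negative."""
    dd = degrees + minutes / 60 + seconds / 3600
    return -dd if hemisphere in ("S", "W") else dd

# A station at 38°56'42" N, 77°27'00" W:
# dms_to_decimal(38, 56, 42, "N")  ≈ 38.945
# dms_to_decimal(77, 27, 0, "W")   ≈ -77.45
```

The Latitude/Longitude Method element below would then record which procedure and reference datum produced the values.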








Latitude/Longitude Accuracy*


Quantitative measurement of the amount of deviation from true value present in a measurement that describes the correctness of a measurement.



Longitude Accuracy






Latitude/Longitude Method*


Procedure used to determine the latitude and longitude, includes the reference datum.



Longitude Method*








Altitude*


Vertical distance from the National Reference Datum to the land surface, reference mark, or measuring point at the site (feet or meters).








Altitude Method*


Method used to determine the altitude value, including the National Reference Datum on which the altitude is based.


Altitude Method






Bottom Depth*

(combine with 24 ??)


Depth of  water column at station, measured from the surface of the water to the sediment/water interface.


Bottom Depth






Well Depth*


Depth of the completed well below the land surface, in feet or meters.


Well Depth






Well open interval, bottom*


Bottom of the open or screened interval of the well (feet or meters below land surface).


Well open interval, bottom








Well open interval, top*


Top of the open or screened interval of the well (feet or meters below land surface).


Well open interval, top









(or Constituent)


Need to be able to indicate filtration method(s)


The contaminant for which the sample is being analyzed.












Analysis Date


(Can sometimes differ from the sample date)


Date that the analysis was completed in 2‑digit month, 2‑digit day, and four digit year.


Analysis End Date







Analytical Results - Sign


An alphanumeric value indicating whether the sample analysis result was:

(a)  (<) less than means the contaminant was not detected according to the required minimum reporting level (i.e., MRL) at the time of analysis.

(b)  (=) equal to means the contaminant was detected at or above the required minimum reporting level (i.e., MRL) at the time of analysis.


Value Qualifier(s)
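The sign convention above can be captured in a small helper. This is a hedged sketch of how a data system might apply it; the function name, and the choice to report the MRL itself alongside the "<" sign, are our assumptions, not stated in the minutes:

```python
def qualify_result(value, mrl):
    """Return a (sign, reported value) pair for an analytical result.
    Below the minimum reporting level (MRL), report '<' together with
    the MRL itself rather than the raw instrument reading (assumed
    convention); otherwise report '=' with the measured value."""
    if value < mrl:
        return ("<", mrl)
    return ("=", value)

# qualify_result(0.3, 0.5) -> ("<", 0.5)
# qualify_result(1.2, 0.5) -> ("=", 1.2)
```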







Analytical Result - Value


The actual numeric value of the analysis.

To include Density for microbes.











Unit of Measure


The unit of measurement for the analytical results reported. (e.g., µg/L, pCi/L, CFU/mL, etc.)










Analytical Method Number


The method number of the analytical method used.


Analytical Method; Method References










Detection Level


Detection level refers to the detection limit, which applies to both method and equipment.  Detection limits are the lowest concentration of a target analyte that a given method or piece of equipment can reliably ascertain and report as greater than zero (e.g., Instrument Detection Limit, Method Detection Limit, Estimated Detection Limit).


Detection Level Value










Detection Level Unit of  Measure


(Can get very involved; for instance, USGS uses their LTMDL approach)


The measurement units used to express the concentration, count, or other value of a contaminant level.

(e.g., µg/L, pCi/L, CFU/mL, etc.)









Detection Level Method*


Method for determining the detectable quantity of a constituent on the basis of laboratory conditions, analytical method, and/or field conditions.


Detection Level Method







Reporting Level


If the lowest numerical value that a laboratory can reliably report for a test result, based on the laboratory's experience with the method and equipment, is different from the Detection Level, then it should be reported as the Reporting Level.










Reporting Level Unit of Measure


The measurement units used to express the concentration, count, or other value of a contaminant level.

(e.g., µg/L, pCi/L, CFU/mL, etc.)









Analytical Precision


Precision is the degree of agreement among a set of repeated measurements and is monitored through the use of replicate samples or measurements.  Precision is expressed as:

(a)     Standard Deviation (SD)

    SD = sqrt[ Σ(xi ‑ avg x)² / (n ‑ 1) ]

(b)  % Relative Standard Deviation (RSD),

    % RSD = (SD / mean concentration) x 100  , or

(c)  Relative Percent Difference (RPD),

    RPD = [(X1 ‑ X2) / {(X1 + X2)/2}] x 100


Precision of Value
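For illustration, the three precision statistics above can be computed directly from replicate results. This is a sketch, not part of the minutes; the function names are ours:

```python
import math

def std_dev(results):
    """Sample standard deviation: sqrt(sum((xi - mean)^2) / (n - 1))."""
    n = len(results)
    mean = sum(results) / n
    return math.sqrt(sum((x - mean) ** 2 for x in results) / (n - 1))

def pct_rsd(results):
    """% Relative Standard Deviation: (SD / mean concentration) * 100."""
    return std_dev(results) / (sum(results) / len(results)) * 100

def rpd(x1, x2):
    """Relative Percent Difference between duplicate results."""
    return (x1 - x2) / ((x1 + x2) / 2) * 100

# Replicates [10.0, 10.2, 9.8]: SD ≈ 0.2, % RSD ≈ 2.0
# Duplicates 10.2 and 9.8: RPD ≈ 4.0
```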








Analytical Accuracy


Accuracy is a measure of confidence in a measurement and can be assessed by calculating:

(a)   % deviation

    % deviation = [(average x - true value) / true value] x 100; or

(b)  % recovery (Rec)

    % Rec = [(amt. found in spiked sample ‑ amt. found in unspiked sample) / amt. spiked] x 100

Accuracy describes how close a result is to the true value measured through the use of spikes, surrogates, standards, or PE samples.


Bias of Value
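A matching sketch for the two accuracy measures, assuming (as is conventional for spike recovery) that the % recovery denominator is the known amount spiked into the sample; function names are illustrative:

```python
def pct_deviation(avg_measured, true_value):
    """% deviation of the mean measured value from the known true value."""
    return (avg_measured - true_value) / true_value * 100

def pct_recovery(spiked_result, unspiked_result, amount_spiked):
    """% recovery of a known spike added to a sample aliquot."""
    return (spiked_result - unspiked_result) / amount_spiked * 100

# pct_deviation(9.5, 10.0)          ≈ -5.0   (5% low bias)
# pct_recovery(18.0, 10.0, 8.0)     ≈ 100.0  (full spike recovered)
```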










Chemicals: Presence- a response was produced by the analysis (i.e., greater than or equal to the MDL but less than the MRL)/Absence- no response was produced by the analysis (i.e., less than the MDL).

Microbes:  Presence- Indicates a response was produced by the analysis /Absence- indicates no response was produced by the analysis.










Sample Analysis Average Period


(Delete ??? — related to the analysis, not the data)


Indicates the period over which a running average was calculated:

(a) year

(b) Quarter

(c) month

(d) week

(e) daily

(f) hourly









Sample Analysis Measurement type


(Delete ??? — related to the analysis, not the data)


(a) Direct Measure

(b) Arithmetic Mean

(c) Running Average

(d) Percentile


Result Type







Sample Result Valid Indicator


Indicates whether the sample met all Quality Assurance and Quality Control Standards








Study/Report Reference


(Published Methods ??)


Title, Reference Number, Author(s), Date, Address to receive copy of study/report




Keep but modify







(How to handle mobile "field units"???)


Laboratory conducting the testing, including:

Laboratory name and identification number



Analyzing Lab







Laboratory Address


The laboratory location including: Laboratory address, city, state, zip code, telephone number










FIPS information or ECOREGION











HUC; Habitat Info












Sample batch ID; Information on sample spiking methods












Sampling entity; filter pore size; source for published method











Sample preservation








See C. Job for additional information on items 48-63, added at the last minute.

                                              Additional Discussion and Summary Notes


The workgroup covered ways to combine a set of proposed data elements with a set of listing criteria.  The goal is to arrive at a core set of data elements.  Eventually, any "consensus" set of data elements will need to be related to elements included in NEMI.  The final set of core elements should be kept as "lean" as possible to make it easier to "sell" the idea to organizations operating major data systems or to other groups setting up new data systems.  To make sure there is a good rationale for including (or not including) items in a core list of data elements, it would be useful to carry out a comparison of any proposed lists with the content of several major data systems.


A comparison of proposed data elements with major data systems should help reveal what is involved in sharing data between these systems.  This comparison would highlight challenges involved in reconciling any new lists with established data standards.  These themes of data sharing and the proper application of data standards are related to the larger concerns of the Council.  There was consensus that there is a great need to achieve better use of existing information: a set of data criteria or standards should not always be equated with starting from scratch to gather new data.  Performance-based methods/standards should emphasize the effective use of existing data systems wherever possible.


In carrying out summaries of existing major data systems, a table along the following lines might be helpful:






















Criterion                                    (data system 1)       (data system 2)

What is being measured?                      Sample Location       Sample Location
                                             Substrate Type        Substrate Type

What is the constituent concentration?



































