U.S. Department of Labor
Employment & Training Administration


Frequently Asked Questions on Current Performance Initiatives

[Common Measures] [Reporting] [Data Validation]




Common Measures

Question:What are common measures?
Answer:Common measures are a management tool. They are a starting point for conversations about similar training and employment activities, based on the core purposes of the workforce system. Key attributes include:
  • Universal language and standardized data.
  • Employment-focused measures for adult programs, and skill attainment measures for youth programs.
  • Designed in partnership with other federal employment and training agencies.

---------------------------------------------------------------------
Question:Why do we need common measures?
Answer:We see four key benefits from common measures:
  • Focus on the core purposes of the workforce system.
  • Break down barriers to integration resulting from different definitions, data and reports for each workforce program.
  • Resolve questions raised by GAO and other oversight agencies regarding consistency and reliability of data.
  • Reduce confusion among our customers and stakeholders who want to know about results.

---------------------------------------------------------------------
Question:What Federal agencies are implementing the common measures?
Answer: In addition to programs administered by ETA, the following Federal programs are participating in the common measures initiative:

Department of Labor
Veterans' Workforce Investment Program
Disabled Veterans' Outreach Program
Local Veterans' Employment Representatives
Homeless Veterans' Reintegration Program

Department of Education
Adult Education
Rehabilitation Services:
  • Vocational Rehabilitation Grants to States
  • American Indian Vocational Rehabilitation Services
  • Supported Employment State Grants
  • Projects with Industry
  • Migrant and Seasonal Farmworkers
  • State Grants for Incarcerated Youth Offenders

Vocational Education (Carl Perkins):
  • State Grants
  • Tech Prep State Grants
  • Tribally-Controlled Postsecondary Vocational Institutions

Department of Health and Human Services
Temporary Assistance for Needy Families

Department of Veterans Affairs
Vocational Rehabilitation and Employment Services

Department of the Interior
Job Placement and Training

Department of Housing and Urban Development
Youthbuild
---------------------------------------------------------------------

Question:Will all the Federal agencies use the same methodologies to calculate the common measures?
Answer:All the agencies will generally use the same methodologies to calculate the common measures, but some differences will occur because of program design or reporting practices. For example, ETA will use the third quarter after exit as the measurement point for the attainment of a degree or certificate measure, while the Department of Education will use one year following program exit.
---------------------------------------------------------------------
Question:Will the common measures be the only information collected on ETA programs?
Answer:No. Measurement of performance, and management and oversight of programs, will continue to require the collection of information that is relevant and important to each discrete program. Further, some statutes explicitly state what information must be collected, and the common measures do not supersede those statutory requirements.

ETA will continue to collect all the data on program activities, participants, and outcomes that is necessary to convey full and accurate information on the performance of workforce programs. The collection of information beyond that required for common measures will also help frame and provide context to the outcomes reported through these measures.
---------------------------------------------------------------------

Question:How will required levels of performance be determined?
Answer:Implementation of the common measures will standardize the measurement of performance, but requirements for meeting performance targets or goals will be implemented in accordance with provisions for each separate program. Under WIA, for example, incentives and sanctions are currently based on the performance measures in Section 136 of the Act, not the common measures.


---------------------------------------------------------------------

Question:Who is a program participant?
Answer:Individuals who are determined eligible and receive any service funded by the program in a physical location (e.g., a One-Stop career center) are participants. The criteria that are used to determine whether an individual is eligible to participate will be based on the guidelines for the program.
---------------------------------------------------------------------
Question:Can an individual be a participant in more than one program?
Answer:Yes. Individuals can be participants in several different programs, either sequentially or concurrently. An individual is a participant in every program that funds a service the individual receives.
---------------------------------------------------------------------
Question:Are individuals who access services through the Internet considered to be participants?
Answer:States and grantees may choose, but are not required, to collect the information necessary to consider as participants those individuals who receive services that are available through the Internet and are not accessed at the physical location of the program (e.g., a One-Stop career center).
---------------------------------------------------------------------
Question:When do individuals count in the common performance measures?
Answer:All program participants will be taken into account when measuring performance using the common measures, although not all participants are included in every measure.
---------------------------------------------------------------------
Question:Currently individuals receiving only self-service or informational services are not included in the WIA core indicators of performance. Why is ETA establishing a different population to be counted in the common measures?
---------------------------------------------------------------------
Answer:The concept of program participation establishes a common policy for ETA programs regarding the population to be included in the common measures. By including even participants who receive only self-service and informational services, we will have a more accurate picture of the number of individuals being served under WIA. In addition, the policy responds to criticisms from the Office of Management and Budget and the General Accounting Office that current WIA registration policy has been implemented inconsistently across states.
---------------------------------------------------------------------
Question:What is the definition of program exit?
Answer:Exit occurs when a participant does not receive a service funded by the program or funded by a partner program for 90 consecutive days.

ETA will no longer use the concept of "hard exit," although case managers will likely continue to track completions and terminations from a program. The current WIA exit policy has been inconsistently implemented across the country. Comparability of performance information across states and grantees is only possible if a common point in time is used to begin measurement.
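As an illustration of the 90-day exit rule described above, the sketch below determines an exit date from a participant's service history. The list-of-dates representation and function name are our own simplifying assumptions, not part of the reporting specification:

```python
from datetime import date

def exit_date(service_dates, as_of):
    """Return the program exit date under the 90-day rule, or None.

    A participant exits when no program- or partner-funded service is
    received for 90 consecutive days; the exit date is then the date of
    the last service received. `service_dates` lists the dates on which
    any funded service was received (an illustrative simplification).
    """
    if not service_dates:
        return None
    last_service = max(service_dates)
    # Exit is only determined once 90 service-free days have elapsed.
    if (as_of - last_service).days >= 90:
        return last_service
    return None

# Last service on Jan 10; by mid-April the 90-day gap has passed.
services = [date(2004, 1, 5), date(2004, 1, 10)]
print(exit_date(services, date(2004, 4, 15)))  # -> 2004-01-10
```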


---------------------------------------------------------------------

Question:How are underemployed individuals treated in the entered employment measure?
Answer:Individuals who are employed at participation are not counted in the entered employment measure. The definition of employed at participation is similar to the definition currently used for the WIA entered employment indicator: individuals who did any work at all as paid employees during the seven consecutive days prior to the date of program participation (except those who have received a notice of termination) are considered employed. Employment status at the date of participation is based on information collected from the individual.
---------------------------------------------------------------------
Question:How will the outcomes for dislocated workers be calculated?
Answer:For the entered employment measure, an individual is considered to be not employed at the date of program participation if:
  1. He or she has received a notice of termination of employment; or
  2. His or her employer has issued a WARN or other notice that the facility or enterprise will close.

For the earnings gain measure, the pre-program earnings of dislocated workers will be one quarter prior to the date of participation, regardless of the dislocation date. ETA will continue to collect data on wages at the point of dislocation for program management purposes.


---------------------------------------------------------------------

Question:What is ETA trying to accomplish with the new certificate definition in the attainment of a degree or certificate measure?
Answer:The definition is intended to ensure that certificates are:
  1. Developed or endorsed by employers;
  2. Tied to specific technical/occupational skills; and
  3. Approved or awarded by an appropriate educational, governmental, or employer entity.

ETA has made every attempt to use clear language in the definition to ensure consistent application of the measure across states and grantees. ETA worked very closely with the Department of Education to develop the certificate definition.
---------------------------------------------------------------------

Question:Does the certificate definition include certificates awarded by private career schools licensed by a state?
Answer:Some of the certificates offered by state-licensed private career schools will not meet the criteria outlined in the certificate definition. This issue generally affects schools that have not been accredited by accrediting agencies recognized by the Department of Education.

Setting a high standard and using a tight definition is necessary for a meaningful and consistent outcome measure and to assess the workforce system's ability to provide individuals with the credentials needed to succeed in the labor market.
---------------------------------------------------------------------

Question:Do Individual Educational Plan (IEP) diplomas and other alternative secondary school exit documents awarded to youth with disabilities count as a degree in the attainment of a degree or certificate measure?
Answer:No. IEP diplomas are not recognized as equivalents to high school diplomas or GEDs for Federal reporting purposes or by state governments. Information on attainment of IEP diplomas and other alternative exit documents will be collected from states and grantees, but they will not count as a positive outcome for this measure.
---------------------------------------------------------------------
Question:How will gains in literacy and numeracy skills be measured?
Answer:To maintain consistency with the implementation of the common measures by the Department of Education (ED), ETA is adopting the outcome measure of educational gain outlined in the National Reporting System (NRS). The NRS was developed by ED's Division of Adult Education and Literacy to implement an accountability system for federally funded adult education programs under WIA.

As outlined in the NRS, there are two sets of educational functioning levels - six levels for Adult Basic Education (ABE) and six levels for English-as-a-second language (ESL) students. Each ABE and ESL level describes a set of skills and competencies that students entering at that level demonstrate in the areas of reading, writing, numeracy, speaking, listening, functional, and workplace skills. These descriptors provide guidelines for placing participants in educational functioning levels, based on performance on standardized tests.

To achieve a positive outcome for the measure, a participant who was basic skills deficient must demonstrate through post-test that he/she has advanced one or more educational functioning levels beyond the level in which he/she was initially placed at pre-test.
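The pre-test/post-test comparison above can be sketched in a few lines. The EFL numbering (1 through 6, per the NRS) is from the source; the function name is our own illustration:

```python
def literacy_numeracy_gain(pretest_efl, posttest_efl):
    """Positive outcome if the participant advanced at least one
    educational functioning level (EFL) from pre-test to post-test.
    EFLs run 1 (lowest) through 6, per the NRS. Illustrative sketch."""
    return posttest_efl >= pretest_efl + 1

print(literacy_numeracy_gain(2, 3))  # True: advanced one level
print(literacy_numeracy_gain(4, 4))  # False: no advancement
```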
---------------------------------------------------------------------

Question:Do the Educational Functioning Levels equate to two grade levels?
Answer: There are six educational functioning levels (EFL), and each level roughly equates to two grade levels.

Youth who are basic skills deficient will have to increase one or more EFLs to achieve a positive outcome on the literacy and numeracy gains measure. However, this does not mean that every youth will need to increase the equivalent of two grade levels to be counted as a positive outcome in the measure. Under a normal distribution of pre-test scores, most participants' scores will place the individuals in a range where they have completed some of the skills in that particular EFL. Therefore, for a majority of participants, a positive outcome is not likely to require the equivalent of completing two full grade levels.
---------------------------------------------------------------------

Question:Are providers required to use a specific test to assess literacy and numeracy skills?
Answer:The assessment test used to determine literacy and numeracy skills must be cross-walked to the Educational Functioning Levels (EFL) outlined in the Department of Education's National Reporting System. Currently, seven tests have been cross-walked to the EFLs: CASAS, TABE, ABLE, AMES, and WorkKeys, plus SPL and BEST for Limited English Proficiency individuals. A different test can be used, but it must be cross-walked to the levels.
---------------------------------------------------------------------
Question:Can test results from schools be used for the literacy and numeracy gains measure?
Answer:Yes. Test results provided by schools may be used to calculate the literacy and numeracy gains measure.

However, the assessment test used by the school must be cross-walked to the Educational Functioning Levels (EFLs) outlined in the Department of Education's National Reporting System. In addition, the same assessment must be used for both the pre-test and the post-test.
---------------------------------------------------------------------

Question:Is there a waiver for counting youth with disabilities in the literacy and numeracy gains measure?
Answer:Youth with disabilities are not excluded from the literacy and numeracy gains measure. When administering assessment tests, individuals with disabilities should be accommodated according to:
  1. Section 188 of WIA;
  2. Guidelines associated with the assessment test; or
  3. State law or policy.

The Department of Labor's Office of Disability Employment Policy feels strongly that individuals with disabilities should be included in the common measures. It is important that individuals with disabilities experience full program participation, including tracking their progress in attaining literacy and numeracy skills.


---------------------------------------------------------------------

Question:Will the efficiency measure be applied at the state or grantee level?
Answer:The efficiency measure will be calculated at the Federal level to determine an efficiency outcome for programs as a whole.

In addition, ETA will review efficiency levels as applied at the state and grantee levels based upon the level of program funds awarded. In practical terms, this means that ETA and states/grantees will track the number of participants.
---------------------------------------------------------------------

Question:How will the efficiency measure be used by the Department of Labor and other Federal policymakers?
Answer:The efficiency measure will likely be used as a budgeting tool at the Federal level. At the grantee level, the number of participants will be tracked and monitored.
---------------------------------------------------------------------
Question:Why is the appropriation level used to calculate the efficiency measure rather than the level of expenditures?
Answer:The efficiency measure allows Congress and other Federal policymakers to make high-level assessments of the utilization of funds by programs based upon the number of individuals receiving program services. It will frame program funding alongside program achievements, but it does not attempt to track specific expenditures.
---------------------------------------------------------------------
Question: Does the efficiency measure take into consideration non-Federal resources?
Answer:No. The efficiency measure is calculated by dividing the Federal program appropriation by the number of program participants.
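The calculation just described is a simple division; as a sketch with hypothetical figures (the dollar and participant counts below are illustrative, not actual program data):

```python
def efficiency_measure(federal_appropriation, participants):
    """Cost per participant: the Federal program appropriation divided
    by the number of program participants. Non-Federal resources are
    not included in the numerator. Figures used below are hypothetical."""
    return federal_appropriation / participants

print(efficiency_measure(900_000_000, 450_000))  # -> 2000.0 per participant
```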
---------------------------------------------------------------------
Question:Is the efficiency measure intended to place a greater value on lower cost services?
Answer:No. The efficiency measure is not intended to place greater value on low-cost services or to discourage the provision of services to individuals facing significant barriers to employment. It does, however, allow comparison of performance with program purpose, and comparison between grantees within a program.


---------------------------------------------------------------------

Question:Will supplemental sources of information be allowed to calculate the common measures?
Answer:Wage records are the data source for the employment-related common measures.

However, mechanisms are not currently in place to provide access to wage records for grantees operating the following programs: H-1B Technical Skills Training, Job Corps, Migrant and Seasonal Farmworkers, Native American Employment and Training, Responsible Reintegration of Youthful Offenders, and Senior Community Service Employment Program. For these grantees, supplemental sources of data, such as participant surveys or contacts with employers, will be permitted as an interim means of reporting until all grantees in a program have access to wage records.
---------------------------------------------------------------------

Question: Will supplemental sources of information continue to be collected and reported by states and grantees?
Answer:Yes. Supplemental data will be used for program management purposes and to gain a full understanding of program performance and activities.


---------------------------------------------------------------------

Question:When will the common measures be implemented?
Answer:There are several steps to full implementation of the common measures:
  • We will publish new reporting requirements for comment in early 2004.
  • These new reporting requirements will be final around mid-2004.
  • Then we will provide guidance for necessary technical changes and begin a transition period with system-wide staff training.

The WIA, Labor Exchange, and Trade programs will be first. The National Programs and discretionary-funded grants will follow as logistical issues are ironed out.
---------------------------------------------------------------------

Question: How will the common measures be implemented?
Answer:The actual implementation of common measures will be through new reporting instructions.

The next generation performance information system will replace all current reporting requirements with a single, comprehensive system.

The new reporting requirements will apply to WIA (including state programs and National Emergency Grants), Employment Service, VETs, Trade, Migrants, Native Americans, Older Workers, H-1B Skill Grants, and other training and employment grants.
---------------------------------------------------------------------

Question:Does the implementation of the common measures depend upon WIA reauthorization?
Answer:No. ETA will implement the common measures as part of its new record keeping and reporting system. However, the WIA reauthorization bills passed by the Senate and the House of Representatives both include some parts of the common measures.
---------------------------------------------------------------------
Question: Will states be required to collect data and report on both the existing WIA measures and common measures in PY 2004?
Answer:The new record keeping and reporting guidelines propose collection of data that will allow computation of performance according to current statutory criteria or the common measures.
---------------------------------------------------------------------
Question: What measures will be used for accountability purposes for WIA programs in PY 2004?
Will ETA negotiate state WIA performance goals for the common measures for PY 2004?
Answer:Accountability under WIA is established within the Act in Section 136. ETA will negotiate WIA performance goals for PY 2004 in accordance with the statutory provisions in effect on April 1, 2004.

(Note: Even with reauthorization scheduled for early in 2004, there will likely be a transition period for implementation of any new or different provisions, including performance measures.)




Reporting System Overview

Question:What is the foundation of the proposed reporting system?
Answer:Individual records. ETA needs to be able to discuss the program components of the workforce system both in terms of what is currently going on in the system (i.e., who is being served and what services are being provided) and in terms of the outcomes the system achieves (i.e., what happens after the services are provided).

We propose to collect a core set of information (demographic characteristics, service provision information, and common measure components) for all participants in all programs. These core elements will be defined consistently across all programs. Additionally, there are program-specific elements that will be collected.
---------------------------------------------------------------------

Question: Does submitting individual records replace required reports?
Answer:No. While individual records help organize information and can provide the basis for some analysis, they cannot replace the official reports to be prepared and submitted by the recipient of federal funds.
---------------------------------------------------------------------
Question: Will ETA assist states and grantees in designing systems to calculate required reports?
Answer:Yes. Piggy-backing on the ongoing data validation initiative, ETA will provide states and grantees with software that calculates the aggregate reportable information from individual records compiled by the state or grantee. This software will also allow the grantees to run their own reports based on special populations they would like to examine, just as ETA will be able to do from the individual records.
---------------------------------------------------------------------
Question: What programs are involved?
Answer:All programs implementing the common measures will also implement the comprehensive reporting system.


---------------------------------------------------------------------

Question:When will guidance be released on the new reporting system?
Answer:ETA expects to publish a public comment notice in the Federal Register in January 2004. Office of Management and Budget clearance of the new reporting system is expected by the end of June 2004.
---------------------------------------------------------------------
Question: What is the timeline for implementation of the reporting system?
Answer:Implementation of the new reporting system will proceed once Office of Management and Budget approval has been received. Further, circumstances unique to each program will likely mean that not all programs will be expected to implement the new system on the same time frame. Adjustments for grantee cycles and degree of change between current and proposed systems will account for a sliding implementation schedule.
---------------------------------------------------------------------
Question: Will states and grantees receive additional funding to implement common measures and make adjustments to reporting systems?
Answer:No. Grantees will be expected to implement any new record keeping and reporting requirements with available funds.


Data Validation

Question:Can states use supplemental data to calculate workforce investment performance under the Department of Labor's common measures? (June, 2004)
Answer:

No. The Employment and Training Administration's Training and Employment Guidance Letter (TEGL) 15-03 indicates that Unemployment Insurance (UI) and other wage records will be the only allowable data sources for calculating the employment-related common measures. To maintain the integrity of common performance measures, ETA must ensure that the same data sources are used by each state to calculate outcomes. Self-employment is not covered under state UI systems. If supplemental data sources were allowed, the performance outcomes would not be calculated consistently across Federal programs.

ETA understands that the common performance measures provide only part of the information necessary to effectively oversee the workforce investment system. In order to convey full and accurate information on performance, ETA will continue to collect all data on program activities, participants, and outcomes. States should continue to collect and report supplemental data on employment outcomes of individuals not covered by UI wage records, such as the self-employed. Where permitted by state law, information on individuals who are self-employed that is obtained through record sharing or automated matching of state tax records may be used to calculate the common measures.

The common measures policy is intended to ensure the comparability of the measures on a national level, not to discourage self-employment or entrepreneurship. ETA understands that excluding supplemental sources of information, such as that on self employment, from the calculation of the common measures may cause reported performance outcomes to be somewhat lower than if these sources were used. However, by continuing to collect supplemental information, ETA and the states will be able to determine the precise impact this information has on the common measures outcomes, permitting the adjustment of performance expectations as appropriate and potentially avoiding the unintended consequences of the measurement system.
---------------------------------------------------------------------
Question:When must states and grantees complete data validation?
Answer: Data Validation Update:
Please be advised that we are in the process of finalizing the reporting instructions for having states submit their validation summary and analytical report outcomes for WIA, LX and TAA. We are also working with our contractor staff and our Office of Technology to develop the process for state agencies to electronically transmit those reports. Until ETA has these instructions finalized, state agencies will not be able to transmit such information to us.

TEGL 3-03, issued in August 2003, advised state agencies of the requirement to validate reports by April 1, 2004. We have received several requests for extensions of this due date and have referred such requests to the appropriate regional office. If you have questions, please feel free to contact Traci DiMartini at 202-693-3698 or Gail Eulenstein at 202-693-3013. (updated March 24, 2004)
---------------------------------------------------------------------
Question: The software is putting a 1 in the Occ Code and Skills Training Code fields (Reference numbers 29 and 39) on the worksheet and pass/fail screen even though there is no Occ/Skills training code in the extract file. How should the validator deal with this issue?
Answer: The validator should put a checkmark in the pass box for any Occ Code and Skills Training Code fields that display a "1." DOL is advising states to mark "PASS" for these elements. (March 2004)
---------------------------------------------------------------------
Question: The sample worksheets display 999999999 for Skills Training Code and Occupational Skills Training Code. This value means that no specific occupational skills training was received. How do we validate this value for these fields?
Answer: Validators should pass any Training Code and Occupational Skills Training Code fields that display 999999999. (March 2004)
---------------------------------------------------------------------
Question: When I import my reported counts or export the validation values for table L, I do not get the correct results for the dislocated worker "Entry into unsubsidized employment related to the training received of those who completed training-related services" measure.
Answer: We know about this problem, and will correct it in the maintenance release of the WIA validation software. This release should be available in late March. (March 2004)
---------------------------------------------------------------------
Question: The record layout in the software and the user's guide is missing element 152.
Answer: Users should still follow the record layout. The record layout is correct, even though it is misnumbered. We know about this problem and will correct it in a maintenance release of the WIA validation software which should be available in late March. (March 2004)
---------------------------------------------------------------------
Question: The Diploma Rate for the second quarterly report, due in February, is too high.
Answer: The date range for the diploma rate for the ETA 9090 report for the second quarter of the program year is incorrect. The correct numerators and denominators can be found in the performance outcome groups. The calculation will be corrected in the maintenance release of the WIA validation software. This release should be available in late March. (March 2004)
---------------------------------------------------------------------
Question: The skill attainment rate does not match your calculations.
Answer: ETA recognizes that the guidance and specifications for the younger youth skill attainment measure are vague, which can lead two people to calculate the measure in two different ways. States will not be held responsible if their values do not match the validation values. National Office program staff may provide additional policy guidance. (March 2004)
---------------------------------------------------------------------
Question: Are states that use SPRA or DART software to calculate WIA annual reports required to use MPR/SAIC software to satisfy reports validation?
Answer: Yes, unless a state has approval to use an alternative methodology. In TEGL 3-03, states were advised that they were not required to use MPR's software, but that if they chose a different validation methodology, they must notify their regional office or national grant program office of their intent and provide information supporting that the chosen methodology meets the current validation requirements. To date, no state has made such a request. (March 2004)
---------------------------------------------------------------------
Question: Can states submit separate records in the WIASRD for individuals enrolled in more than one funding stream? How does this affect data validation?
Answer: According to TEGL 14-00 Attachment E, change 1, "if an individual is served jointly by multiple WIA title I-B funding sources/programs (e.g., youth and adult funds), only one record should be submitted. However, all sections relevant to each funding source/program must be completed. If the individual is served independently by multiple funding sources or local areas, separate records may be submitted."

The validation software rejects duplicate records, using a duplicate detection routine that differs slightly from the WIASRD's. A duplicate is two or more records that have the same SSN/ID, date of exit, and funding stream. If two or more records match on all three of these variables, they are considered duplicates and rejected. Thus, for participants who are served by multiple funding streams, states may use either consolidated records or separate records for each funding stream.

States, however, should use consolidated records for participants who received services from one funding stream and two or more WIBs. Otherwise, the software will reject the records as duplicates.

As a result, the validation software does not automatically calculate Table O. To calculate Table O, states will need to load separate files for each local area. (March 2004)
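The duplicate test described above can be sketched as follows. This is an illustration of the rule, not the actual software; the dictionary field names are hypothetical, not the WIASRD layout:

```python
from collections import Counter

def find_duplicates(records):
    """Flag records that match on SSN/ID, date of exit, and funding
    stream -- the three fields the validation software compares."""
    key = lambda r: (r["ssn"], r["date_of_exit"], r["funding_stream"])
    counts = Counter(key(r) for r in records)
    return [r for r in records if counts[key(r)] > 1]

# One individual served by two funding streams may appear as two
# separate records; two identical Adult records are rejected.
records = [
    {"ssn": "123-45-6789", "date_of_exit": "2003-06-30", "funding_stream": "Adult"},
    {"ssn": "123-45-6789", "date_of_exit": "2003-06-30", "funding_stream": "Youth"},
    {"ssn": "123-45-6789", "date_of_exit": "2003-06-30", "funding_stream": "Adult"},
]
rejected = find_duplicates(records)  # the two Adult records
```

Because the Youth record differs on the funding stream field, it survives; only the two Adult records are flagged.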
---------------------------------------------------------------------

Question: Can the WIB number for each customer be included on the Error listing?
Answer: The maintenance version of WIA 2.1 will include separate error reports: by WIB, by office, and by case manager. This version should be available by late March. (March 2004)
---------------------------------------------------------------------
Question: Is there a quick way to find a customer after the data validation is completed?
Answer: The upcoming maintenance release for WIA software, scheduled for release in late March, will include a Find function so that users can find records based upon the Observation Number or SSN. (March 2004)
---------------------------------------------------------------------
Question: The TAA data validation worksheets and instructions require the validation of date of birth. TAA, however, does not require the collection of documentation for date of birth. What should I do?
Answer: The program office has decided that date of birth does not need to be validated. Validators should put a checkmark in the pass box for this field. We are meeting with all program offices to confirm the need for the current data elements, and will make necessary changes to the software prior to validation for PY03 reports. (March 2004)
---------------------------------------------------------------------
Question: A state reports it has double quotes around each of the fields in the extract file, and the software will not allow them to import the file. What should the state do?
Answer: Version 1.2 does not allow for double quotes in the extract file. The user should remove these quotes, and then the file will import. A new version of the TAA validation software, which will be released in late April, will allow for double quotes. (March 2004)
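Rather than removing the quotes by hand, a state could strip them with a short script. This is a sketch under the assumption of a simple comma-delimited extract; it uses Python's csv module so that any commas inside quoted fields are not split incorrectly:

```python
import csv
import io

def strip_quotes(text):
    """Re-emit a comma-delimited extract without the surrounding
    double quotes that version 1.2 of the TAA software rejects."""
    rows = csv.reader(io.StringIO(text))
    out = io.StringIO()
    writer = csv.writer(out, quoting=csv.QUOTE_NONE,
                        escapechar="\\", lineterminator="\n")
    writer.writerows(rows)
    return out.getvalue()

raw = '"1234","2003-06-30","TAA"\n"5678","2003-09-30","TAA"\n'
print(strip_quotes(raw))
# 1234,2003-06-30,TAA
# 5678,2003-09-30,TAA
```

The cleaned file can then be imported into the validation software as usual.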
---------------------------------------------------------------------
Question: Fields 30 and 43 (occupational code system) and Field 44 (Employed in the Third Full Quarter after Exit) on the layout and validation instructions do not match the TAPR. Which record layout should I follow?
Answer: The software uses the TAPR record layout. Consequently, the printed record layout is incorrect. Users should follow the format listed in the TAPR. (March 2004)
---------------------------------------------------------------------
Question: What happens if a record that was sampled for data element validation cannot be found? Or, what happens if a record for a participant who did not receive WIA Title I services is wrongly classified as a Title I recipient and then selected as part of the data element validation sample?
Answer: If individual records cannot be found or are wrongly included in the extract file and then sampled, the validator should fail the record for the registration date and not validate any other elements present on the worksheet. (March 2004)
---------------------------------------------------------------------
Question: What happens if a set of records cannot be found? For example, what happens if an office burns down or a contractor who has the records goes out of business and the records selected as part of the data element validation sample are not available?
Answer: The state should contact its regional representative to obtain approval to redraw the sample. Upon gaining approval, the state should create a new extract file that excludes the problematic office. This file should be imported into the software, and the sample that is selected should be validated.

Note that report validation should be performed using the original file that includes all the records. (March 2004)
---------------------------------------------------------------------

Question: When monitors go on site are they allowing staff to make changes to the documentation in the files?
Answer: Validators/Monitors should not allow staff to make changes to the documentation in the case files during the validation review in order to pass a particular item. (March 2004)
---------------------------------------------------------------------
Question: Who performs data validation, the state or the local area?
Answer: It is the state's responsibility to validate its program's annual report. (March 2004)
---------------------------------------------------------------------
Question: How do States request a deadline extension?
Answer: States should contact their regional representative to request an extension. (March 2004)
---------------------------------------------------------------------
Question: How do States submit the data?
Answer: We are working with SAIC/MPR to develop a process for states to electronically submit their data validation findings by uploading the file to EIMS using standard EIMS procedures. We anticipate having this process in place by mid-April.

In the interim, when states have finalized the validation, they should send an e-mail message to their regional representative with a copy to Gail Eulenstein with a brief narrative about the validation outcomes.

States may also export the report and data element validation results to PDF format and include them in an e-mail. To export a report to PDF, choose the export option (the envelope in the upper left corner of the screen), click OK, type in the filename, and save the PDF file. Contact MPR if additional assistance is needed. (March 2004)

If you have additional questions, call Gail Eulenstein or send an email to Eulenstein.gail@dol.gov.
---------------------------------------------------------------------

Question: When and how must states and grantees submit the results of data validation to ETA?
Answer: ETA will be requesting Office of Management and Budget (OMB) approval for reporting data validation results. A public comment notice will be placed in the Federal Register in January 2004. States and grantees are not required to report the results of data validation until this has been approved by OMB. (January 2004)
---------------------------------------------------------------------
Question: What training and technical assistance will be given to states and grantees?
Answer: ETA and Mathematica Policy Research (MPR) conducted regional training sessions for states for the WIA, Labor Exchange, and Trade Adjustment Assistance programs during summer 2003. In addition, several Web-based data validation seminars were presented to state staff in November and December 2003. Training for National Program grantees will be provided separately for each program as follows: MSFW and Native American in winter 2003/2004 and SCSEP in spring 2004. On-going technical assistance is available to states and grantees from MPR. States and grantees are encouraged to contact MPR for assistance to avoid unnecessary delays and problems when implementing data validation. (January 2004)
---------------------------------------------------------------------
Question: Will states and grantees receive additional funds to complete data validation?
Answer: No. The requirement to perform validation derives from states' and grantees' responsibility to provide accurate information on program activities and outcomes to ETA. States and grantees are expected to provide resources for conducting validation from their administrative funds. ETA has taken a number of steps to minimize the costs of data validation, including developing software that states and grantees can use, providing on-going technical assistance, and using a sampling approach that minimizes the number of participant case files that must be reviewed. (January 2004)
---------------------------------------------------------------------
Question: Will states and grantees be held accountable for performing validation and for the accuracy of data?
Answer: States and grantees will be required to validate data annually. Failure to complete data validation and submit the results to ETA will be deemed a failure to report, and subject to corrective action or sanction, as appropriate. States and grantees will be held accountable for meeting acceptable levels for the accuracy of data, once these standards are established. States and grantees that fail to meet accuracy standards will receive technical assistance from ETA and develop and implement a corrective action plan. In addition, data that do not meet accuracy standards may keep states and grantees from being eligible for incentive grants. Significant or unresolved deviation from accuracy standards may be deemed a failure to report. (January 2004)
---------------------------------------------------------------------
Question: When will standards for the accuracy of data be established?
Answer: Accuracy standards will not be established until the third year of validation. Therefore, there will be no sanctions based on discrepancies during the first and second years of validation. The first year of data validation is intended to be a learning experience and to focus on identifying and resolving problems with reporting systems. Data collected in the second year will be analyzed, and based on this information, accuracy standards will be established prior to the third year of validation. (January 2004)
---------------------------------------------------------------------
Question: How do states and grantees know what is the acceptable documentation for data elements?
Answer: The source documentation for each data element is described in the data validation handbooks that have been developed for each program. ETA will develop guidance on acceptable documentation for data elements as part of the implementation process. States and grantees will not be held accountable for data accuracy until the third year of validation. (January 2004)
---------------------------------------------------------------------
Question: Are electronic records of documents acceptable documentation?
Answer: Yes. Source documentation that has been scanned and is only available electronically is acceptable documentation for data elements. (January 2004)
---------------------------------------------------------------------
Question: Can states continue to use self-attestation for eligibility determination purposes?
Answer: Some states currently are using self-attestation to determine program eligibility. Validation in these states will therefore consist of verification of the oversight of the self-attestation process. Regional offices can assist with this effort. (January 2004)
---------------------------------------------------------------------
Question: Does the review of the sampled participant case files have to be conducted on-site?
Answer: Although highly recommended, the review of participant case files does not have to be conducted on-site. If states do plan to conduct the review at a central location, they should advise regional office staff of their procedures. (January 2004)
---------------------------------------------------------------------
Question: Will ETA conduct monitoring of states' and grantees' validation efforts?
Answer: Yes. After data validation is fully implemented, a monitoring guide will be developed to ensure consistency among all states and programs in conducting the validation process. (January 2004)
---------------------------------------------------------------------
Question: What validation software is currently available?
Answer: Validation software is currently available for states to complete validation for WIA, Labor Exchange, and Trade Adjustment Assistance programs. Software for the Migrant and Seasonal Farmworkers program is currently being piloted. Software for the Native American Employment and Training program and the Senior Community Service Employment Program is under development. (January 2004)
