National Evaluation of Welfare-to-Work Strategies

How Effective Are Different Welfare-to-Work Approaches?
Five-Year Adult and Child Impacts for Eleven Programs:

Appendix H:
A Comparison of Impacts Estimated from Survey and UI Earnings Data


Employment and earnings impacts in this report are estimated from statewide automated unemployment insurance (UI) earnings records and from responses to the Two-Year Client Survey and the Five-Year Client Survey. This appendix compares employment impacts from these sources over the full five years and during year 5 and investigates why they differ in some programs. The results demonstrate that the surveys sometimes recorded jobs that were missed by statewide UI earnings reporting systems and at other times underreported employment. Further, in some sites, program and control groups varied in the degree to which employment was underreported on the survey or in UI records.

I. Possible Reasons for Differences Between Survey and UI Earnings Data

Survey data are self-reported. They include jobs that are not covered by or not reported to the state UI system, such as self-employment, some domestic work, federal government or military jobs, informal employment, and out-of-state jobs. UI earnings data, however, may include jobs that respondents failed to recall or were reluctant to report on the survey. Survey respondents may also have had trouble recalling the start and end dates of some jobs, particularly those that began early in the follow-up period and lasted only a short time. On the other hand, some employers may have delayed reporting employment to the UI system until after the files for this report were created.(1)

Furthermore, the survey and UI earnings data presented in this report cover somewhat different time periods. UI earnings data were available for every quarter of follow-up, whereas the two survey interviews recorded employment less completely: the two-year survey collected information on up to six jobs during years 1 and 2, but the five-year survey recorded information only about respondents' current or most recent job.(2) In addition, the indicator of "ever employed during years 1 to 5" used in the report covers somewhat different follow-up periods depending on whether it is measured with UI earnings or survey data. UI data cover quarters 2-21 after random assignment, whereas the follow-up period for the survey data extended from the month of random assignment to the interview date, which for some respondents occurred several months after the end of year 5.(3) Moreover, year 5 survey impacts cover months 48-60, starting and ending slightly earlier than the year 5 follow-up for UI earnings for most sample members.(4)

II. Reporting Discrepancies for Sample Members with Both Survey and UI Earnings Data

One potential source of differences between impact estimates from survey and UI earnings data is discrepant reporting. To see whether this was a problem, for each member of the survey respondent sample, employment reported on the survey at any time after random assignment was compared directly with UI earnings recorded during quarters 2-21 (years 1 to 5).(5) A second comparison was made for year 5. The results are shown in Table H.1.

Appendix Table H.1
Percentage of Survey Sample Members Having Earnings on Survey or UI Records, but Not on Both

Site                     Five-Year Survey Only (%)    UI Earnings Data Only (%)

Years 1 to 5

Atlanta                            5.9                          6.2
Grand Rapids                       5.9                          2.4
Portland                          12.9                          1.5
Riverside                         15.3                          2.1

Year 5

Atlanta                            8.5                         10.0
Grand Rapids                      11.1                          8.7
Portland                          17.4                          9.5
Riverside                         18.5                          5.7

SOURCE:  MDRC calculations from the Five-Year Client Survey and unemployment insurance (UI) records.
NOTES:  See Appendix A.2

For each comparison, a match occurred if both sources recorded earnings during at least one quarter of follow-up or if neither source recorded earnings during any quarter. When program and control group members were considered together, match rates over five years ranged from 82.6 percent in Riverside to 91.7 percent in Grand Rapids. As shown in Table H.1, the pattern of discrepancies differed by site. In Atlanta and Grand Rapids, relatively few mismatches occurred: in Atlanta, neither source recorded a higher incidence of employment, and in Grand Rapids the incidence of employment was only slightly higher when recorded with survey data. Discrepancies in reported employment occurred most often in Portland and Riverside, and in those sites nearly all mismatches resulted from employment that was recorded only on the survey.
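
To make the matching rule concrete, the following short sketch expresses the classification in Python. It is an illustration only, not MDRC's actual processing code, and the variable names (employed_on_survey, any_ui_earnings) are placeholders.

    # Classify one respondent's follow-up employment reporting (illustrative).
    def classify(employed_on_survey: bool, any_ui_earnings: bool) -> str:
        if employed_on_survey and any_ui_earnings:
            return "both"          # match: employment recorded by both sources
        if not employed_on_survey and not any_ui_earnings:
            return "neither"       # match: no employment recorded by either source
        if employed_on_survey:
            return "survey only"   # mismatch: Table H.1, first column
        return "ui only"           # mismatch: Table H.1, second column

    # The match rate is the share of respondents classified as "both" or "neither".
    def match_rate(respondents) -> float:
        matches = sum(1 for s, u in respondents
                      if classify(s, u) in ("both", "neither"))
        return 100.0 * matches / len(respondents)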

Results for year 5 show a similar pattern, except that the overall match rates were lower in every site, ranging from 73.1 percent in Portland to 81.5 percent in Atlanta. As with the five-year employment measure, in Atlanta and Grand Rapids about the same percentage of respondents had employment recorded only on UI earnings records as had employment recorded only from their survey responses. In Portland and, especially, Riverside, however, survey responses indicated considerably higher rates of employment than did UI records.

III. Observed Patterns of Differences Between Survey and UI Earnings Impacts

Table H.2 compares the incidence of employment in years 1 to 5 for program and control group survey respondents in each program, as well as program impacts, estimated from UI earnings records (rows labeled "Records impact") and from survey responses (rows labeled "Survey impact"). A comparison of these two rows for each program highlights the difference between estimates from survey and UI earnings data.

Appendix Table H.2
Comparison of Impact Estimates from Survey and UI Earnings

Site and Program

Program Group (%)   Control Group (%)   Difference (Impact, Percentage Points)   Percentage Change (%)

Ever employed in years 1 to 5

Atlanta Labor Force Attachment

Records impact: survey sample 84.3 84.6 -0.3 -0.4
Survey impact: survey sample 83.3 82.9 0.4 0.4

Atlanta Human Capital Development

Records impact: survey sample 83.4 84.6 -1.2 -1.4
Survey impact: survey sample 81.2 82.9 -1.7 -2.0

Grand Rapids Labor Force Attachment

Records impact: survey sample 90.9 88.4 2.5 2.8
Survey impact: survey sample 94.7 91.9 2.9* 3.1

Grand Rapids Human Capital Development

Records impact: survey sample 90.3 88.4 1.9 2.1
Survey impact: survey sample 93.6 91.9 1.7 1.9

Riverside Labor Force Attachment

Records impact: survey sample 74.3 65.0 9.3*** 14.3
Survey impact: survey sample 85.7 80.8 4.9** 6.0

Riverside Human Capital Development

Records impact: survey sample 69.3 60.0 9.2*** 15.4
Survey impact: survey sample 82.8 74.2 8.6*** 11.6

Portland

Records impact: survey sample 84.6 84.0 0.5 0.6
Survey impact: survey sample 93.5 93.3 0.2 0.2

SOURCE:  MDRC calculations from the Five-Year Client Survey and unemployment insurance (UI) earnings records.
NOTES: See Appendix A.
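
The "Difference (Impact)" and "Percentage Change" columns in Tables H.2 and H.3 are consistent with the usual convention: the impact is the program group percentage minus the control group percentage (in percentage points), and the percentage change expresses that impact as a percentage of the control group value. The sketch below is illustrative rather than a reproduction of MDRC's calculations; because the table entries are rounded, recomputed figures can differ slightly in the last digit.

    # Illustrative arithmetic for the impact columns.
    def impact_and_pct_change(program_mean: float, control_mean: float):
        impact = program_mean - control_mean           # in percentage points
        pct_change = 100.0 * impact / control_mean     # relative to the control group
        return impact, pct_change

    # Grand Rapids LFA, records impact (Table H.2):
    # 90.9 - 88.4 = 2.5 percentage points, and 100 * 2.5 / 88.4 = 2.8 percent.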

From the standpoint of consistency, the preferred result is for both sources to record the same information for each person. This result occurred in Atlanta, where survey responses and UI records captured the same employment levels for LFAs, HCDs, and control group members, as well as similar impacts. Results for Grand Rapids were nearly as consistent: for all three research groups, survey responses showed employment levels 3 to 4 percentage points higher than those recorded with UI earnings data, but impacts for LFAs and HCDs were similar when calculated with either data source. The next best result occurs when program and control group members have similar rates of discrepant reporting, because impacts estimated from UI earnings and survey data will then be similar. This situation is illustrated by the results for Riverside HCD and Portland: as shown in Table H.2, the survey recorded higher employment levels than did UI earnings records (especially in Riverside), but the differences were consistent for program and control groups, leaving impact estimates nearly unchanged.

Variation in rates of discrepant reporting by research group is more problematic, because it affects impact estimates. As shown in Table H.2, this situation occurred only for Riverside LFA. In Riverside, the proportion of control group members who ever worked for pay was nearly 16 percentage points higher when measured with survey responses than with UI earnings data. LFAs also reported a higher incidence of employment on the survey, but the discrepancy was not as great (about 11 percentage points). As a result, the survey-based impact on employment was more than 4 percentage points smaller than the UI earnings impact for the survey respondent sample. Nonetheless, both sources record a statistically significant impact on employment for Riverside LFA.(6)
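
The Riverside LFA figures in Table H.2 make this arithmetic concrete. The sketch below simply rearranges the rounded table entries; it is an illustration of the point above, not an additional analysis.

    # Riverside LFA, ever employed in years 1 to 5 (rounded values from Table H.2).
    ui_program, ui_control = 74.3, 65.0    # UI earnings records
    sv_program, sv_control = 85.7, 80.8    # five-year survey

    ui_impact = ui_program - ui_control    # 9.3 percentage points
    sv_impact = sv_program - sv_control    # 4.9 percentage points

    # Survey-UI gap: 15.8 points for the control group but only 11.4 points for
    # the program group, so the survey-based impact is about 4.4 points smaller
    # than the UI-based impact (9.3 - 4.9 = 4.4).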

The results for year 5 are more problematic, in that most programs show at least a small positive difference in employment when calculated with survey data but not when calculated with UI earnings records (Table H.3). The reasons for this discrepancy differ by site and program. In general, program group members were more likely than control group members to have higher levels of employment recorded on the survey and, correspondingly, less likely to have higher levels of employment recorded in UI earnings records. The difference is most extreme in Portland, where the program-control group difference changes from an employment loss when calculated with UI records to a gain when calculated with survey data, although neither difference was statistically significant.(7)

Appendix Table H.3
Comparison of Impact Estimates from Survey and UI Earnings Data for Employment in Year 5

Site and Program

Program Group (%)   Control Group (%)   Difference (Impact, Percentage Points)   Percentage Change (%)

Ever employed in year 5

Atlanta Labor Force Attachment

Records impact: survey sample 68.1 69.5 -1.3 -1.9
Survey impact: survey sample 69.0 65.2 3.7 5.7

Atlanta Human Capital Development

Records impact: survey sample 69.2 69.5 -0.3 -0.5
Survey impact: survey sample 64.8 65.2 -0.4 -0.6

Grand Rapids Labor Force Attachment

Records impact: survey sample 74.7 74.6 0.1 0.1
Survey impact: survey sample 79.5 75.7 3.8 5.0

Grand Rapids Human Capital Development

Records impact: survey sample 72.0 74.6 -2.6 -3.5
Survey impact: survey sample 77.7 75.7 2.1 2.7

Riverside Labor Force Attachment

Records impact: survey sample 49.1 45.7 3.4 7.5
Survey impact: survey sample 63.8 57.6 6.2** 10.8

Riverside Human Capital Development

Records impact: survey sample 47.8 41.7 6.1* 14.7
Survey impact: survey sample 58.0 49.2 8.8** 17.8

Portland

Records impact: survey sample 57.6 64.2 -6.7 -10.4
Survey impact: survey sample 60.9 56.0 4.9 8.7
SOURCE:  MDRC calculations from the Five-Year Client Survey and unemployment insurance (UI) earnings records.
NOTES: See Appendix A.

Endnotes

1.  In addition, sample members' Social Security numbers are needed to match to UI earnings records. Some Social Security numbers were reported or recorded incorrectly at random assignment. Sometimes an incorrect number does not match any UI earnings records; other times it matches the records of another person.

2.  In addition, 434 respondents to the Five-Year Client Survey were not interviewed after two years.

3.  As noted in the report, UI data are recorded quarterly. Quarter 1 may include earnings from before random assignment and is therefore excluded from the analysis of program impacts. Quarters 2-21 correspond to months 2 to 61, 3 to 62, or 4 to 63 after random assignment, depending on whether sample members were randomly assigned during the first, second, or third month of a calendar quarter.

4.  Year 5 as measured with UI earnings records includes quarters 18-21, which correspond to months 50 to 61, 51 to 62, or 52 to 63, depending on the respondent's month of random assignment.

5.  Some sample members were interviewed after the follow-up period for UI earnings and were excluded from this comparison.

6.  The survey results for Riverside show that joblessness was not as pervasive as indicated by UI data. Interestingly, in all four sites, respondents' most recent jobs that were reported only on the survey provided, on average, fewer hours of work per week and lower hourly pay, and were much less likely to provide medical coverage, than jobs that were reported in both sources. This nonexperimental finding suggests that sample members in Riverside relied more on self-employment, household employment, and service jobs with small employers than did sample members in the other sites. This result is still consistent with the overall characterization of Riverside as having a weaker labor market than the other sites in the evaluation.

7.  The p-value of the UI earnings impact was .13.

