
Wisconsin’s Self-Sufficiency First/Pay for Performance Program:

Results and Lessons from a Social Experiment

 

 

 

 

 

 

Maria Cancian, Thomas Kaplan, and Ingrid Rothe

with the assistance of Hwa-Ok Park and Dan Ross

Institute for Research on Poverty

University of Wisconsin–Madison

 

 

 

 

 

 

March 2000

 

 

 

 

 

This report was prepared under a contract between the Wisconsin Department of Workforce Development and the Institute for Research on Poverty. The opinions expressed are those of the authors and do not reflect the views or policies of the sponsoring agencies.

I. INTRODUCTION

 

Wisconsin implemented its Self-Sufficiency First and Pay for Performance programs in March 1996. Self-Sufficiency First (SSF) was a diversion program requiring applicants for AFDC to complete an interview with a county staff member holding the title of "financial planning resource specialist." After the interview and within the 30-day AFDC application processing period, the applicant had to participate in the state JOBS program for at least 60 hours, including 30 hours of direct employer contact. The applicant could enter AFDC only after meeting these requirements.

Pay for Performance (PFP) was an intensive JOBS program requiring 20–40 hours per week of participation from AFDC case heads. In comparison to the regular JOBS program it replaced, PFP specified a minimum level of participation and changed the way in which penalties for failure to comply with its requirements were calculated. Noncompliance in the regular JOBS program could result in the removal of the noncomplying adult from the grant, which reduced the size of the grant by a limited amount. Under PFP, for each missed hour of JOBS participation, a penalty equal to the federal hourly minimum wage was imposed, first on the AFDC grant, until it was reduced to zero, and then on the Food Stamp benefit.
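The penalty calculation described above can be sketched as follows. This is a minimal illustration, not the actual CARES logic; the benefit amounts are hypothetical, and the federal minimum wage was $4.25 per hour when the program began in March 1996 (it rose to $4.75 in October 1996).

```python
def apply_pfp_penalty(afdc_grant, food_stamps, missed_hours, min_wage=4.25):
    """Dock each missed JOBS hour at the hourly minimum wage, first against
    the AFDC grant and then, once the grant is exhausted, against the
    Food Stamp benefit. All dollar amounts here are illustrative."""
    penalty = missed_hours * min_wage
    afdc_cut = min(afdc_grant, penalty)          # AFDC absorbs the penalty first
    fs_cut = min(food_stamps, penalty - afdc_cut)  # remainder hits Food Stamps
    return afdc_grant - afdc_cut, food_stamps - fs_cut

# A hypothetical $517 grant with 130 missed hours ($552.50 in penalties):
# the grant is reduced to zero and the remainder reduces Food Stamps.
print(apply_pfp_penalty(517.0, 300.0, 130))
```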

The Wisconsin Department of Health and Social Services (DHSS) originally conceived of SSF and PFP as separate programs and submitted separate waiver requests to the federal government for them. In its waiver authorization, the U.S. Department of Health and Human Services (DHHS) consolidated the requests into a single waiver package. Eventually, the programs came to be thought of as a single welfare reform program—known as SSF/PFP—for purposes of both management and evaluation.

The experimental evaluation called for under the "Terms and Conditions" of the federal waiver operated in four counties (Dane, Dodge, Jefferson, and Waukesha). In March 1996, at the time of SSF/PFP implementation, the four counties together contained an AFDC caseload of 4,081, which represented 6.5 percent of the total Wisconsin caseload of 62,888. Most of these cases, 2,722, were in Dane County; 933 were in Waukesha, 277 in Dodge, and 149 in Jefferson.

A total of 1,500 randomly selected active AFDC cases in these four counties were assigned to a control group, the members of which continued in the AFDC program. All remaining AFDC cases active on that date in the four counties were placed in the PFP program, and 1,500 of these cases were randomly identified as the experimental treatment group, to be followed as part of the demonstration. Starting on March 1, new applicants for AFDC in the four experimental counties were randomly assigned either to the SSF program or to a control group, whose members could apply for AFDC without participating in the SSF program and were not subject to the requirements of PFP. This random assignment of new applicants was to continue until 1,500 cases were placed in the control group and 1,500 had been assigned to an SSF/PFP treatment group whose experiences were monitored. After reaching these numbers, all remaining applicants would become unstudied participants in the SSF/PFP program.

Using its customary bid solicitation and selection process, DHSS selected MAXIMUS to conduct the evaluation of SSF/PFP. The evaluation contract became effective on July 1, 1996, and was to continue through 2001. It was to include an impact study, a process study, and a cost-benefit study. However, in November 1996 DHSS notified MAXIMUS that the evaluation contract would be terminated on December 31, 1996. At the time of the notification, testing of data from CARES (the primary source of data for the evaluation, described in more detail below) was incomplete, and no CARES data were supplied to the evaluator. MAXIMUS conducted a half-day site visit to Dane County in November and submitted its only report based on that visit.

In July 1997 the Wisconsin Department of Workforce Development (DWD), the agency responsible for the operation of AFDC and TANF (Wisconsin Works) programs, received funding from DHHS to analyze selected impacts of the SSF/PFP program. Owing in part to concerns raised by county staff that difficulties with CARES and differences in local practice may have affected the state’s ability to evaluate the SSF/PFP experiment, DWD contracted with the Institute for Research on Poverty (IRP) at the University of Wisconsin–Madison to conduct a field reconnaissance of the SSF/PFP experiment in Wisconsin.

The field reconnaissance report is attached as Appendix 1. While that report revealed several difficulties (also discussed in this report) during the implementation process, the evidence did not suggest that the evaluation effort should be abandoned. In July 1998 DWD contracted with IRP to conduct an impact analysis based on data collected in CARES from March 1996 through April 1997 (May 1997 was eliminated from the analysis because members of the control group were converting to SSF/PFP requirements during that month). Initial evaluation plans called for a DWD programmer/analyst to provide IRP with extract files from CARES which would permit the appropriate analysis. Almost immediately, extensive data problems (described below) in the extract files were identified; eventually IRP researchers used a combination of files developed by IRP programmers from a more standard CARES extract and by DWD analysts for this report.

The next section of this report identifies the data difficulties and limitations that IRP encountered. This is followed by a section which examines selected outcomes for AFDC participants who moved to PFP on March 1, 1996. (Limited data on SSF/PFP entrants after March 1, 1996, are presented in Appendix 2, owing to our lower level of confidence in these data.) The last section provides a summary and conclusion and offers a recommendation concerning future large-scale efforts to operate controlled experiments and monitor their impacts with operational data systems.

 

 

II. DATA LIMITATIONS

 

Sources of Data Difficulties

Four sources of problems, all of which raise questions concerning the validity of the experimental results, became apparent during the data analysis.

1. DWD policy decisions on size of the control group and early termination. The size of the control group received much discussion as the experiment was being implemented. State and local officials wanted a control group large enough to provide adequate statistical power but not so large as to deprive many people of program reforms that might be helpful or that could place an experimental county at a disadvantage in meeting state or federal performance targets. State officials were also concerned that some local officials who had opposed SSF/PFP might use a control group to evade true implementation of SSF/PFP. Interest in minimizing the size of the control group contributed to acceptance of the federal consolidation of SSF and PFP and their evaluation as a single program.

The decision to end SSF/PFP in May 1997, almost three years short of the originally intended four-year duration, apparently stemmed from a desire to prepare for Wisconsin Works (W-2), the successor to AFDC in the state. While that may have been a sensible decision (considering the information systems difficulties discussed below, which pertain as much to W-2 as to SSF/PFP), the premature end of SSF/PFP meant that the evaluation design had to be readjusted in mid-course and that participants had a shorter exposure to the requirements of PFP. In addition, at no time during the operation of SSF/PFP were all of the information systems issues resolved; had the program run longer, many of these problems would perhaps have been lessened.

2. Information systems design and implementation problems. The period of SSF/PFP operation (March 1996 through April 1997) occurred during rapid transition in Wisconsin’s efforts to experiment with and implement changes in the way assistance is delivered to low-income families with dependent children. Beginning in the late 1980s, the state implemented a variety of waiver programs, including LearnFare (relating to school attendance by dependent children), Two-Tier Benefits (creating benefit differences between longer-term residents and new arrivals to the state), the AFDC Benefit Cap (eliminating grant increases for children born while a family was on AFDC), Parental and Family Responsibility (addressing the role of the nonresident parent), and Work Not Welfare (focusing on work in exchange for assistance, with time limits).

In 1993, the state also began to convert its automated case management systems for income maintenance from the Computer Reporting Network (CRN) to the Client Assistance and Re-employment System (CARES) and to incorporate and expand the JOBS data collection system (WDS/WPRS) into CARES. This information technology transition, which occurred in 1994–95, required extensive effort and, during a transitional period, actually reduced the level and quality of management information available on welfare in Wisconsin.

Reduction in the quality of management information occurred because, by the early 1990s, the state had developed staff expertise in the transformation of case data in CRN to management information on the dynamics and demographics of the AFDC, Food Stamp, and Medicaid caseloads. Researchers and analysts at DHSS had developed their own sets of research files which they updated, often on a monthly basis, from the CRN system. With the transition from CRN to CARES, the existing file and database infrastructure needed for management tracking and analysis was rendered obsolete except for historical analysis; work to build a new management information infrastructure had to begin anew. As the state built a new on-line system to determine eligibility, to track status and compliance, to make and track client and vendor payments, to maintain audit information, and to provide workers with information and alerts concerning needed data entry, it also had to develop a management information system that could provide routine reports and permit specialized analyses in response to management questions. The state also had to train local staff on the system changes, including the use of new screens in the on-line systems, new directions for navigating among screens, new rules for specifying which screen should be used for a particular transaction, and new instructions on how to interpret the planned reports.

Wisconsin contracted with Deloitte and Touche (D&T) to build and maintain the CARES system, including reprogramming it, when necessary, to reflect changing policies and specifications. The timetable for the original construction of CARES, before it was placed in operation, did not allow time for D&T to fully integrate the AFDC eligibility determination system with the JOBS system. As long as AFDC operated under something close to its traditional policies, this was not a large problem. The AFDC, Food Stamp, and Medicaid parts of CARES involved determination of eligibility and payment of benefits; links to performance in JOBS programs could be handled through the development of limited automated switches and reliance on manual information sharing between JOBS and AFDC staff.

The need for system linkages between JOBS and AFDC intensified under SSF/PFP, however, owing to its requirements that participants engage in particular work activities to gain and retain eligibility for AFDC. Unlike the traditional JOBS program, benefit levels for PFP participants differed according to the precise number of hours they were in and out of compliance with their program requirements. Because full integration of the JOBS and AFDC portions of CARES had not occurred when SSF/PFP began, manual interventions by county workers in the operation of CARES were necessary.

Not only was the system integration between the AFDC and JOBS components not complete when SSF/PFP began, but no management information system had yet been developed. The expertise of D&T was in developing and operating on-line transaction processing (OLTP) systems, not in creating the infrastructure needed for management monitoring and analysis (one form of which is known as an on-line analytical processing, or OLAP, system). Because key state staff with experience in the CRN management information infrastructure had accepted employment elsewhere, DWD also had little experience in developing a data analysis infrastructure.

In initial discussions to plan an analysis infrastructure that would allow an evaluation of SSF/PFP, the two organizations—DWD and D&T—decided to develop a series of extracts designed to capture critical data on all SSF applicants and PFP participants. However, neither the extracts nor the reports generated from the extracts were ever tested for reliability during the systems development period, when CARES programming was being tested using AFDC cases constructed to facilitate testing. Even by November 1996, when MAXIMUS received notification that its evaluation contract would terminate, no tested extracts or reports on SSF/PFP were as yet available. With the termination of the contract, DWD and D&T staff working on the development of the extracts were reassigned to other projects deemed to be of higher priority. The SSF/PFP extract testing process was never fully completed.

3. Evaluation design issues. During the development of an OLTP system, many decisions are made which may have unintended consequences for a planned evaluation or future analysis. One decision that would long plague analysts was the choice of the "check" digit as the basis for random assignment. The check digit was the second-to-last digit in the Request-for-Assistance number generated when a person applied, and was used as a quick check for the accuracy of the other numbers; that is, certain combinations of earlier digits were impossible given a particular check digit.

D&T and DWD staff believed that assignment to control and experimental status in the research counties was evenly divided. However, because D&T chose the check digit, rather than some other digit, of the Request-for-Assistance number as the basis for assignment to control or experimental status, an individual selected to be in the study sample was approximately 10 percent more likely to be designated as a member of the control than of the experimental group. This selection modestly conflicted with the policy of keeping the control group small, but the implications were not recognized until after the experiment had ended, when it became apparent that the control and experimental groups were of unequal size. The assignment was still random, so far as we can now determine, but considerable analytical effort was required to identify the (primary) reason for the unequal sample sizes.
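The mechanism by which a check digit can skew group sizes can be illustrated with a hypothetical example. The actual CARES check-digit algorithm is not documented in this report; the sketch below uses an invented mod-11 scheme (as in ISBN-10-style checksums), in which 11 possible remainders must be folded into 10 digit values, so one digit occurs roughly twice as often as the others. Splitting the digits 0-9 into two sets of five then yields an uneven assignment on the order of the imbalance described above.

```python
from collections import Counter

def mod11_check_digit(n: int) -> int:
    """Hypothetical check digit: weighted digit sum mod 11,
    with the remainder 10 folded into digit 0."""
    digits = [int(c) for c in f"{n:08d}"]
    remainder = sum((i + 1) * d for i, d in enumerate(digits)) % 11
    return 0 if remainder == 10 else remainder

# Tabulate check digits over 100,000 sequential (hypothetical) request numbers.
counts = Counter(mod11_check_digit(n) for n in range(100_000))

# Assign digits 0-4 to "control" and 5-9 to "experimental" (also hypothetical).
control_share = sum(counts[d] for d in {0, 1, 2, 3, 4}) / sum(counts.values())
print(f"share assigned to the control digit set: {control_share:.3f}")
# close to 6/11 (about 0.545) rather than the intended 0.500, because
# digit 0 absorbs two of the eleven remainders
```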

4. Inconsistent practices in data entry and in the treatment of cases assigned to control and experimental groups. Recent analysis of SSF/PFP data has revealed many examples of missing or "impossible" data entry. Local staff appeared to struggle with the new SSF/PFP screens and the technical requirements for moving from one screen to the next, or "driver flow." Had management reports been available earlier, some of these instances of missed or misleading data entry would have become immediately obvious. Without these reports, no mechanism was available to evaluate whether workers understood CARES and were using it correctly for SSF/PFP. Near the end of the program period, a few state staff became adept at identifying data problems, but their skills were not institutionalized to produce a systematic review of SSF/PFP data accuracy within CARES.

State SSF/PFP policy required the collection in CARES of basic data on every individual who made a personal request for assistance. This information (name, address, and social security number were the minimal requirements) was enough to generate a Request-for-Assistance number in CARES, from which a person's status as control, experimental, or nonexperimental could be determined. Some DWD officials believe that not all local workers followed this practice, but instead waited until the applicant was "serious" before making CARES entries. This would reduce the number of individuals in CARES who could be identified as having been "diverted" and might, depending on how county staff characterized the SSF/PFP program, have systematically affected the kinds of people who entered the program and became a part of the experiment, although it should not have affected differences between the treatment and control groups who did enter the program.

Local practices with the most potential for affecting the SSF/PFP evaluation involved CARES "work-arounds," or manual interventions in system operation. Because CARES had not been fully developed when SSF/PFP began, a variety of work-arounds were necessary for local workers to use the system to carry out intended policies and procedures. As particular problems became apparent, DWD developed and disseminated some of the necessary procedures. During the first 10 months of program operation, DWD disseminated information on a variety of work-arounds. Most of the problems occurred in cases which had been recently closed or cases in which a new individual was added after the case had already been opened. This frequently had the effect of resetting SSF requirements for all members of the case, even though the primary adult might already have satisfied the requirements. Other shortcuts were developed by local workers who found ways to circumvent the CARES driver flow when they found it too cumbersome. These were generally not shared with other counties and did not become standard practice.

One of the state-directed work-arounds with special implications for the evaluation of SSF/PFP concerned instructions to manually override the SSF requirements for certain applicants whom CARES erroneously assigned to active SSF status. These individuals (women with children younger than 1 year, persons caring for a child receiving SSI, persons over age 60 or under age 16, persons assigned to a tribal JOBS office, and a few other standard JOBS exemption categories) were exempt from SSF requirements. On the basis of the instructions distributed by the state, many local staff made these changes only for cases in the experimental group, presumably since those in the control group were, by definition, exempt from the SSF requirements. This work-around had the unintended consequence of deleting these individuals from the SSF extract. Because this work-around was disproportionately applied to adults in the experimental group, it contributed to the unequal numbers of control and experimental cases that were observed in the research files once analysis had begun.

State officials also believe that once local workers learned how to release individuals from SSF requirements, they continued to do so even after CARES was correctly programmed and no longer required manual intervention. Some of these manual releases—which were again applied more often to the experimental group than to the controls, who were not subject to SSF/PFP requirements—were apparently for cases which would not have been released under a correctly programmed CARES system. That is, an analysis of the data suggests that local workers may have manually released from SSF requirements individuals for whom the requirements were deemed "unfair." Some evidence of the continued use of this work-around even after the original CARES problem was repaired can be inferred from the number of times that workers set the manual override switch in a way that exempted applicants from all SSF/PFP requirements. During April through July 1996, the number of individuals assigned to experimental status for whom this override switch was set averaged 205 per month; for individuals assigned to the control group the monthly average was 3. The problem in CARES that necessitated this work-around was repaired in August 1996. From September 1996 through March 1997, the number of individuals assigned to the experimental group for whom this override switch was set averaged 102 per month; for those assigned to the control group the monthly average was 2. It appears that local workers continued to use the work-around through the remainder of the experiment. The precise characteristics of the cases they exempted are unclear; presumably the additional exempted cases were those whom local workers thought, for whatever reason, to be inappropriate for the SSF/PFP program.

Using the numbers of individuals for whom the override switch was set is not, however, an accurate measure of those omitted from the SSF/PFP extract, because of the frequent appearance of duplicates in the exempted group. Duplication appears to have occurred when workers "practiced" entering new applications, when household composition changed and workers added new members to the group, and when individuals reapplied for assistance and were again subject to program requirements. Controlling for this, we estimate that 388 of the individuals assigned to the experimental group, and 10 of those assigned to the control group, were omitted from the extracts.

Incorporating both the missing individuals and the impact of the use of the check digit for assignment, we would expect to find 3,143 individuals assigned to SSF experimental status and 3,690 to control status. This is reasonably close to our constructed totals: 3,176 in the experimental group and 3,657 in the control group. We believe we have correctly identified most of the individuals who were assigned to experimental or control status.

Some other local practices also affected the data in ways we cannot quantify, although their impacts on the estimates of experimental impact are likely to have been modest. As noted in the field reconnaissance report (Appendix 1), the CARES system was occasionally down during business hours, owing to the need to reprogram the system for frequent policy changes. Because CARES performed the random assignment, in these instances applicants could not be assigned to experimental or control status. Without state instructions on how to handle such situations, counties used various coping strategies: some asked the applicant to return the next day, when the computer would be working; others told all applicants they would have to meet SSF requirements, believing that they could then provide the "good" news of exemption from the requirements for those later assigned to control status. However, it is possible that some of those "incorrectly" told that they had to comply with SSF may never have returned for further instructions; that is, they may have been diverted, whereas they might have returned if they had been told of their correct, control-group assignment.

In addition, for whatever reasons, there is some evidence that at the outset of the program local workers did not fully understand SSF/PFP and how cases should be handled in CARES. For example, the data extracts indicate some data entry "errors" in CARES, such as multiple—almost identical—cases built for individuals and their family members, all with the same start date. This problem at the least requires the analyst to decide what constitutes the "real" application or case.

Estimating the Magnitude of the Data Problems

IRP researchers began the analysis of SSF/PFP data well after the program had ended, and the sources of data problems described above became apparent only gradually. We have tried to determine the precise impact of the problems on data findings and have been only partially successful. After considerable analysis, we have concluded that the use of the check digit and the resulting inequality in sample sizes did not affect the utility of the experimental data for constructing an unbiased estimate of program impacts. We also conclude that the brevity of the experiment is likely to have reduced overall experimental effects, but should not by itself have affected differences between the treatment and control groups. Although the lack of testing of the SSF/PFP extracts was unfortunate, we believe we have circumvented this problem for a limited number of variables by drawing our data from another CARES extract that is frequently used for management information analysis and has been carefully tested. The SSF/PFP extracts contain more case details and would have been a desirable data source if they had been fully developed and tested, but we believe the other extract provides an adequate data source.

The data that are obviously wrong or missing or which clearly reflect duplicate entry can generally be omitted from the analysis. We have made decisions about which data to include and exclude (see Appendix 3), which we believe are appropriate. Appendix 3 describes in outline form how the research files were built, on the basis of key decisions, including: (1) decisions about when to exclude an apparent duplicate and when to exclude both duplicates; and (2) decisions about problems created by bad dates.
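Decision rules of this kind can be sketched as follows. This is a hypothetical illustration only; the actual rules are those documented in Appendix 3, and the records below are invented. The sketch keeps the earliest application per person within the program window and drops rows with out-of-range dates.

```python
from datetime import date

# Invented (person_id, application_date) records for illustration.
rows = [
    ("A1", date(1996, 3, 4)),
    ("A1", date(1996, 3, 4)),    # exact duplicate: collapses to one record
    ("B2", date(1995, 12, 30)),  # before program start: excluded as a bad date
    ("C3", date(1996, 5, 2)),
]

# Program window used in the analysis (March 1996 through April 1997).
START, END = date(1996, 3, 1), date(1997, 4, 30)

# Drop rows with dates outside the window, then keep the earliest
# remaining application per person.
valid = [(person, d) for person, d in rows if START <= d <= END]
earliest = {}
for person, d in valid:
    if person not in earliest or d < earliest[person]:
        earliest[person] = d

print(sorted(earliest.items()))  # one row each for A1 and C3; B2 is dropped
```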

The problem we can neither measure nor control for with precision is the apparent inconsistency of data entry and treatment of SSF applicants assigned to the control and experimental groups in the four research counties. If local staff used their "work-around" capabilities to remove from the experimental group those judged least likely to succeed in the work force and not (or to a lesser extent) from the control group, then measured labor force differences in the experiment between the treatment and control groups are likely to bias upward the estimates of actual impacts, but by an amount we cannot estimate because we have no way of reconstructing the actual exemption behavior of local staff.

 

 

III. IMPACT ANALYSIS OF PAY FOR PERFORMANCE

 

Data and Methods

The purpose of random assignment to treatment and control groups is to allow an unbiased estimate of net program impacts. In theory, random assignment produces treatment and control groups with initially similar observed and unobserved characteristics. Thus, differences in mean outcomes after treatment may be appropriately interpreted as program effects. As the previous section indicates, we believe that most evidence suggests that the assignment to experimental and control groups was random. We do not believe, however, that the actual placements of individuals in the program were random. Because county staff overrode assignments to SSF in a substantial number of cases, there may be systematic differences in the characteristics of individuals who ultimately participated in each group.

Our data analysis and review of implementation information suggest that work-arounds were a less serious issue among AFDC cases directly assigned to PFP on March 1, 1996. The state had no plans to design a data system that would automatically exempt people with particular characteristics from the requirements of PFP, probably because most people with those characteristics would have been exempted from both SSF and PFP during the process of assignment to SSF. JOBS workers did have the flexibility to exempt people on a case-by-case basis from PFP, but they appear to have used that flexibility with restraint. Table 1 shows the percentage of people who entered PFP and were subsequently exempted. With the exception of the CA reason (caring for an infant under 12 weeks), the percentages were consistent over the life of the experiment. It is possible that informal exemptions occurred—that, for example, workers assigned an individual to JOBS but did not actually require participation. Our discussions with JOBS workers in Wisconsin suggest, however, that most workers viewed AFDC participants as capable of meeting the terms of JOBS assignments and generally held the participants to those assignments.

Table 1 here

Table 2 shows the demographic characteristics of the control, experimental, and nonexperimental groups in the research counties. The final two columns show the characteristics of those in all nonresearch counties, and in all nonresearch counties except Milwaukee (which contains about half the cases). Overall, there is little evidence of statistically significant differences between the control and experimental groups. Experimental group members are significantly more likely to have two children, and are marginally more likely to have a person in the case receive SSI. There are substantial differences between the experimental and nonexperimental groups, but as Table 3 indicates, these are almost entirely attributable to the overwhelming representation of Dane County participants in the nonexperimental group. There also appear to be substantial differences between the research counties and nonresearch counties. Some of these differences, however, are attributable to distinctions between Milwaukee and the balance of the state, as can be seen by comparing the control and experimental group characteristics with those of nonresearch counties other than Milwaukee (last column).

Table 3 shows the same demographic characteristics for control, experimental, and nonexperimental cases in each of the research counties. As was the case for all control and experimental cases combined (Table 2), there are few statistically significant demographic differences between the control and experimental groups. Table 3 also shows fewer differences between experimental and nonexperimental groups than appeared when the four counties were considered together in Table 2; most of the differences in Table 2 resulted from the disproportionate representation of Dane County participants.

Table 2 here

Table 3 here

Note that the first row of Table 3 shows the number of cases assigned to control and experimental groups. As discussed above, we would expect 55 percent of cases to be assigned to the control group, and 45 percent to the experimental group. To test whether this assignment rate actually occurred, we included 64 additional cases originally assigned to the control group but later excluded from the analysis (generally because another adult with a different experimental/control/nonexperimental group assignment moved into the household during the experiment). Adding these cases (29 for Dane County, 2 for Dodge County, 9 for Jefferson County, and 23 from Waukesha), we found that the observed distribution of cases is significantly different from the expected assignment rate (p<.05) only in Dane County, where approximately equal numbers of cases were assigned to each group (476 control, 480 experimental, once we add the excluded cases).
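A test of this kind can be sketched with a normal approximation to the binomial distribution. The report does not specify which test was used; the sketch below applies a two-sided z-test to the Dane County counts (476 control out of 956, against an expected control share of 55 percent).

```python
import math

def two_sided_p(successes: int, n: int, p: float) -> float:
    """Two-sided p-value for observing `successes` out of `n` trials
    when the expected success probability is `p`, using the binomial
    normal approximation (a sketch, not an exact binomial test)."""
    mean = n * p
    sd = math.sqrt(n * p * (1 - p))
    z = (successes - mean) / sd
    return math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability

# Dane County: 476 control cases out of 956, expected share 0.55.
p_dane = two_sided_p(476, 476 + 480, 0.55)
print(f"Dane County: p = {p_dane:.4f}")  # well below .05
```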

We do not have a satisfactory explanation for the lower assignment rate for controls in Dane County. Because the random assignment process was originally conceptualized as producing an equal number of control and experimental participants, it is possible that the implementation of random assignment in Dane County was set to continue until that occurred, although we have found no evidence that the random assignment process worked differently in Dane County. It is also possible that a nonrandom group of individuals was excluded from the control group, though we have not found evidence beyond the size of each group to support this. Indeed, Table 3 suggests that the experimental and control groups in Dane County had quite similar characteristics. Moreover, we would expect that any possible exercise of local staff discretion in the assignment process would lead in the direction of increasing, rather than decreasing, the size of the control group.

In general, then, we believe that random assignment among participants in AFDC on March 1, 1996, who moved directly into PFP was largely successful and that differences in outcomes between experimental and control groups may be interpreted as effects of the PFP program. In part because we are somewhat concerned about the apparent lower level of assignment to the control group in Dane County, much of the analysis that follows shows outcomes for experimental and control groups by county, allowing us to distinguish effects in Dane County from those in the other counties. We discuss results of the PFP analysis in the following section. In the case of individuals entering SSF/PFP after March 1, we are less confident of the validity of the experiment. Because differences between experimental and control groups are not clearly interpretable, we review outcomes for SSF/PFP in Appendix 2.

 

Results

In this section we review outcomes for individuals already participating in the AFDC program on March 1, 1996. This "stock" of cases in the four research counties was assigned either to the control group (not subject to PFP requirements) or to the experimental or nonexperimental groups (subject to PFP). We focus on four outcomes: AFDC receipt, Food Stamp receipt, earnings, and total measured income (the combination of AFDC, Food Stamps, and earnings). Although the experiment ended in May 1997, 14 months after initial assignment to treatment and control groups, we consider outcomes over six quarters (18 months) starting with the second quarter of 1996 (April-June), owing to the possibility that effects on earnings or welfare program participation could have continued after the experiment. We stopped following outcomes at approximately the point at which the W-2 program began.

AFDC Participation and Benefits

Table 4 reports AFDC exit rates in the quarters following assignment to PFP experimental or control groups. Since PFP began in the last month of a quarter (in March 1996), the first quarter of 1996 refers only to March. The second quarter is April-June; the third quarter is July-September; the fourth quarter is October-December. The basic structure of this analysis is to follow those who were assigned to PFP in March 1996 through the end of the third quarter of 1997, when W-2 began. The first column shows that 13 percent of participants assigned to the control group left AFDC by the end of the first quarter (that is, in Quarter 2, 1996), and a total of 29 percent (13.17 + 15.36) had left by the end of the second quarter. By the end of the third quarter of 1997 (the last before implementation of W-2), about

Table 4 here

two-thirds of cases in the control group had left AFDC. Among the 33 percent of cases remaining on the rolls at that point, somewhat less than half (14 percent of all cases) had left the program at some point but returned before September 1997, the end of the observation period.
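The cumulative exit figures discussed here are running sums of the quarterly exit rates in Table 4. A minimal sketch: the first two quarterly rates below are the ones reported in the text (13.17 and 15.36); the remaining values are illustrative placeholders, not figures from Table 4.

```python
from itertools import accumulate

# Quarterly AFDC exit rates for the control group, as a percentage of the
# original cohort leaving in each quarter. Only the first two values come
# from the text; the rest are placeholders for illustration.
quarterly_exit_pct = [13.17, 15.36, 10.0, 9.0, 8.0, 7.0]

# A running sum gives the share of the cohort that has left by each quarter's end.
cumulative_exit_pct = list(accumulate(quarterly_exit_pct))

for q, pct in enumerate(cumulative_exit_pct, start=1):
    print(f"left AFDC by end of quarter {q}: {pct:.2f}%")
```

The second entry reproduces the roughly 29 percent two-quarter cumulative exit rate cited above (13.17 + 15.36 = 28.53).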

When we compare the pattern of exits in the experimental group, we see that those assigned to participate in PFP appear to be more likely to leave. This comparison of exit rates suggests that PFP had the expected effect, encouraging more individuals to leave the program. This result is confirmed by a multivariate analysis of the probability of leaving AFDC by the end of the sixth quarter, which shows experimental status as having a significant (p<.001) positive effect on exit rates.

The patterns in the remaining columns of Table 4 are mixed. Because all AFDC participants in nonresearch counties were subject to PFP, we expect outcomes in those counties to be similar to those in the experimental group, except for differences related to county characteristics. In fact, nonresearch counties other than Milwaukee do show fairly similar exit patterns to those of experimental cases, though there is a significant difference in the total exit rate. On the other hand, we also expect "nonexperimental" participants in the research counties to have outcomes similar to the experimental group, yet the table shows that nonexperimental cases were substantially less likely to leave AFDC than experimental cases. A possible explanation is that both experimental and control group members received more attention than nonexperimental cases, especially in the early phases of implementation, when the greatest discrepancies in exit rates appeared. The same pattern appeared among those subject to the SSF experiment, lending credence to this explanation and to the possibility that the extra attention affected subsequent AFDC utilization, although during the field reconnaissance county staff denied that control group members received less attention.

Table 5 shows patterns of exit separately for each experimental county. Experimental cases were significantly more likely to leave than control cases only in Dane County, though there were also statistically insignificant differences in the expected direction in Jefferson and Waukesha Counties. In Dodge County, experimental cases were actually less likely to leave AFDC.

Under PFP, AFDC benefit amounts are tied to hours of participation. The policy change might therefore be expected to affect not only the proportion receiving benefits, but also the level of benefits among recipients. The first row of each panel in Table 6 shows mean monthly benefit levels among those receiving benefits in that quarter. There is no consistent pattern of differences in benefit amounts over the quarters, although in Quarter 3, 1996, controls received significantly lower benefits than experimentals. Thus, although experimental cases, subject to PFP, do appear less likely to participate in AFDC, there is no evidence that PFP leads to reduced benefit levels among participants. While Table 6 shows significant differences between experimental and nonexperimental cases, these again largely disappear when we consider each county separately. As shown in Table 7, a consistent difference between experimental and nonexperimental cases is apparent only in Waukesha County, where nonexperimental cases received significantly higher benefits than either control or experimental cases in five of the seven quarters.

Food Stamp Participation and Benefits

Of those who moved from AFDC to PFP on March 1, 1996, about 90 percent were receiving Food Stamps. Table 8 shows the pattern of exit from the Food Stamp program among those enrolled at

Table 5 here

Table 6 here

Table 7 here

Table 8 here

entry. Overall, somewhat fewer individuals left the Food Stamp program than the AFDC program over this period. In contrast to the results for AFDC, there was no significant difference in exit rates between the control and experimental groups in any quarter (though a smaller proportion of experimental cases never left the program). This result is confirmed by a multivariate analysis, which suggests no significant difference in the proportion of experimental and control group members who exit. Table 9 shows exit rates by county, and indicates that in Dane County (but not the other counties) experimental cases were more likely to exit (and less likely never to exit). Table 9 also shows that the differences between experimental and nonexperimental cases shown in Table 8 persist only in Waukesha County.

A comparison of Food Stamp benefit amounts similarly shows little evidence of an experimental impact. As indicated in Table 10, there is no consistent difference; the only significant one is higher benefits for experimental cases in the first quarter. While benefits for nonexperimental cases were consistently higher overall than those for control or experimental cases, there is no consistent significant difference when we consider each county separately, as in Table 11.

Earnings and Income

The goal of SSF/PFP was to increase self-sufficiency by discouraging the receipt of welfare and by encouraging employment. We now turn to effects on employment. Table 12 shows the percentage of cases with earnings, and mean and median earnings levels among earners, for the six quarters observed. In the control group, employment rates rose over time, from 46 percent in the first quarter to 60 percent in the sixth. Over the same period mean (and median) quarterly earnings rose consistently, from $2,141 to $2,827 ($1,612 to $2,617). There was no significant difference in the employment and earnings of

Table 9 here

Table 10 here

Table 11 here

Table 12 here

control and experimental groups. Again, as we see in Table 13, the differences between experimental and nonexperimental cases decline when we consider counties separately.

 

 

IV. SUMMARY AND CONCLUSION

 

The SSF/PFP program was implemented during a period of major change in both Wisconsin AFDC policy and the state’s AFDC information system. When SSF/PFP began in March 1996, the state had not yet fully integrated the automated JOBS information system with the other subsystems in CARES, and CARES did not yet have a functioning management information component. Having to respond simultaneously to many systems challenges, state staff and the systems vendor with whom they contracted did not verify SSF/PFP extracts. In the daily operation of SSF/PFP, county staff were required to perform several manual interventions in the CARES system, including, for a time, manual overrides of SSF requirements for people who should not have been subject to them. County workers also appear to have manually overridden SSF for people who did not meet the precise requirements for overrides and who would not have been automatically overridden in a fully functioning CARES system. Control group members were not similarly overridden, because they were not subject to SSF requirements. As a result, it is likely that the comparability of the two samples was compromised.

Owing to these data problems, this study presents data from the SSF period of the experiment (after March 1, 1996) in Appendix 2. We are more confident of the data for control and experimental group members who were on AFDC on March 1, 1996, and converted directly into PFP, since the eligibility overrides were not used in that program. Based on comparisons between treatment and control groups participating only in PFP, the program appeared to increase the likelihood that those required to participate in it would leave and remain off AFDC, while having no statistically discernible impact on either Food Stamp utilization or earnings. However, these measured effects should be interpreted with

Table 13 here

caution owing to our concern about initial assignments in Dane County, the only county in which a statistically significant difference in AFDC exit rates occurred.

Wisconsin initially contracted with a single agency to perform both the process and impact evaluation of SSF/PFP, but terminated that contract six months into the experiment, at a point when data from CARES had not yet been given to the evaluator. In another and more recent experimental evaluation (concerning the child support components of W-2), the state’s Department of Workforce Development has allowed the evaluator to obtain administrative data directly from CARES and has required the evaluator to analyze and report on the data each quarter. The quarterly reporting cycle has forced the evaluator to pay careful attention to the data as CARES produces it, which has resulted in early notification of, and responses to, unanticipated enrollment trends and system anomalies. The SSF/PFP experiment would have profited greatly from such an arrangement, and we recommend that it be replicated in future experiments using administrative data.


APPENDIX 1

 

A FIELD RECONNAISSANCE OF THE SELF-SUFFICIENCY FIRST/

PAY FOR PERFORMANCE EXPERIMENTS IN WISCONSIN

 

 

 

 

 

Thomas Kaplan

Institute for Research on Poverty

University of Wisconsin–Madison

 

 

 

February 1998

 

 

Report Submitted to the Wisconsin Department of Workforce Development

 

 

This research was supported by a contract between the Wisconsin Department of Workforce Development and the Institute for Research on Poverty. Opinions expressed are those of the author and not those of the supporting institutions.

INTRODUCTION

 

Wisconsin implemented its Self-Sufficiency First and Pay for Performance programs in March 1996. Self-Sufficiency First (SSF) was a diversion program requiring applicants for AFDC to complete an interview with a county staff member holding the title of "financial planning resource specialist." After the interview and within the 30-day AFDC application processing period, the applicant had to participate in the state JOBS program for at least 60 hours, including 30 hours of direct employer contact. If needed, child care assistance was provided during the mandatory hours.

Pay for Performance (PFP) was an intensive JOBS program requiring 20–40 hours per week of participation from AFDC case heads and enforcing sanctions on those who failed to comply. For each missed hour of required JOBS participation, a penalty equal to the federal hourly minimum wage was imposed, first on AFDC and then on Food Stamp benefits. Recipients who participated in JOBS for less than 25 percent of scheduled hours received the full penalty, which reduced the AFDC grant for the next month to zero and the Food Stamp benefit to the federal minimum of $10. Subsequent participation in JOBS restored benefits for future months.
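The sanction arithmetic described above can be sketched as follows. The $4.25 hourly rate is the federal minimum wage in effect in March 1996; the benefit amounts in the example are purely illustrative, and applying the $10 Food Stamp floor to partial (spillover) penalties is our assumption, since the text specifies that floor only for the full penalty.

```python
FEDERAL_MIN_WAGE = 4.25    # federal hourly minimum wage, March 1996
FS_FEDERAL_MINIMUM = 10.0  # Food Stamp minimum benefit cited in the text

def pfp_sanction(assigned_hours, attended_hours, afdc_grant, fs_benefit):
    """Sketch of the PFP penalty: each missed hour of JOBS participation
    costs one hour of minimum wage, deducted first from the AFDC grant,
    then from Food Stamps. Participation below 25% of scheduled hours
    triggers the full penalty: AFDC goes to zero and Food Stamps drop to
    the federal minimum. Returns (afdc_after, food_stamps_after)."""
    if assigned_hours > 0 and attended_hours / assigned_hours < 0.25:
        return 0.0, FS_FEDERAL_MINIMUM
    penalty = max(0, assigned_hours - attended_hours) * FEDERAL_MIN_WAGE
    afdc_after = max(0.0, afdc_grant - penalty)
    leftover = max(0.0, penalty - afdc_grant)          # spills onto Food Stamps
    # Assumption: the $10 floor also applies to spillover deductions.
    fs_after = max(FS_FEDERAL_MINIMUM, fs_benefit - leftover)
    return afdc_after, fs_after

# Illustrative case: 20 of 80 assigned hours missed -> $85 penalty on AFDC only.
print(pfp_sanction(80, 60, 517.0, 200.0))
# Illustrative case: participation under 25% -> full penalty.
print(pfp_sanction(80, 10, 517.0, 200.0))
```

In the first case the penalty (20 × $4.25 = $85) is absorbed entirely by the AFDC grant; in the second, the sub-25-percent rule zeroes AFDC and reduces Food Stamps to the $10 minimum.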

Both SSF and PFP were implemented under a single waiver granted by the federal government. A condition of the federal waiver was evaluation of program impacts through an experimental design. The evaluation was to allow the state to determine, if possible separately for SSF and PFP, whether and how much the programs

•	reduced welfare dependency

•	promoted economic self-sufficiency

•	affected participation in JOBS

•	affected family structure and stability

•	affected the well-being of children, including their long-term prospects for self-sufficiency.

The experimental evaluation was carried out in four counties (Dane, Dodge, Jefferson, and Waukesha). In March 1996, at the time of SSF/PFP implementation, the four counties together contained an AFDC caseload of 4,081, which represented 6.5 percent of the total Wisconsin caseload of 62,888. Appendix Table 1-1 shows the caseload distribution among the four counties.

APPENDIX TABLE 1-1

AFDC Caseloads in the Four Demonstration Counties, March 1996

 

County AFDC Cases

 

Dane 2,722

Dodge 277

Jefferson 149

Waukesha 933

TOTAL 4,081

 

Source: Department of Workforce Development, Monthly AFDC Participation Report, March 1996.

Note: AFDC cases include both AFDC-Regular and AFDC-Unemployed Parent cases.

 

A total of 1,500 randomly selected active AFDC cases in these four counties were assigned to a control group, the members of which continued under the AFDC program. All remaining AFDC cases active on that date in the four counties were placed in the PFP program, and 1,500 of these cases were randomly identified as the experimental treatment group, to be followed as part of the demonstration. Starting on March 1, new applicants for AFDC in the four experimental counties were randomly assigned either to the SSF program or to a control group, which could apply for AFDC without participating in the SSF program. This random assignment of new applicants was to continue until 1,500 cases were placed in the control group and 1,500 had been assigned to an SSF/PFP treatment group whose experiences were monitored. After reaching these numbers, all remaining applicants would become unstudied participants in the SSF/PFP program.

The experiment actually ran from March 1, 1996, to June 1, 1997, at which time all new applicants were assigned to SSF and all existing control group participants were reassigned to the regular PFP program. The state thus ended the experiment almost three years short of its originally anticipated four-year duration and terminated random assignment about 10 months before the planned ending date of that activity. The decision to end the experiment early apparently stemmed from several factors: an increasing desire to concentrate on full-scale implementation of W-2 in all counties, a belief that the SSF/PFP demonstration interfered with the necessary focus on W-2, and the new flexibility offered by the conversion of the federal AFDC program into a block grant.

In terminating the experiment, the state also ended its contract with MAXIMUS, the firm that had been selected to evaluate SSF/PFP program impacts and processes. By agreement with the state, MAXIMUS completed only a report summarizing findings from a half-day visit to the Dane County SSF/PFP program, conducted on November 21, 1996. The state’s CARES computer system thus contained 14 months of data (the state agreed to treat May 1997 as a transitional month and to ignore program statistics from that month) on at least the earnings and public assistance utilization of SSF/PFP program participants and control group members. With the termination of the MAXIMUS contract, however, no immediate provision was made to summarize or analyze the data. Before determining the nature and scope of such an analysis, the state contracted with the Institute for Research on Poverty (IRP) for a "field reconnaissance" of the SSF/PFP program in the four demonstration counties.

The field reconnaissance had five broad purposes:

1. To obtain a basic description of how SSF/PFP was implemented in the four research counties.

2. To record the thoughts and reactions of local economic support and JOBS staff in the four counties to the SSF/PFP program and to management challenges raised by it.

3. To record the thoughts and reactions of local economic support and JOBS staff in the four counties to the random assignment demonstration, in part to learn lessons which might apply to a future experimental evaluation of changes in child support policy.

4. To determine if anomalies in implementing the experiment may have affected the ability of data analysts to make valid inferences concerning the impact of the SSF/PFP programs.

5. To determine how, at the conclusion of the experiment, the four counties shifted control cases to their standard operating procedures.

The preparation of the field reconnaissance involved several steps: (a) reading the evaluation RFP, the federal terms and conditions, and the evaluation plan selected in the RFP review process; (b) reading or re-reading literature on experimental evaluation; (c) a review of case flow data prepared for the federal government to demonstrate cost neutrality; (d) discussions with state staff involved in the SSF/PFP program; and (e), most important, interviews with economic support and JOBS staff in the four experimental counties.

Interviews were conducted with at least one economic support and one JOBS staff person in each county. In three of the four counties (Dane, Dodge, and Jefferson), interviews were conducted with more than one of these staff members. Many of the officials interviewed were not county employees, because in all four counties, the JOBS program was the responsibility of another agency: the State of Wisconsin Job Service in Dodge and Jefferson Counties, Forward Service Corporation in Dane County, and the Kaiser Group, Inc., in Waukesha County. The interviews lasted between one and two hours each and were conducted with a pre-established set of questions. Because most local staff were busy preparing for W-2, the interviews were not always easy to schedule, and the willingness of staff in all four counties to make time for discussion of what was, by the time the interviews occurred, a no-longer operational experimental demonstration was indispensable to the investigation. Interview respondents were encouraged to offer broad answers to the interview questions; parts of many of the interviews took the form of wide-ranging conversations. The interviews were analyzed through the application of what is generally termed "grounded theory," in which a researcher first tries to record the sessions without analysis and later identifies common response categories and then codes responses into those categories (Glaser and Strauss, 1967). A draft of the field reconnaissance was then shared with state staff, who developed a written response with recommendations that were incorporated in the final report.

The remainder of this report contains three sections: a discussion of SSF/PFP implementation in the four counties; a description of problems that arose in the experimental demonstrations; and a discussion of lessons that may be learned from this experiment that could be applied to future experimental-design evaluations, such as the child support waiver evaluation.

 

 

THE IMPLEMENTATION OF SSF/PFP

 

The SSF and PFP programs were major reforms and proved administratively challenging, probably more so than any other statewide changes which the Division of Economic Support implemented under federal waiver authority. The Division has had to meet a series of significant implementation challenges in recent years, including new LearnFare policies for teens, implemented statewide in 1988, and other statewide waivers such as the 30 and 1/6 earnings disregards, the 12-month Medicaid extension, and the AFDC benefit cap. Many of these waivers required CRN/CARES programming changes, new rules, and new explanations to clients. But none altered the basic activities of county agencies in the way that SSF/PFP did, requiring fundamental changes in the way county agencies dealt with participants and potential participants from the moment they entered the welfare office. Under SSF and PFP, new notices had to be sent to clients. New explanations had to be given to applicants who questioned why they had to apply for so many jobs before they could gain eligibility for AFDC. New procedures had to be invented for determining whether clients had complied with sharply altered participation requirements and assessing whether failures to comply had been owing to circumstances beyond their control. New child care resources had to be made available to clients, and new explanations had to be given to clients concerning the expansion of the partial sanctions under JOBS into the full-family sanctions of Pay for Performance.

As might be expected, these significant changes did not always proceed smoothly in the four demonstration counties. This field reconnaissance was in no sense a formal process or implementation evaluation of SSF/PFP, an effort that would have required more field research and, ideally, some formal surveying of county staff and program participants. As long as this field reconnaissance was to involve discussions with officials in the four demonstration counties, however, it seemed sensible to discuss the implementation of the SSF and PFP programs themselves and not just the experimental aspects. Not all counties raised the same implementation concerns. The following concerns were expressed by at least one county: the difficulties of providing minimally desirable JOBS services to clients within the 30 days of the SSF program, a shortage of transportation resources, uncertainties in PFP sanction policies, and problems in the client notification process.

1. The difficulties of providing minimally desirable JOBS services to clients within the 30 days of the SSF program. Dodge County is small enough that only 5–7 people registered for AFDC each month. The SSF program presented special challenges to such a low-volume county, where people applied for AFDC in the county seat of Juneau but traditionally received their JOBS programming in Beaver Dam or Watertown. County officials did not think it efficient to pay for a permanent JOBS staff person to be stationed at the county welfare office in Juneau. Yet they needed to make it possible for applicants to receive 60 hours of JOBS services within the 30 days of SSF, and they feared that getting the applicants to existing Job Service offices in Watertown and Beaver Dam would use up critical days. County officials tried to address their dilemma by training their economic security staff to provide key JOBS programming, such as motivational programs and basic instruction in job-seeking skills. Most economic security staff in Dodge County welcomed these broader responsibilities, but their supervisors did not consider them to be fully successful at training clients for the critical first few interviews, after which the clients might improve their interview skills based on experience. In larger counties, the first few interviews can be treated as practice for later efforts. But in a small county with few potential employers, poor performances in the first few interviews can exhaust much of the pool of potential employers.

2. A shortage of transportation resources. Dodge County is also sufficiently rural that transportation to job interviews, jobs, and child care providers was an acute problem in both SSF and PFP. No public transit systems operate in the county, although for the transportation of frail seniors and people with disabilities, the county offers a network of retired volunteers who use their own cars and receive a mileage reimbursement. This volunteer network suffices for the transportation needs of a population that requires occasional visits to a physician or grocery store in full daylight. The network of private, volunteer vehicles and drivers is not so well adapted to the needs of a population requiring regular trips every day, often before the sun comes up on winter mornings and with young children who should sit in a car seat and may spill their food and drink on the way. Transportation also presented problems for PFP in Dane County. A growing number of jobs for which many AFDC recipients qualify are in parts of the county not reached by the Madison bus system, although SSF participants needing to make 30 employer contacts to qualify for AFDC could generally do that within the confines of the city bus system.

Such transportation concerns will apparently be a subject of discussion in meetings of W-2 advisory committees around the state. In Dodge County, employers may be asked to facilitate ride pools. The 1997–99 biennial budget contains $1 million in 1997–98 and $2 million in 1998–99 for employment transportation, to be released after an expenditure plan is approved by the Finance Committee. Toya Nelson, a former DILHR official who now works in the Department of Transportation and is on loan to DWD, is responsible for developing the plan.

3. Uncertainties in PFP sanction policies. At least transitional problems with client notification and sanction policies arose in all four counties, but the problems seemed of most concern to staff in Dane and Waukesha Counties, in part owing to a greater dependence on the CARES system in larger counties, which have less opportunity for informal contacts among staff and between staff and clients. The concerns may also have derived in part from a more extensive use of educational programs by JOBS participants in Dane and Waukesha, some of whom complained bitterly when the modest partial-family sanction for failure to participate 20 hours a week under JOBS became a full-family sanction under PFP.

Some important SSF/PFP sanction policies were still under development when the program started on March 1, 1996. The first formal training documents were distributed in February 1996, and training for SSF/PFP was conducted during that month. However, the AFDC policy manual, designed to guide county policy decisions, was not updated for PFP until August 1996. During the 14 months of SSF/PFP, the sanction fields in the CARES system were modified. The fields first required economic support specialists to enter the actual hours of work into CARES, and sanctions were generated if the number of hours entered was less than the assigned hours. Under a subsequent modification, the CARES system routinely updated the field to equal the assigned hours, and economic support specialists had to enter data only if hours were missed without good cause.

4. The PFP client notification process. Perhaps inevitably in a program that had to meet many legal requirements, client notices and client orientation documents seemed cumbersome and were doubtless hard for some clients to understand. For most clients, full understanding depended on participation in orientation sessions; many existing AFDC clients did not attend these sessions and were therefore caught unawares by the new PFP program. Also perhaps inevitably, many clients did not respond to notices until they received a check with a reduced benefit. Counties then had to discuss the new requirements with the clients after the sanction and, where necessary and possible, determine retroactively that good cause existed and adjust the level of the next monthly check. These adjustments were perhaps easier in smaller than in larger counties. Again, however, these were transitional problems apparently lasting just the first few months of PFP.
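The two CARES data-entry regimes described under item 3 can be paraphrased in code. The function names and the good-cause parameter below are our own shorthand, not CARES field names; only the arithmetic follows the description in the text.

```python
MIN_WAGE = 4.25  # federal hourly minimum wage at SSF/PFP implementation

def sanction_original(assigned_hours, entered_actual_hours):
    """Original CARES behavior: economic support specialists entered actual
    hours worked; a sanction was generated whenever the entered hours fell
    short of the assigned hours."""
    return max(0, assigned_hours - entered_actual_hours) * MIN_WAGE

def sanction_modified(assigned_hours, missed_without_good_cause=0):
    """Modified behavior: CARES defaulted the field to the assigned hours,
    so specialists entered data only when hours were missed without good
    cause; full participation required no entry at all."""
    return missed_without_good_cause * MIN_WAGE
```

The modification reversed the default: under the original regime a missing or incomplete entry produced a sanction, while under the modified regime no entry meant no sanction.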

 

 

IMPLEMENTATION OF THE EXPERIMENT

 

Much of the SSF/PFP experiment worked successfully. Given policy decisions by the state to limit the experiment to four counties and to restrict the share of Dane County participants who were in the control group (an issue discussed below), the automated assignment of clients to control and treatment groups seemed generally to function well. Some county staff expressed disappointment in the assignment outcome for particular cases, either because they thought SSF/PFP would be a desirable program for a particular client who was placed in the control group or because they wanted a client placed in the treatment group to be allowed to continue a full-time educational program under a partial sanction. The automated assignments were an important accomplishment; stationing an evaluation official in each county to make the assignments would have been costly and inefficient, especially in counties as small as Dodge, which received only 5–7 AFDC applications per month.

Yet problems occurred with the experiment. Although most were minor, some are probably serious enough to threaten the validity of findings, should the state decide to proceed with an analysis of the experimental data. The problems involved (1) difficulties in explaining reasons for assignment to control or experimental groups when a program participant complained; (2) difficulties in allaying county concerns over the potential of the experiment to affect a county’s "right of first selection" under W-2; (3) logistical problems in the abrupt shut-down of the program; (4) uncertain assignment procedures when a registration for AFDC occurred at a time when the CARES system was down; (5) a disproportionately rural quality to the experiment; (6) the general administrative complexity of running the experiment; and (7) uncertainty concerning the end-date of the experiment. The following paragraphs cover each of these concerns separately.

1. Difficulties in explaining reasons for assignment to control or experimental groups when a program participant complained. One factor in selecting the four counties for the experiment was apparently a concern to minimize the number of Southeast Asian refugees, particularly Hmong, who were in control and experimental groups. The thinking was that it would be quite difficult to explain why some members of the same clan were subject to SSF/PFP and others were not. Yet other groups presented the same kind of challenge. The Dodge County public assistance program, for example, serves large numbers of migrant workers whose families travel together, and it was never easy to explain to some of these families why they were subject to the new SSF/PFP requirements and others in the same circumstances were not. Because the SSF/PFP assignments were made by person and not by case, there were even a few two-parent cases among the four counties in which one spouse was in the control group and the other in the treatment group. In every experimental county, then, people subject to PFP/SSF knew similarly situated people who were not so subject, and their reactions apparently ranged from simple wonderment to sharply expressed resentment. The resentment was greater toward SSF than PFP, especially in counties that were able to process AFDC applications quickly for those not subject to SSF. In these counties, members of the experimental group subject to SSF saw other people receive almost immediate assistance, while they had to wait 30 days and make 30 employer contacts before receiving help. At times, clients colorfully expressed their resentments.

County staff faced with these reactions apparently received little guidance from the state concerning what to do if people assigned to the experimental group complained about their status compared to those in the control group. The state’s Self-Sufficiency First orientation form did not even mention that some people in four counties would not be subject to SSF/PFP, let alone explain why that was so. Without a clear script, county staff improvised. The most common answers to client concerns seemed to be: "The computer selected you; that’s just the way it is," and "You’re really better off being subject to SSF/PFP because you’ll have a good head start when W-2 comes, and the people not facing these requirements will suddenly have to make a huge transition all at once." The first of these answers was not a satisfying explanation, and the second had the potential to compromise the integrity of the experiment, an issue discussed further under problem 7 below.

Formal experimental demonstrations create inequalities by their nature, and it is never easy to arrive at satisfying answers when the differences in treatment are as significant as they seemed under SSF. But perhaps something could have been said about the experiment being followed by people around the country, and that the results might or might not lead other states to start programs like SSF/PFP. Additional possibilities are raised in the last section of this report.

2. Some of the experimental counties worried that their participation in the experiment could adversely affect their right of first selection under W-2. All counties which met identified performance criteria in the period before implementation of W-2 were assured that they would not have to compete with private agencies to become the W-2 provider in their county, that is, that they would have a "right of first selection" for administering W-2 in their county. The right-of-first-selection performance criteria centered on AFDC caseload reductions, and counties with a high percentage of clients in the SSF/PFP control groups were concerned that their caseload reductions would not be as large as reductions in counties without control groups. Dane County officials were especially vocal on this point, merging this issue with a more general concern expressed by County Executive Phelps that W-2 could result in the shifting of costs from the state and federal governments to the local level. State officials were aware of the issue and told county staff that their right of first selection would not be affected by their willingness to participate in the experiment. They developed, however, no description of exactly how the impacts of experimental status would be factored out of the right-of-first-selection decision, saying (in a letter to Dane County) that the issue would be addressed only if an experimental county was thought to be at risk of losing its right of first selection. That situation never arose.

3. For larger counties, the abrupt decision to shut down the experiment presented significant logistical problems. As the date for implementation of W-2 grew near, the state decided, with apparent consensus from the four experimental counties, to discontinue the experiment and convert all existing control cases to regular PFP status. Having made that decision, the state and the four counties wanted to end the experiment quickly so that the control group members not previously subject to PFP would have as much time as possible under the program to prepare for the requirements they would face under W-2. Three of the four counties (the exception was Dane) were able to bring most of the control group members in for an orientation and individual sessions on the PFP policies to which they would be subjected. The counties wanted to concentrate particularly on control group members who were partially sanctioned under JOBS and would face a full-family sanction under PFP; the state’s initial CARES-generated list of cases in this category was so inaccurate as to be almost useless to counties, but was corrected within two weeks. Dane County, however, with its 513 PFP controls, did not even try to hold PFP orientations for control group members, although many of the control group members did meet individually with their economic support specialist. Instead of providing group orientations first to PFP and then to W-2, Dane County decided to have just one set of orientation sessions, which would be limited to the requirements under W-2.

4. Assignment to control and treatment groups when the computer was down. The CARES system operates reliably much of the time, but it is less reliable in periods when significant logic changes reflecting altered policies are being programmed. During SSF/PFP, CARES was down often enough, especially as the system architecture was being created for W-2, to justify standard instructions from the state on what experimental counties should do when a new registrant arrived during a period when CARES could not make an assignment to treatment or control groups. Without state instructions, counties used various coping strategies. Standard protocol in some counties was to tell the applicant to come back the next day, when the CARES system would hopefully be functioning and registration could proceed normally with routine assignment to the treatment or control group. In other counties, standard protocol was to tell the registrant that she would have to meet SSF obligations. If the computer later assigned the registrant to a control group, county staff then had "good" news to share.

Larger counties with many AFDC registrants understandably did not want to bring clients back the next day for another session with an economic support worker. But the strategy of making all assignments to SSF when the system was down and correcting later based on actual system assignment may have moderately distorted the experiment. Registrants who were told they had to meet SSF requirements before attaining AFDC eligibility sometimes reacted by avoiding future contacts with the AFDC system, apparently concluding that they might as well get a job or that AFDC was simply not worth pursuing. Any control group members who were at first told they were subject to SSF and then left AFDC before they could receive the "good" news would have distorted the experiment: the experiment would treat them as controls who left the AFDC system, but their responses were based on a belief that they were members of the experimental group. This is probably not a large concern for the validity of the experiment, but it would be desirable in future experiments if the state gave explicit instructions for making assignments when the CARES system is down.

5. Assignment to controls occurred disproportionately in the smaller counties among the four that were included in the experiment. Appendix Table 1-2 shows caseload data for the PFP program in the four counties as of the date the program was terminated (May 31, 1997).

APPENDIX TABLE 1-2

Cases Assigned to the PFP Control Group, by County, May 31, 1997

                                    AFDC Cases as a %    PFP Controls as a %   PFP Controls
             AFDC      PFP Control  of the Caseload in   of Controls in        as % of AFDC
County       Cases     Cases        the Four Counties    the Four Counties     Caseload

Dane         1,797        513             70.3%               53.8%               28.5%
Dodge          166        107              6.5                11.2                64.4
Jefferson      107         71              4.2                 7.5                66.4
Waukesha       488        261             19.1                27.4                43.4

TOTAL        2,558        952            100.0%              100.0%                 NA

Source: Author’s calculations from the May 31, 1997, monthly AFDC participation report and a DWD document, "End Pay for Performance (PFP) Demonstration Project: End Control Assignment Meeting," June 2, 1997.

 

As Appendix Table 1-2 suggests, Dane County, which had 70 percent of the AFDC caseload in the four counties, contained only 54 percent of the control group for the experiment. By the end of the experiment, two-thirds of the AFDC caseload in Dodge and Jefferson Counties were PFP controls, while less than one-half of the Waukesha caseload and one-third of the Dane caseload were PFP controls. The low representation of Dane County in the experiment was intentional. Dane County was to have no more than 1,000 members of the experimental and control groups, apparently in part owing to a concern among DWD officials that the county might use its experimental status to avoid full transition to the "work-first" orientation of SSF/PFP. Yet the decision to limit the experimental representation of Dane County means that the experiment was not representative of the AFDC populations in the four counties as a whole. Coupled with parallel decisions to exclude Milwaukee, Brown, Racine, and Kenosha Counties from the experiment, the demonstration sample was also far more rural than the public assistance population of Wisconsin as a whole.

6. The SSF/PFP treatment may have been slightly different in the experimental counties than in the rest of the state owing to the demands of running an experiment. A common concern in the analysis of experimental evaluations is that the demands of running the experiment may be so large as to compromise the treatment (Manski and Garfinkel, 1992). The experience of those who receive the experimental treatment may thus be different from what it would have been had administrators not had to operate a control group. Similarly, the experiences of control group members in the "traditional" AFDC program may not be representative of the experiences of AFDC clients in the pre-SSF/PFP period, because the county faces the administrative demands of simultaneously operating a new, experimental program.

County staff were asked if they thought the demands of running the experiment had been great enough to affect their ability to operate SSF/PFP. Dodge, Jefferson, and Waukesha staff all reported that the demands of running the experiment had not been so burdensome as to affect the quality of the program offered to SSF/PFP clients. In all three counties, some JOBS and/or economic support staff worked only with control clients, and others worked only with SSF/PFP clients. Waukesha officials, in fact, thought the program they provided to those subject to PFP/SSF might have been better with the experiment than it would have been without the experiment, because only two of their four JOBS staff had to learn SSF/PFP in its early implementation period. The other two JOBS staff had the chance to grow into SSF/PFP duties more gradually with the help of colleagues who had already made the transition.

Some Dane County staff, however, suggested that the need to continue a control group reduced service quality for those subject to SSF/PFP. Their argument was that, if the county had moved entirely into SSF/PFP, it would have been able to realize significant savings in its hearings process owing to reduced grounds for appeal under SSF/PFP. These savings could have been used to provide an enhanced experience for SSF/PFP participants.

7. The experiment had an uncertain end-date, and many of the controls were told that they should prepare themselves for an imminent conclusion to their control status and a requirement to participate in W-2. The lack of a clear date on which the experiment would end, and the stated prospect that it might end fairly soon, may well have severely biased any difference in the behavior of the control and experimental groups. That is, control group members almost certainly would have behaved differently if they believed their exemption from PFP would last only a short while. Those who thought their exemption would be brief were not, in effect, a real control group. Without surveying control group members themselves, it is impossible to determine how many thought their exemption was short-lived, but JOBS staff in Dodge and Jefferson Counties (which were disproportionately represented in the experiment), and perhaps in the other two counties as well, apparently sought to convey that impression. For JOBS staff, such an approach may have been understandable; as W-2 approached, the state increasingly emphasized that counties should prepare their clients for the new requirements soon to be imposed, apparently without reminding the experimental counties to exempt control group members from that message. As events transpired, the demonstration did last only 15 months, and JOBS staff who predicted its short duration in discussions with their control group clients may well have helped them make the necessary transition. But such warnings were surely not good for the validity of the experiment.

 

 

LESSONS RELEVANT TO FUTURE EXPERIMENTAL DESIGN EVALUATIONS

 

1. Computer systems go down occasionally. The computer systems that make random assignments should be expected to be inoperable at some times when clients are registering or making application. Consistent written guidelines should be in place covering what counties should do when this occurs. As in the SSF/PFP demonstration, CARES will make the automated assignments to treatment and control groups in the child support waiver demonstration. Because assignments will occur in all 72 counties, consistency of approach when the system is down is especially important under the child support waiver. State officials should specify what, if any, discussion about the child support pass-through should occur when the system is down.
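One way a future system could keep assignments consistent during an outage is to make the assignment a deterministic function of the client identifier, so that staff with a standalone machine can compute the same group the central system will later record. The sketch below is purely illustrative; CARES did not work this way, and the PIN format and salt value are hypothetical.

```python
import hashlib

def assign_group(pin: str, salt: str = "ssf-pfp-demo") -> str:
    """Deterministically assign a client PIN to the experimental or control group.

    Because the result depends only on the PIN and a fixed salt, the same
    assignment can be recomputed offline while the main system is down, and
    it will agree with the assignment recorded once the system comes back up.
    """
    digest = hashlib.sha256(f"{salt}:{pin}".encode("utf-8")).hexdigest()
    return "experimental" if int(digest, 16) % 2 == 0 else "control"
```

A design like this trades the statistical purity of a central random draw for reproducibility; the salt must be kept fixed for the life of the experiment, or offline and central assignments will diverge.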

2. Statements to participants concerning the duration of the experiment should be carefully developed and should be consistent across all counties. The dimension of time is critical to many social experiments. In the child support waiver demonstration, if participants receiving the full child support pass-through are told that it is time-limited, then both they and the nonresident parent may be less inclined to go through the formal child support system. Moreover, people receiving the reduced pass-through might behave differently in ways relevant to the experiment (for example, in the kind of job they seek), depending on whether they foresee a shorter or longer period of reduced pass-through.

Specific instructions to counties on the length of the pass-through may be hard to develop. The federal government could terminate the child support experiment if previous waiver savings are exhausted, and it is impossible to predict in advance whether or when that could occur. Also, no one can predict what federal government policy is likely to be after the experiment is concluded. But it is better that the discussion of duration be consistent across the 72 counties than to have no state instructions at all.

3. If the experiment could influence county performance, and thus the level of state aid, the state should specify in advance the analytic approach it will use to prevent the experiment from exerting such influences. In the child support demonstration, clients will be randomly assigned to treatment and control groups, presumably across the full state. But it is possible that some counties, especially small counties, could by chance contain a disproportionate number of cases assigned to the control group. Any counties in this situation might be expected to do somewhat worse in their child support collections, which could in turn affect the level of incentive payments from the state (and perhaps also affect, at least from the perspective of concerned counties, a county’s ability to compete to be a W-2 agency in the future). It may be wise for the state to specify in advance how it would adjust incentive payments and policies under this scenario.

4. Communications to clients concerning the nature of the experiment and the reasons why some people do not receive the full pass-through should be consistent across counties. The state should make it as easy as possible for local staff to give the bad news of the reduced pass-through to those assigned to it. Full child support pass-through has been a feature of W-2 plans since the beginning, and many clients have long expected to receive all child support paid on their behalf. If those receiving a reduced program benefit because they happen to be designated a control complain, it may be wise for the state to supply county staff with a consistent response, even if the response cannot be wholly satisfying. In some experiments, one possibility might be for the state to staff an 800 telephone line which county staff could encourage angry control group clients to call. Another possibility might be for the state to develop a "reward" that does not compromise the experiment for those assigned to the reduced benefit. For example, perhaps the Wisconsin restaurant industry would offer a free meal to families in reduced benefit status in future experiments, in appreciation for the sacrifice of these families on behalf of national policy analysis. When faced with anger and disappointment, county staff may need to have some good news to offer, and perhaps something like a free meal would prevent staff from cushioning the disappointing news with statements that could compromise the experiment, such as "this experiment won’t last very long anyway, and the state is likely to treat every case the same soon."

Appendix 1 References

Glaser, Barney G., and Anselm L. Strauss. 1967. The Discovery of Grounded Theory: Strategies for Qualitative Research. Chicago: Aldine Publishing.

Manski, Charles F., and Irwin Garfinkel. 1992. "Introduction." In Evaluating Welfare and Training Programs, eds. Charles F. Manski and Irwin Garfinkel. Cambridge, MA: Harvard University Press.

 

 

APPENDIX 2

SSF/PFP Impacts for the "Flow" of New AFDC Cases

 

Section III of this paper reviews outcomes for the "stock" of individuals already participating in the AFDC program on March 1, 1996. The "flow" of new AFDC applicants after that date was subject to the SSF/PFP program. With the exception of control group cases in the four research counties, individuals went through the SSF requirements prior to entering PFP.

We had hoped to be able to compare outcomes for control and experimental cases in the four research counties in order to assess the joint impacts of the SSF/PFP programs. While the design did not provide for the separate evaluation of SSF and PFP, because of random assignment any differences in outcomes among the control and experimental groups could be interpreted as the impact of the combined programs. However, as discussed above, we are concerned that while initial assignment was random, the ultimate placement of individuals into the control or experimental groups may have been nonrandom. In particular, there is some evidence that cases with the greatest barriers to employment may have been exempted from SSF/PFP and deleted from the data set, whereas similar cases were not exempted from the control group and were retained in the data set. Thus, differences in outcomes may be attributed to: (1) the impact of the SSF/PFP programs, or (2) differences in the characteristics of individuals assigned to the groups.

In the analysis that follows we examine outcomes for cases that entered SSF/PFP after March 1, by calendar quarter of entry. As stated in the text, because SSF and PFP began in the last month of a quarter (in March 1996), the first quarterly cohort covers just that month. Cohort 2 consists of cases entering SSF/PFP in April, May, and June 1996; cohort 3, those entering in July, August, and September 1996; cohort 4, October, November, and December 1996; and cohort 5, those entering January-April 1997, when the programs were winding down and counties were beginning to ready themselves for the W-2 program which began in September 1997. The basic structure of the analysis is to follow the cohorts from their entry into SSF/PFP through the end of the third quarter of 1997, when W-2 began.

Analyzing impacts by cohort of entry allows us to more clearly consider cases that had different potential lengths of exposure to PFP, and for which we have different measures of post-entry outcomes. This is important given the interest in differences in earnings, welfare utilization, and labor force participation of the treatment and control groups over time. First, the composition of the control group by county changed during the experiment: a higher percentage of control group members were from Dane County in the last few months of the experiment than at the start of the experiment. Second, as noted earlier in this report, procedures for entering SSF and recording key data seem to have become more settled (and perhaps more accurate) as the experiment progressed.

Given the inability to interpret the outcomes unambiguously, we review the results for the SSF/PFP group more briefly, focusing as much on the level of outcomes among those subject to the program as on the comparison of control and experimental group outcomes. In this section we consider two outcomes: AFDC receipt and earnings. In each case we analyze results separately for cases entering in each of five quarters, measuring outcomes through the third quarter of 1997 (July-September). Thus, we have as many as seven quarters of post-entry data (for the first SSF/PFP cohort) and as few as three (for the fifth SSF/PFP cohort).

 

AFDC Participation and Benefits

Appendix Table 2-1 shows the percentage of SSF cases that eventually entered AFDC. In the research counties, about one-half to two-thirds of cases entered AFDC in the period observed, most within two quarters of entry. When we compare the experimental and control groups in the research counties, SSF appears to have reduced entry into AFDC. For example, among those who entered SSF in March 1996, 45 percent of the control group members, not subject to SSF, entered AFDC at some point during the next 18 months, while only about 35 percent of the experimental group members, who were subject to SSF, did so. The difference is statistically significant. Similarly, among those who entered SSF during April, May, and June 1996 (cohort 2), over 35 percent of the control group entered AFDC over the next 15 months while only 22 percent of the experimental group did so, which is again a statistically significant difference. Among control and experimental group members who entered SSF in the remaining three calendar quarters of the program, more in the control than in the treatment group went on to enter AFDC before the end of September 1997, although the difference was statistically significant at conventional levels for only one of the cohorts.

Appendix Table 2-1 here

Appendix Table 2-1, continued

The general pattern for AFDC entry among treatment and control group members is in the expected direction (given the apparent intention that SSF would have a diversion effect), but the difference must be interpreted with caution. Some of the cases most likely to enter AFDC may have been excluded from the experimental group, and so the difference may reflect differences in initial characteristics in addition to (or instead of) differences related to the treatment. In addition, while the experimental/control difference is in the expected direction, the behavior of nonexperimental group members is more confusing. In every cohort, more nonexperimental group members than experimental group members—who in theory received the same treatment—ultimately entered AFDC, and the differences were statistically significant. Indeed, in every cohort except the last one, more nonexperimental than control group members entered AFDC. The overall pattern is consistent with outcomes for the stock of AFDC cases reviewed in the text. It may suggest that being in the experiment and assigned to either the treatment or control group made it somehow less likely that a participant would ultimately move to AFDC. However, because the difference in AFDC entry between experimental and nonexperimental group members declined over the course of the experiment, the early trend may have resulted from early mistakes in data entry or other temporary implementation problems.
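Significance statements like those above typically rest on a two-proportion z-test of the difference in entry rates between groups. A minimal, self-contained sketch, using hypothetical cell counts (the report does not give the underlying sample sizes for each comparison):

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided z-test for the difference between two sample proportions."""
    p_a, p_b = success_a / n_a, success_b / n_b
    # pooled proportion under the null hypothesis of no difference
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # two-sided p-value from the standard normal CDF, via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 45% of 400 controls vs. 35% of 400 experimentals
z, p = two_proportion_z(180, 400, 140, 400)
```

With these illustrative numbers the 10-percentage-point gap is significant well below the conventional 5 percent level; with smaller cohorts, as in the later quarters, the same gap could easily fail to reach significance.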

As shown in Appendix Table 2-2, among those SSF cases that entered AFDC, more than half exited the program within a year. A higher proportion left the program during the period observed than among the "stock" of cases that moved directly to PFP (see Table 3). This is expected, given that the stock of cases is more likely to include long-term AFDC recipients. On the other hand, comparing those in the control and experimental groups, no statistically significant difference in the amount (not shown) or length of AFDC benefits is observed.

 

Earnings

In this section we consider the proportion of participants who had earnings, and average earnings levels, as reported to the Wisconsin Unemployment Insurance system. Appendix Table 2-3 shows that for most groups the proportion with earnings rose over time, though the pattern is not consistent among groups and periods. In addition, while SSF/PFP participants were somewhat less likely to receive AFDC benefits than the stock of AFDC cases assigned directly to PFP, there does not appear to be a consistent difference in the proportion with earnings (see Table 9). By the final quarter observed, between a half and two-thirds of most SSF participants were employed. There is no consistent and statistically significant difference in outcomes between the experimental and control groups.

Appendix Table 2-4 shows mean quarterly earnings for those with earnings. Earnings rose over the period for most groups. However, there is no consistent statistically significant difference between experimental and control group members. For cases entering in Quarter 2, 1996, the experimental group had higher earnings, and the difference was significant in three of the five quarters observed. This difference is in the expected direction, inasmuch as SSF/PFP was supposed to increase self-sufficiency and earnings. However, for those entering in the first quarter of 1996, experimental group earnings were actually consistently lower (significant in two of the seven quarters observed), and other quarters show inconsistent differences that are generally not significant.

Appendix Table 2-2 here

Appendix Table 2-2, continued

Appendix Table 2-3 here

Appendix Table 2-3, continued

Appendix Table 2-3, continued

Appendix Table 2-4 here

Appendix Table 2-4, continued

Appendix Table 2-4, continued

 

 

Appendix 3

SSF/PFP Project Data

I. Universe of cases

A. Original datasets

1. demp9603.ssd04 - Demographic information for AFDC cases active in 3/96

(data source: Becky People File, March 1996)

N=62,886 observations (including duplicates for three PINs)

2. combma.ssd04 (old) - SSF/PFP participants and AFDC, FS, or MA benefits during 3/96-9/97

(data sources: Becky Assistance Group File, SSF Extracts, PFP Extracts, and CARES database)

N=281,998 observations (including multiple observations per PIN based on different kinds of AG programs, e.g., AFDC, FS, and MA)

3. iassf.ssd04 - SSF/PFP participants never receiving AFDC, FS, or MA benefits during 3/96-9/97

(data source: Individuals who were only in SSF Extracts)

N=25,028 observations (one observation per PIN)

4. ui96.ssd04 - Quarterly wages in 1996

(data source: UI 1996 data)

N=358,581 observations (including multiple observations per SSN based on different employers)

5. ui97.ssd04 - Quarterly wages in 1997

(data source: UI 1997 data)

N=398,331 observations (including multiple observations per SSN based on different employers)

B. Intermediate datasets

1. combma.ssd04 (new)

(data source: the old combma file combined with the iassf data file)

N=307,026 observations (including multiple observations per PIN based on different kinds of AG programs, e.g., AFDC, FS, and MA)

C. Population

1. SSF/PFP participants: Total N = 106,724 cases

a. The old combma file (N=281,998 observations on cases ever receiving AFDC, FS, or MA during 3/96-9/97) was combined with the iassf file (N=25,028 observations on cases never receiving AFDC, FS, or MA during 3/96-9/97) to create the new combma data file (N=307,026 observations, including multiple observations per case based on different kinds of AG programs, e.g., AFDC, FS, and MA).

b. Keeping one observation per PIN, the total (N=106,724) number of cases includes 81,696 SSF/PFP participants ever receiving AFDC and 25,028 SSF/PFP participants never receiving AFDC during 3/96-9/97.

2. Earnings data files:

a. Total N=91,237 cases for quarterly earnings in 1996

Total N=94,404 cases for quarterly earnings in 1997

b. The total number of cases in the earnings data sets was obtained by reshaping the data to keep one observation per SSN, because multiple observations per case were possible based on "quarters with earnings" in the original data sets.

c. earn.ssd04 (N=104,486 cases) was created by merging the ui96 and the ui97 files using SSNs.
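The reshaping and merging steps above can be sketched with pandas. The column names and toy rows below are hypothetical stand-ins for the UI extracts (which record one row per SSN, employer, and quarter); the actual files are SAS datasets with different field names.

```python
import pandas as pd

# Hypothetical stand-ins for the ui96 and ui97 extracts
ui96 = pd.DataFrame({"ssn": ["A", "A", "B"],
                     "qtr": ["96Q1", "96Q2", "96Q1"],
                     "earnings": [1000, 1200, 800]})
ui97 = pd.DataFrame({"ssn": ["A", "C"],
                     "qtr": ["97Q1", "97Q2"],
                     "earnings": [1500, 900]})

# Stack the two years, sum earnings across employers within each quarter,
# then pivot so each SSN becomes one row with a column per quarter.
earn = (pd.concat([ui96, ui97])
          .groupby(["ssn", "qtr"], as_index=False)["earnings"].sum()
          .pivot(index="ssn", columns="qtr", values="earnings"))
```

After the pivot, `earn` has one row per SSN, which is the "one observation per SSN" shape described in II.A.3 above; quarters with no reported earnings come out as missing values.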

3. Included in the population are:

a. Cases appearing in the SSF/PFP extract files and indicated as SSF/PFP participants in the CARES database.

b. Cases not appearing in the original SSF extract files because they were overrides; now included in the iassf file, merged with the old combma file to produce the new combma file used in the analysis.
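The population construction described above amounts to stacking the two files and keeping one row per PIN. A minimal pandas sketch with hypothetical rows (the real files carry many more fields):

```python
import pandas as pd

# combma: multiple rows per PIN, one per assistance-group (AG) program
combma_old = pd.DataFrame({"pin": [101, 101, 102],
                           "ag_prog": ["ADC", "FS", "MA"]})
# iassf: SSF-only participants, one row per PIN, no benefit records
iassf = pd.DataFrame({"pin": [103], "ag_prog": [None]})

# New combma = old combma plus the SSF-only cases
combma_new = pd.concat([combma_old, iassf], ignore_index=True)

# Population: one observation per PIN
population = combma_new.drop_duplicates(subset="pin")
```

In the actual data this step takes the 281,998 + 25,028 stacked observations down to the 106,724-case population.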

II. Sample Composition

A. Final datasets for Analyses

1. demp9603.ssd04 - Demographic information for AFDC cases active in 3/96

N=62,883 PINs (excluding duplicates)

2. combine.ssd04 - SSF/PFP cases not tainted during 3/96-9/97

(data source: the old combma and the iassf files)

N=106,208 cases (keeping one observation per PIN)

3. earn.ssd04 - Quarterly earnings data during 1996-1997

(data source: the ui96 and the ui97 files)

N=104,486 cases (keeping one observation per SSN)

4. matchfs.ssd04 - Food Stamp benefits for SSF/PFP cases ever receiving AFDC during 3/96-9/97

(data source: the new combma file)

N=81,696 cases (keeping one observation per PIN)

B. SSF/PFP sample: Total N=95,767 cases

1. The combine data file includes 106,208 cases, excluding 516 cases ever tainted during 3/96-9/97 from the total 106,724 SSF/PFP participants.

2. Decisions about whom to include in the sample (sample characteristics):

Logic of distinguishing the PFP Sample (cases receiving AFDC in March 1996 but not subject to SSF) from the SSF Cohorts (cases subject to SSF and PFP)

a. PFP Sample (cases receiving AFDC in March, 1996)

• Total N = 61,280 cases

• In March, 1996: AG_prog = ‘ADC’ AND
AG status ‘open’ = Yes AND
PFP participation = Yes AND
SSF participation = No

b. SSF Cohorts

1) Cohort 1 (cases entering the SSF program in March, 1996)

• Total N = 7,099 cases

• In March, 1996: a) AG_prog = ‘ADC’ AND
PFP participation = No AND
SSF participation = Yes

Or

b) AG_prog = ‘ADC’ AND
PFP participation = Yes AND
SSF participation = Yes

2) Cohort 2 (cases entering the SSF program during April, 1996 - June, 1996)

• Total N = 9,869 cases = 9,911 - 42*

3) Cohort 3 (cases entering the SSF program during July, 1996 - September, 1996)

• Total N = 6,881 cases = 6,904 - 23*

4) Cohort 4 (cases entering the SSF program during October, 1996 - December, 1996)

• Total N = 5,148 cases = 5,177 - 29*

5) Cohort 5 (cases entering the SSF program during January, 1997 - April, 1997)

• Total N = 5,490 cases = 5,525 - 35*
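The selection logic above can be expressed as a small classifier. The field names below are illustrative, not the actual CARES variable names, and the sketch covers only the PFP Sample and SSF cohort tests, not the later exclusions.

```python
# Month-to-cohort windows, as defined in the cohort list above
COHORTS = {
    ("1996-03",): 1,
    ("1996-04", "1996-05", "1996-06"): 2,
    ("1996-07", "1996-08", "1996-09"): 3,
    ("1996-10", "1996-11", "1996-12"): 4,
    ("1997-01", "1997-02", "1997-03", "1997-04"): 5,
}

def classify(case: dict) -> str:
    """Classify a case record as PFP Sample, an SSF cohort, or excluded."""
    # PFP Sample: open ADC case in March 1996, in PFP but never in SSF
    if (case["ag_prog"] == "ADC" and case["ag_open_199603"]
            and case["pfp"] and not case["ssf"]):
        return "PFP sample"
    # SSF cohorts: ADC cases that went through SSF, grouped by entry month
    if case["ag_prog"] == "ADC" and case["ssf"]:
        month = case["ssf_entry_month"]  # e.g. "1996-05"
        for months, n in COHORTS.items():
            if month in months:
                return f"SSF cohort {n}"
    return "excluded"
```

Cohort 1 in the original logic also accepts cases with PFP participation = Yes, which the sketch handles because the SSF branch does not test the PFP flag.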

C. Cases excluded from analyses

1. Demographic information for PFP Sample

a. Excluded were 19 cases that did not match with the demp9603 data file. All 19 cases were from nonresearch counties (11 cases from Milwaukee).

2. Outcome analyses

a. All cases ever tainted during 3/96-9/97 were excluded: N=516

• Total population N=106,724 cases - 516 taints = 106,208 SSF/PFP cases existing in SSF/PFP extracts.

b. 10,312 PFP participants who never received AFDC in March, 1996 were not included in the PFP Sample because those cases were regarded as not having completed the PFP requirements in that month.

1) In March 1996: AG_prog=‘ADC’ AND

AG status ‘Not Open’ AND

PFP participation=Yes AND

SSF participation=No

• 106,208 SSF/PFP cases in the extracts - 10,312 PFP participants who did not complete the PFP requirements in March, 1996 = 95,896 SSF/PFP potential participants.

c. From the SSF Cohort analysis, 129 cases were excluded because their first month of AG ‘open’ status for AFDC was prior to the month in which they entered the SSF program.

• 95,896 SSF/PFP potential participants - 129 SSF participants with incorrect records = Total SSF/PFP sample N = 95,767 cases.
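The sample accounting in steps a) through c) can be verified with simple arithmetic, using the counts stated above:

```python
# Arithmetic check of the exclusions described in steps a) through c).
total_extract = 106_724        # all SSF/PFP cases before exclusions
tainted = 516                  # cases ever tainted during 3/96-9/97
pfp_not_open = 10_312          # PFP participants never on AFDC in March 1996
bad_first_month = 129          # SSF cases with AG 'open' before SSF entry

after_taints = total_extract - tainted      # cases in SSF/PFP extracts
potential = after_taints - pfp_not_open     # SSF/PFP potential participants
final_sample = potential - bad_first_month  # total SSF/PFP sample
```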

d. From the Food Stamp benefits analysis for the PFP Sample, 5,641 cases that never received Food Stamp benefits during the 19-month period were excluded.

e. From the earnings analysis for the PFP Sample, 393 cases without SSNs that could be matched to the earn data file were excluded.

f. From the earnings analysis for the SSF Cohorts, 1,064 cases without SSNs that could be matched to the earn data file were excluded.

 

III. Data bugs: any quirks in the data itself

A. PFP Sample demographic data file (demp9603.ssd04)

1. duplications of PINs: Because the duplicates contained identical information, one of the observations was retained.

2. missing cases: 19 cases from nonresearch counties did not match the demographic data file.

3. primary persons under age 18: A primary person should be at least 18 years old. However, all primary persons younger than 18 were retained in the analysis because most were close to 18 (e.g., ages 15-17).

4. cases from nonresearch counties with assigned groups: Cases assigned to the control (CL) or experimental (EX) group but not residing in research counties were nonetheless treated as assigned. This may produce unusual counts in the demographic tables for the total number of cases with two primary persons.

B. Outcome analysis data files (new combma.ssd04; ui96.ssd04; ui97.ssd04)

1. discrepancies in records of county of residence from multiple sources: The variables recording a case's county of residence differ across sources and over time. Thus, the CTYPFP variable, which originated in the PFP extracts, was used for the PFP Sample; the CTY1-CTY14 variables, originally from the SSF extracts, were used to determine the county of first entry for the SSF Cohorts. Also, all untainted cases were treated as if they remained permanently in the county of their first entry.

2. duplicates based on AG subprogram (e.g., AFDC-R, U, P, etc.): When the AG status was ‘open’ for AFDC in any month and duplicate records existed, the highest benefit amount for that month was used in the outcome analysis.
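A minimal sketch of that dedup rule, assuming a hypothetical flat list of case-month benefit records (the field names are illustrative):

```python
# Hypothetical sketch: for each (case, month) with duplicate open-AFDC
# records across subprograms, keep the record with the highest benefit.

rows = [
    {"pin": 1, "month": "1996-04", "subprog": "AFDC-R", "benefit": 440},
    {"pin": 1, "month": "1996-04", "subprog": "AFDC-U", "benefit": 517},
    {"pin": 2, "month": "1996-04", "subprog": "AFDC-R", "benefit": 440},
]

best = {}
for row in rows:
    key = (row["pin"], row["month"])
    if key not in best or row["benefit"] > best[key]["benefit"]:
        best[key] = row
```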

3. missing SSNs in cases ever receiving AFDC: Cases without SSN records that could be matched to the earnings data were excluded from the earnings analysis.

4. zero values for medical assistance benefits: The records for medical assistance benefits were incomplete and were therefore excluded from the analysis.

IV. How best to approach this data

A. For the PFP Sample, the first month of entry is always March, 1996, and AG status should be ‘open’ in that month.

B. For the SSF Cohorts, the first month of entry is based on first appearance in the SSF extracts, regardless of AG status in that month.

C. A case's county of residence was determined by the county of its first entry.

D. To determine whether a case has "exited," look for two consecutive months of blank AG status codes in the combma file, or of "0" AG status codes in the combine file.
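A minimal sketch of that exit test, assuming a chronological list of AG status codes per case ("" standing in for a blank in the combma file, "0" for the combine file):

```python
# Hypothetical sketch: a case has "exited" at the first month that begins
# a run of two consecutive closed months (blank or "0" AG status).

def first_exit_month(statuses):
    """Return the index of the first month of the first run of two
    consecutive closed months, or None if the case never exited."""
    closed = lambda s: s in ("", "0")
    for i in range(len(statuses) - 1):
        if closed(statuses[i]) and closed(statuses[i + 1]):
            return i
    return None
```

A single closed month followed by a reopening does not count as an exit under this rule.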

E. Do not assume that cases completed SSF or PFP requirements merely because they appear in the SSF/PFP extracts. Only when a case's AG status was ‘open’ to AFDC in the month it appeared in the extracts can it be regarded as having completed the requirements.

F. Do not assume that a case overridden once will be overridden automatically in later months.