Chapter 9: How Can You Report What You Have Learned?
An evaluation report is an important document: it brings together what you have learned about your program from the evaluation. Keep in mind, however, that there are different ways of reporting evaluation information, depending on how you intend to use the report and who your audience will be. In this chapter, we suggest preparing evaluation reports that are appropriate for a range of uses. A program evaluation report can do the following:
- Guide management decisions by identifying areas in which changes may be needed for future implementation
- Tell the "story" of program implementation and demonstrate the impact of the program on participants
- Advocate for your program with potential funders or with other community agencies
to encourage referrals
- Help advance the field of human services
These uses suggest that various audiences for an evaluation report might include
program staff and agency directors, program funders, potential funders, agency boards, other
community agencies, and local and national organizations that advocate for individuals like your
program participants or for programs such as yours.
Whatever type of report you plan to develop, remember that it is critical to report negative results as well as positive ones. There is as much to learn from program approaches or models that do not work as from those that do. Negative results are nothing to be ashamed of; efforts to change knowledge, attitudes, and behaviors through programmatic interventions will not always succeed. It is also important to present results that are not conclusive but that show promise and warrant additional study. For example, if mothers over the age of 25 seemed to improve their parenting skills after receiving home-based support services, this finding is worth presenting so that future evaluations can explore it further. So little is currently known about what does and does not work that any information on these issues greatly increases knowledge in the field.
Preparing an evaluation report for program funders
The report to program funders will probably be the most comprehensive report you
prepare. Often program funders will use your report to demonstrate the effectiveness of their grant
initiatives and to support allocation of additional moneys for similar programs. A report that is
useful for this purpose will need to include detailed information about the program, the evaluation
design and methods, and the types of data analyses conducted.
A sample outline for an evaluation report for program funders is provided in this
chapter. The outline is developed for a "final report" and assumes all the information collected on
your program has been analyzed. However, this outline may also be used for interim reports, with
different sections completed at various times during the evaluation and feedback provided to
program personnel on the ongoing status of the evaluation.
Preparing an evaluation report for program staff and agency personnel
An evaluation report for program staff and agency personnel may be used to support
management decisions about ongoing or future program efforts. This type of report may not need to
include as much detail on the evaluation methodology but might focus instead on findings. The
report could include the information noted in outline Sections II E (description of results of
analysis of implementation information), III D (discussion of issues that affected the outcome
evaluation and how they were addressed), III F (results of data analysis on participant outcome
information), III G (discussion of results), and IV C (discussion of potential relationships
between implementation and outcome evaluation results).
Preparing an evaluation report for potential funders and advocacy organizations
It is unlikely that potential funders (including State legislatures and national and local foundations) or advocacy organizations will want to read a lengthy report. In a report for this audience, you may want to focus on the information provided in Section IV of the outline. This report would consist only of a summary of the program implementation and participant outcome results and a discussion of the relationships between implementation policies, practices, and procedures and participant outcomes.
Disseminating the results of your evaluation
In addition to producing formal evaluation reports, you may want to take advantage
of other opportunities to share what you have learned with others in your community or with the
field in general. You might want to consider drafting letters to community health and social
services agencies or other organizations that may be interested in the activities and results of
your work. Other ways to let people know what you have done include the following:
- Producing press releases and articles for local professional publications, such as
newsletters and journals
- Making presentations on your program's results at meetings held at the local health department, a university or public library, or another setting
- Listing your evaluation report or other evaluation-related publications in
relevant databases, on electronic bulletin boards, and with clearinghouses
- Making telephone calls and scheduling meetings with similar programs to share your
experience and results
Many of the resource materials listed in the appendix of this manual contain ideas
and guidelines for producing different types of informational materials related to evaluations.
Sample Outline
Final Evaluation Report
Executive Summary

I. Introduction: General Description of the Project (1 page)
   A. Description of program components, including services or training delivered and the target population for each service
   B. Description of collaborative efforts (if relevant), including the agencies participating in the collaboration and their various roles and responsibilities in the project
   C. Description of strategies for recruiting program participants (if relevant)
   D. Description of special issues relevant to serving the project's target population (or providing education and training to participants) and plans to address them
      - Agency and staffing issues
      - Participants' cultural background, socioeconomic status, literacy levels, and other characteristics
II. Evaluation of Program Implementation Objectives
   A. Description of the project's implementation objectives (in measurable terms)
      - What you planned to do (planned services/interventions/training/education; duration and intensity of each service/intervention/training period)
      - Whom you planned to have do it (planned staffing arrangements and qualifications/characteristics of staff)
      - Target population (intended characteristics and number of members of the target population to be reached by each service/intervention/training/education effort and how you planned to recruit participants)
   B. Description of the project's objectives for collaborating with community agencies
      - Planned collaborative arrangements
      - Services/interventions/training provided by collaborating agencies
   C. Statement of evaluation questions (Were program implementation objectives attained? If not, why not? What were the barriers to and facilitators of attaining implementation objectives?)
      Examples:
      - How successful was the project in implementing a parenting education class for mothers with substance abuse problems? What were the policies, practices, and procedures used to attain this objective? What were the barriers to, and facilitators of, attaining this objective?
      - How successful was the project in recruiting the intended target population and serving the expected number of participants? What were the policies, practices, and procedures used to recruit and maintain participants in the project? What were the barriers to, and facilitators of, attaining this objective?
      - How successful was the project in developing and implementing a multidisciplinary training curriculum? What were the practices and procedures used to develop and implement the curriculum? What were the barriers to, and facilitators of, attaining this objective?
      - How successful was the project in establishing collaborative relationships with other agencies in the community? What were the policies, practices, and procedures used to attain this objective? What were the barriers to, and facilitators of, attaining this objective?
   D. Description of data collection methods and data collected for each evaluation question
      - Description of data collected
      - Description of data collection methodology
      - Description of data sources (such as project documents, project staff, project participants, and collaborating agency staff)
      - Description of sampling procedures
      - Description of data analysis procedures
   E. Description of results of analysis
      - Statement of findings with respect to each evaluation question
        Examples:
        - The project's success in attaining the objective
        - The effectiveness of particular policies, practices, and procedures in attaining the objective
        - The barriers to and facilitators of attainment of the objective
   F. Statement of issues that may have affected the evaluation's findings
      Examples:
      - The need to make changes in the evaluation because of changes in program implementation or characteristics of the population served
      - Staff turnover in the project, resulting in inconsistent data collection procedures
      - Changes in evaluation staff
III. Evaluation of Participant Outcome Objectives
   A. Description of participant outcome objectives (in measurable terms)
      - What changes were participants expected to exhibit as a result of their participation in each service/intervention/training module provided by the project?
      - What changes were participants expected to exhibit as a result of participation in the project in general?
      - What changes were expected to occur in the community's service delivery system as a result of the project?
   B. Statement of evaluation questions, evaluation design, and method for assessing change for each question
      Examples:
      - How effective was the project in attaining its expected outcome of decreasing parental substance abuse? How was this measured? What design was used to establish that a change occurred and to relate the change to the project's interventions (such as preintervention and postintervention measures, control groups, or comparison groups)? Why was this design selected?
      - How effective was the project in attaining its expected outcome of increasing children's self-esteem? How was this measured? What design was used to establish that a change occurred and to relate the change to the project's interventions? Why was this design selected?
      - How effective was the project in increasing the knowledge and skills of training participants? How was this measured? What design was used to establish that a change occurred and to relate the change to the project's interventions? Why was this design selected?
   C. Discussion of data collection methods (for each evaluation question)
      - Data collected
      - Method of data collection
        Examples:
        - Case record reviews
        - Interviews
        - Self-report questionnaires or inventories (if you developed an instrument for this evaluation, attach a copy to the final report)
        - Observations
      - Data sources (for each evaluation question) and sampling plans, when relevant
   D. Discussion of issues that affected the outcome evaluation and how they were addressed
      - Program-related issues
        - Staff turnover
        - Changes in target population characteristics
        - Changes in services/interventions during the course of the project
        - Changes in staffing plans
        - Changes in collaborative arrangements
        - Characteristics of participants
      - Evaluation-related issues
        - Problems encountered in obtaining participant consent
        - Changes in the number of participants served, requiring changes in analysis plans
        - Questionable cultural relevance of evaluation data collection instruments and/or procedures
        - Problems encountered due to participant attrition
   E. Procedures for data analyses
   F. Results of data analyses (a brief illustrative analysis sketch follows this outline)
      - Significant and negative analysis results (including a statement of the established level of significance) for each outcome evaluation question
      - Promising but inconclusive analysis results
      - Issues/problems relevant to the analyses
        Examples:
        - Issues relevant to data collection procedures, particularly consistency in methods and consistency across data collectors
        - Issues relevant to the number of participants served by the project and those included in the analysis
        - Missing data or differences in sample size across analyses
   G. Discussion of results
      - Interpretation of results for each evaluation question, including any explanatory information from the process evaluation
        - The effectiveness of the project in attaining a specific outcome objective
        - Variables associated with attainment of specific outcomes, such as characteristics of the population, characteristics of the service provider or trainer, duration or intensity of services or training, and characteristics of the service or training
      - Issues relevant to interpretation of results
IV. Integration of Process and Outcome Evaluation Information
   A. Summary of process evaluation results
   B. Summary of outcome evaluation results
   C. Discussion of potential relationships between program implementation and participant outcome evaluation results
      Examples:
      - Did particular policies, practices, or procedures used to attain program implementation objectives have different effects on participant outcomes?
      - How did practices and procedures used to recruit and maintain participants in services affect participant outcomes?
      - What collaboration practices and procedures were found to be related to attainment of expected community outcomes?
      - Were particular training modules more effective than others in attaining expected outcomes for participants? If so, what features of these modules may have contributed to their effectiveness (such as characteristics of the trainers, characteristics of the curriculum, or the duration and intensity of the services)?
V. Recommendations to Program Administrators or Funders for Future Program and Evaluation Efforts
   Examples:
   - Based on the evaluation findings, it is recommended that the particular service approach developed for this program be used to target mothers who are 25 years of age or older. Younger mothers do not appear to benefit from this type of approach.
   - The evaluation findings suggest that traditional educational services are not as effective as self-esteem-building services in promoting attitude changes among adolescents regarding substance abuse. We recommend that future program development focus on providing these types of services to youth at risk for substance abuse.
   - Based on the evaluation findings, it is recommended that funders provide sufficient funding for evaluation to permit a long-term follow-up assessment of participants. The kinds of changes the program may bring about in participants may not be observable until 3 to 6 months after they leave the program.
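To make the outline's Sections III B, III E, and III F more concrete, the sketch below shows one common way to test a preintervention/postintervention outcome and to classify the result against a stated significance level. It is a minimal illustration, not the manual's prescribed method: the scores, the paired t-test design, and the 0.10 threshold used to flag "promising but inconclusive" results are all assumptions made for this example.

```python
# Illustrative pre/post outcome analysis (hypothetical data).
# Assumes one pair of scores per participant, collected before and after
# the intervention, with no control group (a simple preintervention/
# postintervention design; see Section III B of the outline).
from scipy import stats

ALPHA = 0.05  # established significance level, stated in the report (Section III F)

# Hypothetical parenting-skills scores for eight participants.
pre_scores = [12, 15, 11, 14, 10, 13, 16, 12]
post_scores = [16, 18, 12, 17, 15, 14, 19, 13]

# Paired t-test: did the mean change from pre to post differ from zero?
t_stat, p_value = stats.ttest_rel(post_scores, pre_scores)
mean_change = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)

# Classify the result the way Section III F asks it to be reported:
# significant, promising but inconclusive, or negative. The 0.10 cutoff
# for "promising" is an assumption for this sketch, not a standard.
if p_value < ALPHA:
    label = "significant"
elif p_value < 0.10:
    label = "promising but inconclusive; warrants further study"
else:
    label = "negative (still worth reporting)"

print(f"n = {len(pre_scores)}, mean change = {mean_change:.2f}, "
      f"t = {t_stat:.2f}, p = {p_value:.3f} -> {label}")
```

A design with a comparison or control group would replace the paired test with, for example, a two-sample test on change scores; whatever design is used, the report should explain why it was selected, as the evaluation questions in Section III B require.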