Directorate for Education
and Human Resources

Division of Research,
Evaluation and Communication
National Science Foundation



User-Friendly Handbook for
Mixed Method Evaluations


The National Science Foundation (NSF) provides awards for research and education in the sciences and engineering. The awardee is wholly responsible for the conduct of such research and preparation of the results for publication. NSF, therefore, does not assume responsibility for the research findings or their interpretation.

NSF welcomes proposals from all qualified scientists and engineers and strongly encourages women, minorities, and persons with disabilities to compete fully in any of the research-related programs described here. In accordance with federal statutes, regulations, and NSF policies, no person on grounds of race, color, age, sex, national origin, or disability shall be excluded from participation in, be denied the benefits of, or be subject to discrimination under any program or activity receiving financial assistance from NSF.

Facilitation Awards for Scientists and Engineers with Disabilities (FASED) provide funding for special assistance or equipment to enable persons with disabilities (investigators and other staff, including student research assistants) to work on NSF projects. See the program announcement or contact the program coordinator at (703) 306-1636.

NSF has TDD (Telecommunications Device for the Deaf) capability, which enables individuals with hearing impairments to communicate with NSF about programs, employment, or general information. To access NSF TDD, dial (703) 306-0090; for the Federal Information Relay Service (FIRS), 1-800-877-8339.


Appreciation is expressed to the members of our external advisory panel, Dr. Frances Lawrenz, Dr. Jennifer Greene, Dr. Mary Ann Millsap, and Steve Dietz, for their comprehensive reviews of this document and their helpful suggestions. We also appreciate the direction provided by Dr. Conrad Katzenmeyer and Mr. James Dietz of the Division of Research, Evaluation and Communication.


User-Friendly Handbook for Mixed Method Evaluations


Edited by

Joy Frechtling
Laure Sharp

August 1997


NSF Program Officer
Conrad Katzenmeyer



This handbook was developed with support from the National Science Foundation under grant RED 94-52965.


Table of Contents


Part I. Introduction to Mixed Method Evaluations

  1. Introducing This Handbook
    (Laure Sharp and Joy Frechtling)
  2. Illustration: A Hypothetical Project
    (Laure Sharp)


Part II. Overview of Qualitative Methods and Analytic Techniques

  3. Common Qualitative Methods
    (Colleen Mahoney)
  4. Analyzing Qualitative Data
    (Susan Berkowitz)


Part III. Designing and Reporting Mixed Method Evaluations

  5. Overview of the Design Process for Mixed Method Evaluation
    (Laure Sharp and Joy Frechtling)
  6. Evaluation Design for the Hypothetical Project
    (Laure Sharp)
  7. Reporting the Results of Mixed Method Evaluations
    (Gary Silverstein and Laure Sharp)

Part IV. Supplementary Materials

  1. Annotated Bibliography
  2. Glossary


List of Exhibits

  1. Common techniques
  2. Example of a mixed method design
  3. Advantages and disadvantages of observations
  4. Types of information for which observations are a good source
  5. Advantages and disadvantages of indepth interviews
  6. Considerations in conducting indepth interviews and focus groups
  7. Which to use: Focus groups or indepth interviews?
  8. Advantages and disadvantages of document studies
  9. Advantages and disadvantages of using key informants
  10. Data matrix for Campus A: What was done to share knowledge
  11. Participants’ views of information sharing at eight campuses
  12. Matrix of cross-case analysis linking implementation and outcome factors
  13. Goals, stakeholders, and evaluation questions for a formative evaluation
  14. Goals, stakeholders, and evaluation questions for a summative evaluation
  15. Evaluation questions, data sources, and data collection methods for a formative evaluation
  16. Evaluation questions, data sources, and data collection methods for a summative evaluation
  17. First data collection plan
  18. Final data collection plan
  19. Matrix of stakeholders
  20. Example of an evaluation/methodology matrix
