
IES Grant

Title: Center for the Analysis of Postsecondary Readiness
Center: NCER
Year: 2014
Principal Investigator: Bailey, Thomas
Awardee: Columbia University, Teachers College
Program: National Research and Development Centers
Award Period: 5 years (7/1/2014–6/30/2019)
Award Amount: $9,989,803
Goal: Multiple Goals
Award Number: R305C140007
Description:

Project Website: http://postsecondaryreadiness.org/

Purpose: During the past 40 years, the United States has made major advances in expanding access to postsecondary education, but many students arrive at college without the requisite English and math skills to perform college-level work. Community colleges and other open-access institutions typically respond by placing such students into developmental (or remedial) education courses. Unfortunately, recent studies indicate that students who are placed into developmental courses make slow progress in improving their skills and rarely earn college degrees.

The Center for the Analysis of Postsecondary Readiness will conduct research that will document current practices in developmental English and math education across the U.S., identify innovative practices in assessment and instruction, evaluate the efficacy of practices that show promise of improving student outcomes, and assess the scalability of these models. In addition to its focused program of research, the Center will engage in leadership and outreach activities that will convene policymakers, practitioners, and researchers interested in improving developmental education and assist efforts by states, colleges, and universities to bring effective models to scale.

Research Projects:

The Center’s focused program of research has three major components: one descriptive study and two evaluation studies. The goal of the descriptive study is to provide policymakers with better information on the developmental education practices currently in use. The goal of the evaluation studies is to determine whether particular innovations in instructional practices or assessments are more likely to lead to improved student outcomes.

Descriptive Study

The descriptive study will be built around a nationally representative survey of open-access two-year and nonselective four-year colleges. The total sample size required for the survey is 1,690 institutions. The researchers will collect detailed information on how widely colleges and states have adopted or are adopting specific reforms (e.g., changes in teaching methods), as well as on comprehensive reforms that affect their entire set of developmental education offerings or their connection with high school and college-level programs. As part of the descriptive study, the Center will also conduct qualitative interviews with 40 institutional personnel and 40 state-level leaders.

In addition, the researchers will draw on existing data and prior research and will conduct a detailed analysis of developmental students at a large multi-campus for-profit institution. The study will provide information on the number and characteristics of developmental education students, describe current reforms and trends, and address questions focused on assessment practices, institutional choices, instructional strategies, differential approaches, and college readiness standards. The study will also address questions about open-door policies and how potential changes may restrict eligibility, and it will explore efforts to increase faculty engagement in remedial reform, the role of new technologies and online education, and remediation at for-profit institutions.

Assessment Evaluation – RCT Study of a Data Analytics Assessment System

In the assessment evaluation, to be carried out in partnership with the State University of New York (SUNY), the Center will conduct a randomized controlled trial (RCT) to evaluate a data analytics method whereby colleges use multiple measures to predict student performance in college-level math and college-level English courses. In addition to placement test scores, these predictive measures will include high school GPA and possibly high school course-taking patterns and non-cognitive assessments. Over two years, the eligible population at each of the SUNY colleges ranges from approximately 3,200 to 14,000 students. Because not all entering students take the placement exams, and because some students may decline to participate, a conservative sample size is estimated at around 10,000 participants (approximately one quarter of all eligible students).

It is hypothesized that students placed using the data analytics method will be more successful in their college-level courses than students placed using the business-as-usual method. The outcomes of primary interest will be completion of the first college-level courses in the relevant areas and total college-level credits earned. Qualitative data from site visits will be collected to document how colleges are implementing the data analytics method. To test the hypothesis that the data analytics method improves on business-as-usual placement, the average outcomes for students assigned to the treatment and control groups will be compared using OLS regression. As part of the evaluation, the researchers will also conduct both cost and cost-effectiveness analyses.
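
As a rough illustration of this kind of multiple-measures design (not the Center's actual system), the sketch below fits a predictive placement model on historical records and then compares treatment and control outcomes with OLS, as the abstract describes. All file names, column names, the logistic model form, and the 0.5 cutoff are assumptions for illustration only.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    from sklearn.linear_model import LogisticRegression

    # Fit a predictive placement model on historical records (hypothetical
    # fields): placement test score and high school GPA predict whether a
    # student passed the first college-level course (0/1).
    hist = pd.read_csv("historical_records.csv")
    model = LogisticRegression().fit(
        hist[["placement_score", "hs_gpa"]], hist["passed_college_level"]
    )

    # Place incoming students: a predicted pass probability at or above an
    # assumed 0.5 cutoff sends the student directly to the college-level course.
    incoming = pd.read_csv("incoming_students.csv")
    p_pass = model.predict_proba(incoming[["placement_score", "hs_gpa"]])[:, 1]
    incoming["placement"] = np.where(p_pass >= 0.5, "college-level", "developmental")

    # Impact analysis described above: with random assignment, OLS of an
    # outcome on a treatment indicator estimates the placement method's effect.
    study = pd.read_csv("study_sample.csv")  # hypothetical analysis file
    impact = sm.OLS(
        study["college_credits"], sm.add_constant(study["treated"])
    ).fit()
    print(impact.summary())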

Instruction Study – RCT Study of a Comprehensive Instructional Reform: The New Mathways Project

In the instruction evaluation, the researchers will conduct an RCT to test the effectiveness of the New Mathways Project (NMP), an innovative math reform developed by the Charles A. Dana Center at the University of Texas at Austin that is poised to scale across the state of Texas. NMP uses a student-centered, activities-based pedagogy to better engage students. It also contrasts with the traditional approach by accelerating progression through remediation, providing alternative curricula based on students’ academic and career goals, and tying the developmental content more closely to college-level programs of study. Under NMP, developmental math students with appropriate career and academic goals will be assigned to one of two pathways, Statistics or Quantitative Reasoning, depending on their intended major. These students take an accelerated one-semester developmental math course, followed by college-level statistics or quantitative reasoning.

The evaluation will involve four to six community colleges in Texas to measure the causal effects of the intervention. It will also feature an implementation study to assess implementation fidelity and the program-control contrast. The research sample will include about 2,000 students who are assessed as needing one or two levels of developmental math. Students who agree to participate and meet the study criteria will be randomly assigned either to a treatment group that enters the year-long NMP program or to a control group that enters the colleges’ traditional developmental and college-level math sequence. Students in the program group will be assigned to either the Statistics or the Quantitative Reasoning pathway based on their expected major.
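
A minimal sketch of the random assignment step follows, assuming an even split blocked by college; the abstract specifies random assignment but not the ratio or blocking scheme, so both are assumptions.

    import random

    def assign_by_college(ids_by_college, seed=0):
        """Randomly split each college's consenting students 50/50 into the
        NMP program group and the business-as-usual control group.
        The even split and blocking by college are assumptions."""
        rng = random.Random(seed)
        assignment = {}
        for college, ids in ids_by_college.items():
            shuffled = list(ids)
            rng.shuffle(shuffled)
            half = len(shuffled) // 2
            for sid in shuffled[:half]:
                assignment[sid] = "NMP"      # treatment: New Mathways Project
            for sid in shuffled[half:]:
                assignment[sid] = "control"  # traditional math sequence
        return assignment

    # Example with hypothetical student IDs at two colleges:
    groups = assign_by_college({"College A": range(1, 101), "College B": range(101, 181)})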

The evaluation outcomes include both proximal and distal measures in two broad categories: academic progression and skills. The main analysis will focus on confirmatory indicators of NMP’s success: progression in math courses (proximal), measured by the percentage of students who complete the developmental math sequence, the percentage who complete their first college-level math course, and the average number of math credits earned (developmental and college-level). The Center will also consider two further confirmatory outcomes: overall academic progress (proximal and distal), measured by the average number of college-level credits of any kind that students earn after one and two years in college; and overall academic completion (distal), measured by the percentage of students who earn a degree or certificate or who transfer to a four-year institution. Finally, the Center will examine learning skills in math by comparing common questions across math tests administered to both treatment and control students.
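
If the transcript outcomes were assembled into a student-level file, the confirmatory indicators could be tabulated by research group along these lines; every field name below is a hypothetical placeholder for the Center's transcript-based measures.

    import pandas as pd

    # One row per student; 0/1 completion flags and credit counts are
    # hypothetical stand-ins for the Center's transcript measures.
    outcomes = pd.read_csv("transcript_outcomes.csv")
    summary = outcomes.groupby("group").agg(
        pct_complete_dev_math=("completed_dev_math_sequence", "mean"),
        pct_complete_college_math=("completed_first_college_math", "mean"),
        avg_math_credits=("math_credits_earned", "mean"),
        avg_college_credits_year1=("college_credits_year1", "mean"),
        pct_degree_or_transfer=("earned_award_or_transferred", "mean"),
    )
    print(summary.round(3))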

Leadership and Dissemination Activities:

The Center will host two policy forums, along with annual meetings of Center researchers and selected institutional and state representatives, to present findings and review progress. It also plans a large national conference in Year 5 to showcase the work of the Center and to identify promising areas of research and development for the field. The Center will also work with a small group of stakeholders to assist in their efforts to improve developmental education. This work will address feasible policy design, the applicability of promising reform models to local contexts, improvement or adaptation of particular models, potential barriers to implementation, and proposals for evaluation.

Dissemination and outreach will also occur through various media and via an interactive Center website. The Center will produce and distribute a variety of publications on the Center’s research, including reports, working papers, and peer-reviewed journal articles. To reach a broader audience, the Center will also seek media coverage for important findings in both education trade journals and popular news outlets.

Key Personnel

Thomas Bailey (Teachers College, Columbia University), Lashawn Richburg-Hayes (MDRC), Elisabeth Barnett (Teachers College, Columbia University), Clive Belfield (CUNY), Eric Bettinger (Stanford University), Angela Boatman (Vanderbilt University), Nicole Edgecombe (Teachers College, Columbia University), Robert Ivry (MDRC), Shanna Jaggars (Teachers College, Columbia University), Michal Kurlaender (University of California, Davis), Susanna Loeb (Stanford University), Alexander Mayer (MDRC), Lisa Rothman (Teachers College, Columbia University), Elizabeth Zachry Rutschow (MDRC), Judith Scott-Clayton (Teachers College, Columbia University), Doug Slater (Teachers College, Columbia University), Michael Weiss (MDRC)

Publications

Book chapters

Moore, J.L., Allen, J.M., and Camara, W.J. (2017). Empirically Based College- and Career-Readiness Cut Scores and Performance Standards. In M.N. Gaertner, K.L. McClarty, and K. Mattern (Eds.), Preparing Students for College and Careers: Theory, Measurement, and Educational Practice. New York: Routledge.

Reddy, V., and Barnett, E. (2017). College Placement Strategies: Evolving Considerations and Practices. In M.N. Gaertner, K.L. McClarty, and K. Mattern (Eds.), Preparing Students for College and Careers: Theory, Measurement, and Educational Practice. (pp. 100–111). New York: Routledge.

Working paper

Natow, R.S., Reddy, V., and Grant, M. (2017). How and Why Higher Education Institutions Use Technology in Developmental Education Programming. A CAPR Working Paper. Community College Research Center, Teachers College, Columbia University.

