You are viewing a Web site, archived on 07:54:46 Nov 18, 2004. It is now a Federal record managed by the National Archives and Records Administration.

Survey Methodology:
National Survey of Recent College Graduates

1. Overview

a. Purpose

The National Survey of Recent College Graduates (NSRCG) provides information about individuals who recently obtained bachelor's or master's degrees in a science or engineering (S&E) field. This group is of special interest to many decision makers because it represents individuals who have recently made the transition from school to the workplace. The survey also provides information about individuals attending graduate school. The results of this survey are vital for educational planners within the Federal Government and in academia. The results are also used by employers in all sectors (education, industry, and government) to understand and predict trends in employment opportunities and salaries in S&E fields for recent graduates and to evaluate the effectiveness of equal opportunity efforts. This survey is also a component of the Scientists and Engineers Statistical Data System (SESTAT), which provides data on the total number and characteristics of individuals with training or employment in science and engineering in the United States.

b. Respondents

Respondents are individuals who recently received bachelor's or master's degrees in an S&E field from a U.S. institution and were living in the U.S. during the survey reference week.

c. Key variables

2. Survey Design

a. Target population and sample frame

The population of the 2001 survey consisted of all individuals:

The sample frame of schools for the first stage of the sample is obtained from the Integrated Postsecondary Education Data System (IPEDS) database maintained by the National Center for Education Statistics. The sample frame for the selection of graduates is obtained from representatives of the institutions selected at the first stage.

b. Sample design

The NSRCG sample is a two-stage sample. The first stage consists of selecting U.S. institutions that grant bachelor's or master's degrees in science and/or engineering fields. The second stage consists of selecting the bachelor's and master's degree recipients who received S&E degrees from the institutions selected in the first stage. Institutions that produce relatively large numbers of S&E degrees are selected with certainty. Other institutions are selected with probability proportional to a measure of size, which reflects the maximum percentage of graduates in each of the degree fields within degree-level categories. The measure of size is adjusted to increase the probability of selecting institutions with relatively high percentages of graduates in targeted minority groups. In 2001, 280 institutions were selected in the first stage; 107 were selected with certainty and 173 with probability proportional to size.
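The first-stage selection described above can be sketched in code. This is an illustrative sketch only: the peel-off rule for certainty institutions and the use of systematic probability-proportional-to-size (PPS) sampling are assumptions made for the example, not a description of the actual NSRCG selection algorithm, and the frame data are hypothetical.

```python
import random

def first_stage_sample(frame, n_sample, seed=2001):
    """Illustrative first-stage selection: institutions whose measure of
    size meets or exceeds the sampling interval are taken with certainty;
    the remainder are drawn by systematic PPS sampling.

    `frame` is a list of (institution_id, measure_of_size) pairs.
    """
    rng = random.Random(seed)
    certainty, rest, n = [], list(frame), n_sample
    # Peel off certainty units until no remaining unit exceeds the interval.
    while True:
        interval = sum(size for _, size in rest) / n
        big = [u for u in rest if u[1] >= interval]
        if not big:
            break
        certainty += big
        rest = [u for u in rest if u[1] < interval]
        n -= len(big)
    # Systematic PPS draw over the non-certainty institutions.
    rest.sort(key=lambda u: -u[1])  # any fixed ordering works
    step = sum(size for _, size in rest) / n
    target = rng.uniform(0, step)
    picks, cum = [], 0.0
    for unit in rest:
        cum += unit[1]
        while target < cum:
            picks.append(unit)
            target += step
    return certainty, picks
```

Because every non-certainty size is below the sampling interval, no institution can be picked twice, and each unit's selection probability is proportional to its measure of size.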

Target sampling rates for each degree field are determined. Target rates are higher for small fields than for large ones. These target rates are used in conjunction with the first-stage sampling rates for the institutions to determine the sampling rate to be used at the second stage for each institution, graduation year, degree field, and level of degree combination. In addition, graduates identified by the institutions as black, Hispanic, or American Indian were oversampled by a factor of three.
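The interplay of target rates, first-stage probabilities, and minority oversampling might be summarized as follows. The function, the numeric inputs, and the cap at 1 are illustrative assumptions, not the published NSRCG formulas.

```python
def second_stage_rate(target_rate, inst_selection_prob, oversampled=False):
    """Illustrative second-stage sampling rate for one institution /
    graduation year / degree field / degree level combination: the
    overall target rate for the field divided by the institution's
    first-stage selection probability, tripled for oversampled groups
    (black, Hispanic, or American Indian graduates), capped at 1."""
    rate = target_rate / inst_selection_prob
    if oversampled:
        rate *= 3  # oversampling factor stated in the text
    return min(rate, 1.0)
```

Dividing by the institution's selection probability makes the overall (two-stage) selection probability for a graduate approximately equal to the field's target rate.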

A total of 13,516 individuals were selected in 2001.

c. Data collection techniques

Westat, Inc. conducted the 2001 survey for SRS under contract. Data collection for individual respondents in 2001 was conducted primarily by computer-assisted telephone interviewing (CATI). Mail data collection procedures were used in a small number of cases when an address could be obtained but no phone number was available or the respondent requested a mail instrument.

The 2001 instrument used in the mail phase of the survey was very similar to the CATI instrument. Both instruments were designed to be as similar as possible to the instruments used in the National Survey of College Graduates (NSCG) and the Survey of Doctorate Recipients (SDR) to facilitate combining results into estimates of the total S&E population. A few questions in the NSRCG, however, obtain information of special interest for the population of recent graduates. For example, the NSRCG has more information related to education than does the NSCG.

Information in the 2001 survey was collected for the week of April 15, 2001. Data collection took place between June 2001 and May 2002.

d. Estimation techniques

A weight is attached to each responding graduate record so that characteristics of the full population of graduates can be estimated. The weights were created in the following stages:

  1. an institution base weight was created, equal to the inverse of the probability of selecting the institution;
  2. the institution base weight was then adjusted for nonresponse by creating nonresponse adjustment cells based on institutional control and size;
  3. the institution weights were further adjusted by a ratio adjustment factor based on the number of graduates reported in the IPEDS by degree and major;
  4. the institution weight was then multiplied by the inverse of the probability of selecting the graduate within the institution to form a graduate weight;
  5. the graduate weight was adjusted for nonresponse by using weighting cells based on year of graduation, degree, and major field of study (foreign address graduates were a separate cell); and
  6. the graduate nonresponse adjusted weights were further modified to account for the possibility that the graduates could have been selected twice. Graduates who obtained more than one degree during the time period (both a bachelor's and a master's degree, for example) could have been sampled twice.
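The six steps above compose multiplicatively into a single graduate weight. A minimal sketch, with all adjustment factors passed in as hypothetical inputs and the multiplicity adjustment assumed to be a simple division by the number of times the graduate could have been sampled:

```python
def graduate_weight(p_institution, inst_nr_adj, ipeds_ratio_adj,
                    p_graduate, grad_nr_adj, multiplicity):
    """Illustrative composition of the six weighting steps for one
    responding graduate; every argument here is a hypothetical input."""
    w = 1.0 / p_institution    # step 1: institution base weight
    w *= inst_nr_adj           # step 2: institution nonresponse adjustment
    w *= ipeds_ratio_adj       # step 3: IPEDS ratio adjustment
    w *= 1.0 / p_graduate      # step 4: graduate base weight
    w *= grad_nr_adj           # step 5: graduate nonresponse adjustment
    w /= multiplicity          # step 6: adjust for possible double selection
    return w
```

For example, a graduate from an institution selected with probability 1/4 and sampled within it at rate 1/10 carries a base weight of 40 before the adjustment factors are applied.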

In addition to creating estimation weights for each graduate, a hot deck imputation procedure was used to estimate missing item values, using responses from other graduates who had similar characteristics (age, major, gender, etc.).
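A minimal sketch of the hot deck idea, assuming a flat record layout and fixed imputation cells; the field names and cell keys are hypothetical, not the actual NSRCG imputation classes.

```python
import random

def hot_deck_impute(records, item, cell_keys, seed=0):
    """Fill missing values of `item` with a randomly chosen donor value
    from a respondent in the same cell (records sharing `cell_keys`)."""
    rng = random.Random(seed)
    # Build donor pools from records that reported the item.
    cells = {}
    for r in records:
        if r.get(item) is not None:
            key = tuple(r[k] for k in cell_keys)
            cells.setdefault(key, []).append(r[item])
    # Impute each missing value from its cell's donor pool.
    for r in records:
        if r.get(item) is None:
            donors = cells.get(tuple(r[k] for k in cell_keys))
            if donors:
                r[item] = rng.choice(donors)
    return records
```

Because donors come from the same cell, imputed values preserve the relationship between the item and the characteristics (age, major, gender, etc.) that define the cells.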

3. Survey Quality Measures

a. Sampling variability

The sample size is sufficiently large that estimates based on the total sample should be subject to no more than moderate sampling error. However, sampling error can be quite substantial in estimating the characteristics of small subgroups of the population. Estimates of the sampling errors associated with various measures are included in the methodology report for the survey and in the basic publications.
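Why subgroup estimates are noisier can be seen from the standard error of an estimated proportion: it shrinks with the square root of the subgroup sample size. The sketch below inflates the simple-random-sampling formula by an assumed design effect; the `deff` value is purely illustrative, not an NSRCG estimate.

```python
import math

def relative_standard_error(p, n, deff=1.5):
    """Rough relative standard error of an estimated proportion p based
    on n sampled cases, under an assumed design effect `deff`."""
    se = math.sqrt(deff * p * (1 - p) / n)
    return se / p
```

For a fixed proportion, an estimate based on 100 cases has ten times the relative standard error of one based on 10,000 cases, which is why estimates for small subgroups must be interpreted with caution.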

b. Coverage

The major source of coverage error is the failure of institutions to identify someone as having received a degree of interest. This failure can arise when institutional records are incorrect (e.g., when incorrect dates for degree receipt are recorded or incorrect degree fields are recorded). It also can arise because of the difficulty in correctly classifying the degree fields granted into the taxonomy that NSF uses to identify whether the degree field is in-scope. In order to minimize the impact of this latter problem, individuals with ambiguous degree fields are included in the sample and eliminated if their responses indicate they are out-of-scope.

c. Nonresponse

(1) Unit nonresponse - The response rate for the first stage of the 2001 survey (the institution-level response rate) was 99 percent; the second-stage rate was 80 percent. To minimize the impact of this source of error, results are adjusted for nonresponse through statistical weighting techniques.

(2) Item nonresponse - Item nonresponse rates were low for the 2001 survey. The average nonresponse rate for all items, excluding employer address, was 2.6 percent. All missing data were imputed using a hot deck imputation procedure.

d. Measurement

Several of the key variables in this survey are difficult to measure and thus are relatively prone to measurement error. For example, individuals do not always know the precise definitions of occupations that are used by experts in the field and may thus select occupational fields that are technically incorrect. In order to reduce measurement error, the instrument was pretested, using focus groups and a CATI pretest. The NSRCG instrument also benefited from the extensive pretesting of the NSCG instrument, since most NSRCG questions also appear on the NSCG.

The study of measurement reliability for the NSCG should provide relevant information for the NSRCG.

As in any multimode survey, the measurement errors associated with the different modes are likely to differ somewhat. This source of measurement error is especially troublesome because the propensity to respond by one mode or the other is likely to be associated with variables of interest in the survey. To the extent that certain types of individuals are more likely to respond by one mode than another, the multimode approach may have introduced some systematic biases into the data. Because the large majority of NSRCG respondents responded by CATI and very few responded by mail, this mode-related bias is reduced when analyzing the population of recent college graduates, but it might be increased when combining NSRCG data into the integrated SESTAT file, where respondents to the other SESTAT surveys provided data through a variety of modes. SRS and the Census Bureau have designed a special study to investigate the extent of this bias for the NSCG; given the similarities between the NSRCG and the NSCG, those results should provide insights about the NSRCG as well.

4. Trend Data

There have been a number of changes in the definition of the population surveyed over time. For example, the surveys conducted in the 1980s included individuals receiving bachelor's degrees in fields such as engineering technology; these are excluded from the surveys conducted in the 1990s. The survey improvements made in 1993 are sufficiently great that SRS staff believe that trend analyses between the data from the surveys conducted during and after 1993 and the surveys in prior years must be performed very cautiously, if at all.

5. Availability of Data

a. Publications

The data from this survey are published biennially in Detailed Statistical Tables in the series Characteristics of Recent Science and Engineering Graduates, as well as in several Data Briefs and Issue Briefs.

Information from this survey is also included in Science and Engineering Indicators and Women, Minorities, and Persons With Disabilities in Science and Engineering.

b. Electronic access

Data from this survey are available on the SRS Web site and on SESTAT. Selected aggregate data are available in public use data files upon request. Access to restricted data for researchers interested in analyzing microdata can be arranged through a licensing agreement.

c. Contact for more information

Additional information about this survey can be obtained by contacting:

John Tsapogas
Senior Program Analyst
Human Resources Statistics Program
Division of Science Resources Statistics
National Science Foundation
4201 Wilson Boulevard, Suite 965
Arlington, VA 22230
(703) 292-7799
via e-mail at jtsapoga@nsf.gov
Last Modified: Nov 04, 2004 Comments to srsweb@nsf.gov