
Survey Methodology:
Survey of Public Attitudes Toward and Understanding of Science and Technology

1. Overview

a. Purpose

NSF's Survey of Public Attitudes Toward and Understanding of Science and Technology is used to monitor public attitudes toward science and technology and public understanding of science concepts and the scientific process. The survey provides information used by education policymakers and researchers. It has been closely coordinated with surveys in other countries to facilitate international comparisons.

b. Respondents

The survey is completed by adults residing in the United States.

c. Key variables

2. Survey Design

a. Target population and sample frame

The target population is noninstitutionalized adults, age 18 or older, residing in the United States. The sample frame consists of residential households with working telephones.

b. Sample design

The Survey of Public Attitudes Toward and Understanding of Science and Technology calls for 2,000 completed interviews with adults residing in households with working telephones in the United States. Persons residing in group quarters and institutions (including military barracks) are excluded; military personnel residing off-base are included. The study uses a list-assisted random-digit-dial (RDD) design. The 2001 sample was generated using the Genesys sampling system from Marketing Systems Group (MSG), incorporating a list-assisted, one-block frame. A random sample of 10,000 telephone numbers was generated to represent the population of the United States, including those residing in Alaska, Hawaii, and the District of Columbia. The sample was divided into 50 replicates of 200 numbers each: one replicate was used during a pretest of the survey instrument, and the remaining 49 were used in the primary fielding.
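
The replicate structure described above can be sketched in a few lines; the phone-number format and the seed are invented for illustration, and the actual Genesys system generates list-assisted RDD numbers rather than this placeholder list.

```python
import random

# Sketch of dividing an RDD sample into replicates, as described above:
# 10,000 generated numbers split into 50 replicates of 200, one held
# out for the pretest and 49 used in the primary fielding.
random.seed(0)  # for reproducibility of the sketch

# Stand-in for the Genesys-generated telephone numbers (hypothetical).
sample = [f"555-{i:04d}" for i in range(10_000)]
random.shuffle(sample)

replicates = [sample[i:i + 200] for i in range(0, len(sample), 200)]
pretest = replicates[0]          # one replicate of 200 for the pretest
main_fielding = replicates[1:]   # remaining 49 replicates
```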

Respondents within households were selected using the most-recent-birthday technique: the individual aged 18 or older with the most recent birthday was considered the eligible respondent at the number dialed. Interviews were conducted in English or Spanish.
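
The most-recent-birthday rule can be sketched as a small helper. The function name and household data below are hypothetical, and leap-day birthdays are ignored for simplicity.

```python
from datetime import date

def most_recent_birthday(adults, today):
    """Pick the adult whose birthday occurred most recently on or
    before `today`. `adults` maps a household member's name to a
    (month, day) birthday. Hypothetical helper illustrating the
    within-household selection rule described above."""
    def days_since_birthday(month_day):
        month, day = month_day
        bday = date(today.year, month, day)
        if bday > today:  # birthday hasn't occurred yet this year
            bday = date(today.year - 1, month, day)
        return (today - bday).days

    return min(adults, key=lambda name: days_since_birthday(adults[name]))

household = {"Alice": (3, 14), "Bob": (11, 2)}
print(most_recent_birthday(household, date(2001, 11, 15)))
# → Bob (Nov 2 is the most recent birthday before Nov 15)
```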

c. Data collection techniques

The 2001 survey was conducted by ORC Macro (under contract to SRS). Primary data collection was done using computer-assisted telephone interviewing (CATI).

d. Estimation techniques

The Genesys random-digit sample has a design effect of 1.0, so it is not necessary to weight for primary sampling units or any other sample stratification. The basic sample produces a national random sample of households, not respondents. At the conclusion of interviewing, a weight was created for each case in the system file. The weighting algorithm was developed to correct for two distortions: first, households contain different numbers of eligible respondents, and only one respondent was selected from each household; second, differential response rates within an RDD sample produce a disproportionately high number of college graduates and a disproportionately small number of high school dropouts. To correct for differential rates of participation in the interviews, an initial 60-cell weighting matrix was used, comprising five age strata, two racial-ethnic strata, two sex strata, and three educational strata. Estimates for the U.S. population were obtained from the Census Bureau's Current Population Reports. The 60 cells were then collapsed into 32 cells to control for cells containing only a small number of respondents.
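
The cell-based adjustment can be sketched as follows. The cell labels, population shares, and interview counts here are invented for illustration; the actual survey used a 60-cell matrix collapsed to 32 cells, not the four cells shown.

```python
# Post-stratification sketch: each respondent's weight is the ratio of
# the cell's population share (from external benchmarks such as the
# Current Population Reports) to its share of completed interviews.
# All cells and counts below are hypothetical.
population_share = {"young_college": 0.15, "young_no_college": 0.35,
                    "older_college": 0.10, "older_no_college": 0.40}
sample_counts = {"young_college": 500, "young_no_college": 550,
                 "older_college": 350, "older_no_college": 600}  # 2,000 completes

n = sum(sample_counts.values())
weights = {cell: population_share[cell] / (sample_counts[cell] / n)
           for cell in sample_counts}

# College graduates are overrepresented among completes, so their
# weights fall below 1; underrepresented cells are weighted up.
for cell, w in weights.items():
    print(f"{cell}: {w:.2f}")
```

The weighted counts still sum to the number of completed interviews, so the weights redistribute rather than inflate the sample.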

3. Survey Quality Measures

a. Sampling variability

The coefficient of variation for a percentage estimate of 50 percent in the total population in 2001 was approximately 2.5 percent. Coefficients of variation were larger for subgroups of the population, while sampling errors decreased as estimates moved away from 50 percent.
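
Under simple random sampling (consistent with the design effect of 1.0 noted in section 2d), the coefficient of variation of a proportion estimate can be computed directly. With 2,000 completes, the figure at 50 percent comes out near 2.2 percent, close to the reported 2.5 percent; the published value may reflect additional adjustments not described here.

```python
import math

def cv(p, n):
    """Coefficient of variation of a proportion estimate under simple
    random sampling: the standard error divided by the estimate."""
    se = math.sqrt(p * (1 - p) / n)
    return se / p

print(f"{cv(0.50, 2000):.3f}")  # ~0.022, i.e. about 2.2 percent
print(f"{cv(0.50, 400):.3f}")   # larger for a subgroup of 400
```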

b. Coverage

Households without phones are not included in the sample frame and thus are not covered. There is also some undercoverage of individuals with recently installed phones due to the time lag between the selection of phone numbers and interviewing. Adjustment was made for multiple phone lines associated with a given household, correcting for differences in the probability of selection of that household. In addition, the benchmark to the Current Population Survey is designed to correct for some of the known biases introduced by coverage errors.
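
The multiple-line correction amounts to downweighting a household in proportion to its number of working lines, since each additional line gives the household another chance of being dialed. A minimal sketch, with a hypothetical helper name:

```python
def line_adjustment(num_voice_lines):
    """Selection-probability correction for a household reached by RDD:
    a household with k working voice lines is k times as likely to be
    dialed, so its weight is scaled by 1/k."""
    return 1.0 / max(num_voice_lines, 1)

print(line_adjustment(1))  # 1.0
print(line_adjustment(2))  # 0.5: two-line households weighted down by half
```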

c. Nonresponse

(1) Unit nonresponse - The cooperation rate for the 2001 survey was 51 percent, using the standard definitions for final outcome codes from the American Association for Public Opinion Research (AAPOR); nonresolved records averaged 36 call attempts.[1] The overall response rate, computed with the CASRO formula, was 39 percent.
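
A CASRO-style response rate treats numbers of unknown eligibility as eligible at the rate observed among resolved numbers. The sketch below uses invented call-outcome counts (the actual tallies are not given in this section); with these counts the result happens to land near the reported 39 percent.

```python
def casro_response_rate(completes, eligible_noninterviews,
                        known_ineligible, unknown_eligibility):
    """CASRO-style response rate: completed interviews divided by the
    estimated number of eligible cases, where cases of unknown
    eligibility are prorated by the eligibility rate `e` observed
    among resolved cases. Hypothetical helper for illustration."""
    resolved_eligible = completes + eligible_noninterviews
    e = resolved_eligible / (resolved_eligible + known_ineligible)
    return completes / (resolved_eligible + e * unknown_eligibility)

rate = casro_response_rate(completes=2000, eligible_noninterviews=1900,
                           known_ineligible=3000, unknown_eligibility=2400)
print(f"{rate:.2f}")  # ~0.38 with these invented counts
```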

(2) Item nonresponse - Item nonresponse was minimal for noncritical items in the survey. For the vast majority of the questions included in the 2001 survey, no respondent refused to answer. Few questions had item nonresponse greater than one-half of one percent.

d. Measurement

Opinion and attitude questions are by their nature relatively prone to measurement error, since slight changes in question wording or ordering can have a significant impact on response. A large number of items in the survey have been repeated over time, and analyses indicate that these items are stable and tend to correlate with other related items, suggesting that they measure the same underlying constructs.

4. Trend Data

Science and Engineering Indicators has contained information on public attitudes toward science and technology in every biennial edition since 1972 (except 1978). A significant restructuring of the survey was undertaken in 1979, which has provided the framework for subsequent surveys. Time trends for many of the variables can be constructed for the years 1979, 1981, 1985, 1988, 1990, 1992, 1995, 1997, 1999, and 2001.

5. Availability of Data

a. Publications

The data from this survey are published biennially in Science and Engineering Indicators, available on the SRS Web site.

b. Electronic access

Data from this survey are available on the SRS Web site.

c. Contact for more information

Additional information about this survey can be obtained by contacting:

Melissa Pollak
Senior Analyst
Science and Engineering Indicators Program
Division of Science Resources Statistics
National Science Foundation
4201 Wilson Boulevard, Suite 965
Arlington, VA 22230
(703) 292-7808
via e-mail at mpollak@nsf.gov

6. Questionnaire


Footnotes

[1] The cooperation rate for this survey is defined as the number of respondents divided by the number of in-scope households for which there was a response to the phone within the allotted six tries. Note that, by definition, the response rate is lower than the cooperation rate.

Last Modified: May 30, 2002. Comments to srsweb@nsf.gov