You are viewing a Web site, archived on 07:59:28 Nov 18, 2004. It is now a Federal record managed by the National Archives and Records Administration.

Survey Methodology:
Survey of Federal Science and Engineering Support to Universities, Colleges, and Nonprofit Institutions

1. Overview

a. Purpose

The Survey of Federal Science and Engineering (S&E) Support is a congressionally mandated survey that is the only source of comprehensive data on Federal science and engineering funding to individual academic and nonprofit institutions. It is used by Federal policymakers and others interested in S&E trends, including State and local government officials, university policy analysts, R&D managers, and nonprofit institution administrators. It is used by NSF and several other Federal agencies for internal administrative purposes. For example, NSF's EPSCoR (Experimental Program to Stimulate Competitive Research) uses the information to target States receiving relatively low Federal R&D funds.

b. Respondents

The survey is completed by Federal agency representatives. In most cases separate submissions are made for subagencies.

c. Key variables

Note that the variables in this survey use definitions comparable to those used by the Office of Management and Budget and the Survey of Federal Funds for Research and Development.

2. Survey Design

a. Target population and sample frame

The target population consists of as many as 21 Federal agencies that incur virtually all of the obligations for Federal academic R&D. Note that some agencies not included in this survey may account for a significant percentage of total funding received by some institutions covered in this report, even though those funds may constitute a small proportion of total academic R&D.

The list of agencies to be surveyed is obtained from information in the Federal Funds survey.

b. Sample design

All 21 Federal agencies in the target population are surveyed.

c. Data collection techniques

The 1998 survey was conducted by SRS with assistance from the QRC Division of Macro International under contract to SRS. Data collection commenced with a phone call to verify each respondent's name, address, fax and phone numbers, and e-mail address, followed by a small mailed packet containing an informational brochure on FSSWEB, the Web-based data collection system. FSSWEB is part of NSF's effort to enhance survey reporting and to reduce data collection and processing costs by offering respondents direct online reporting and editing. Because the survey code book and instructions are now part of FSSWEB, the contractor no longer delivers diskettes or hard copies of those documents to agencies. A few agencies do not use FSSWEB because they submit their data by other electronic means. Information was collected for the Federal fiscal year (October 1 through September 30). Data collection started in February, with a requested due date in May. Data editing used both manual reviews for obvious errors and automated checks, including a comparison of each agency's current-year obligations, by category of support, with its prior-year obligations. When necessary, problems were referred back to the submitting agency.
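The prior-year comparison described above can be sketched as a simple automated edit check. This is an illustrative sketch only, not the actual SRS editing system; the agency names, categories, dollar figures, and the 50 percent flagging threshold are all assumptions made for the example.

```python
# Hypothetical automated edit check: compare each agency's current-year
# obligations, by category of support, with its prior-year obligations and
# flag large year-to-year swings for manual follow-up with the agency.
# All data values and the threshold are illustrative, not real survey data.

PRIOR = {("NSF", "R&D"): 2_100, ("NSF", "Fellowships"): 300}
CURRENT = {("NSF", "R&D"): 2_250, ("NSF", "Fellowships"): 90}

def flag_discrepancies(current, prior, threshold=0.5):
    """Return (agency, category) pairs whose obligations changed by more
    than `threshold` (as a fraction of the prior-year value)."""
    flagged = []
    for key, prior_value in prior.items():
        current_value = current.get(key, 0)
        if prior_value and abs(current_value - prior_value) / prior_value > threshold:
            flagged.append(key)
    return flagged

print(flag_discrepancies(CURRENT, PRIOR))  # Fellowships fell 70%, so it is flagged
```

In practice such a check would only refer records for review; as the text notes, flagged discrepancies go back to the submitting agency rather than being corrected automatically.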

d. Estimation techniques

Since this is a census of the 21 major Federal agencies that obligate R&D funds for academic and nonprofit institutions and since there is no unit nonresponse or known item nonresponse, no weighting or imputation techniques are used.

3. Survey Quality Measures

a. Sampling variability

Since all Federal agencies in the population are surveyed, there is no sampling error in the survey.

b. Coverage

Since identifying the appropriate Federal agencies is straightforward, coverage is considered excellent. However, for users interested in total Federal S&E obligations to academic and nonprofit institutions, there is a minor coverage problem: agencies excluded from the survey may nonetheless provide funds to these institutions. While this undercoverage is not believed to have a significant impact on total funding, it may be significant for understanding the funding of some institutions.

c. Nonresponse

(1) Unit nonresponse - The response rate for this survey is 100 percent, so there is no unit nonresponse bias.

(2) Item nonresponse - There is no known item nonresponse.

d. Measurement

We believe that the major source of nonsampling error in this survey is measurement error. The survey instrument is complex, and agencies may not always be able to provide the precise information desired. For example, Federal agencies are not always able to identify the branch of a university system to which their funds go. This means that complete disaggregation by actual university may not be feasible for some university systems.

Problems have also been noted at times in the past in determining the correct institutional code to assign to some universities. These coding errors can lead to considerable measurement error in the estimate of funding for the affected institutions in the years affected.
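One routine defense against the coding errors described above is to validate submitted institution codes against a master list before tabulation. The sketch below is hypothetical (it is not the SRS production system), and the codes and institution names are invented for illustration.

```python
# Illustrative validation step: split agency-reported (code, name) records
# into those whose institutional code appears in a master crosswalk and
# those that need manual review. Codes and names here are made up.

MASTER = {
    "001234": "State University - Main Campus",
    "001235": "State University - Branch Campus",
}

def check_codes(records, master=MASTER):
    """Partition submitted (code, name) records into matched and unmatched."""
    matched, unmatched = [], []
    for code, name in records:
        (matched if code in master else unmatched).append((code, name))
    return matched, unmatched

ok, bad = check_codes([("001234", "State University"), ("999999", "Unknown College")])
```

Unmatched records would then be referred back to the submitting agency, consistent with the editing procedure described in section 2.c.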

Other problems include agency difficulties in matching program descriptions to the proper funding category (R&D, facilities and equipment for instruction, etc.) in the Federal S&E Support database. NASA, for example, has placed increased emphasis on including education components in projects, and education components are always reported as "other S&E." At least one agency has said that the general support for S&E and "other S&E" categories are a catchall for programs that do not fit anywhere else. (See the "Report on the NSF Federal Support Survey Issues Workshop" held on May 20, 1999.)

4. Trend Data

The survey has been conducted annually since 1965. The initial survey elicited information only about academic institutions. Information on nonprofit organizations was added in 1968, and in 1971 the survey was expanded to collect information by S&E field. The survey no longer collects data on non-S&E activities or by S&E field. Beginning with FY 1999, data on FFRDCs are no longer collected.

The list of nonprofit institutions for which information is requested has increased substantially over time.

In studying trends other than published trends, analysts are encouraged to discuss their plans with the SRS survey project officer. When the review of data for consistency between each year's data and submissions for prior years reveals discrepancies, it is sometimes necessary to modify prior-year data.

5. Availability of Data

a. Publications

The data from this survey are published annually in Detailed Statistical Tables in the series Federal Science and Engineering Support to Universities, Colleges, and Nonprofit Institutions, available on the SRS Web site. Data for major data elements are available starting in 1963.

Information from this survey is also included in Institutional Profiles.

b. Electronic access

Data from this survey are available on the SRS Web site and on WebCASPAR. Selected aggregate data are available in public use data files upon request.

c. Contact for more information

Additional information about this survey can be obtained by contacting:

Richard Bennof
Program Analyst
Research and Development Statistics Program
Division of Science Resources Studies
National Science Foundation
4201 Wilson Boulevard, Suite 965
Arlington, VA 22230
(703) 292-7783
via e-mail at rbennof@nsf.gov
Last Modified: Aug 21, 2000
Comments to srsweb@nsf.gov