
Services Research Outcomes Study (SROS)



  1. FIELDWORK AND DATA PREPARATION


This section describes how the National Opinion Research Center (NORC) organized and conducted SROS data collection activities. Stage 1 involved facility-level data collection and records abstraction with the cooperation of a nationwide sample of 120 drug treatment facilities. Stage 2 consisted of respondent interviews with a target sample of 3,000 clients who were discharged from treatment at those facilities during a 12-month index period in 1989–1990. This appendix summarizes survey protocols and procedures used during both data collection stages; it also provides operational outcomes and results.

Stage 1: Facility-Level Data Collection

During Stage 1 data collection, NORC attempted contact with the 120 facilities that contributed client-level abstraction data to DSRS to request cooperation with the SROS followup. When facilities were located and approached, SROS field staff requested that directors facilitate completion of four data collection tasks:

·Complete the Program Director Interview;

·Locate client sampling lists left at facilities by DSRS field staff;

·Reselect from client discharge lists in the index year, as needed, for the supplemental sample design or for replacement of missing DSRS client sampling lists; and

·Locate all selected client records for abstraction of fresh data.

Pretest of Facility-Level Data Collection at Eight Sites

Before launching the national SROS facility-level data collection, NORC conducted a field test of all associated survey instruments and procedures. The test occurred in the spring of 1992 at eight facilities chosen from the DSRS facility sample — two from each treatment modality, clustered in an eastern and a midwestern metropolitan area. The pretest demonstrated the feasibility of followup contacts with the targeted facilities. Most program directors remembered the DSRS study and agreed to participate in SROS. Site visits for data collection tasks were readily scheduled at seven of the eight facilities. One facility was lost because it was in the process of changing owners, and no responsible spokesperson could be identified in the brief course of the pretest fielding. The Program Director Interview was conducted at the seven other facilities. At six of the seven remaining facilities, the index year discharge lists earmarked by the DSRS sampling operation were recovered, and the sampling information was determined to be usable. Where the link to the DSRS client sample could not be found, it appeared that reconstruction of the index year discharge lists for fresh sampling would be feasible.

Two instruments for client record abstraction were successfully pretested — one for drawing, identifying, and locating information from all selected client records, and one for duplicating DSRS client data with clients newly selected for the supplemental sample.

Preparing for Facility-Level Activities at All 120 Sites

Following the pretest, final plans were made, interviewers were recruited, and training programs were prepared so that field operations could be launched promptly after approval by the Office of Management and Budget (OMB), which occurred in early 1994.

There were several criteria for recruitment of field staff: experience on drug outcome studies such as the California Alcohol and Drug Treatment Assessment and on studies approaching disadvantaged populations; experience in abstraction from medical records; and proximity to facility clusters. Interviewer training focused on all protocols involved in implementing site visits to complete the client sampling, records abstraction, and conduct of the Program Director Interview. The three-day training for facility site visits gave the field staff practice with forms and procedures to structure client sampling in a variety of situations, ranging from effortless retrieval of a DSRS sampling list in facility files to relisting index-year discharges and reconstructing the DSRS sample.

In addition, interviewers were trained on forms and procedures for selecting a supplementary sample once the DSRS client sample for a facility was identified and set aside. Practice with the Program Director Interview and with two abstraction instruments — one designed to obtain identifying and locating data for every selected client and one to replicate DSRS abstraction for freshly selected clients — was an important element in the training. Lectures, discussion, and practice with mocked-up materials covered such issues as general steps in listing and drawing a sample, confidentiality guidelines, maintaining good working relationships with facility staff, understanding typical client records and facility record systems, and editing and quality control of completed instruments and sampling materials.

Structuring Facility-Level Contacts

Initial telephone contacts with facilities to secure cooperation and arrange site visits were the responsibility of a staff of field managers. They monitored production and field costs of the interviewers who completed data collection tasks. The seven field managers, coordinated by a field project manager, reported weekly on cost and progress of the work to the SROS central office staff.

Facility-Level Field Activities

The field management staff supervised the team of interviewers who made the site visits for the purposes of sampling and abstraction. Each field manager initiated preliminary contact with staff at the facilities that interviewers visited. At some facilities, the field manager was in prolonged telephone contact with the staff persons serving as informants for the Program Director Interview. (Completion of that instrument was often extended well beyond the time period in which other site visit tasks were completed.) The progress of the work at a site was subject to daily monitoring. Interviewers worked with computer-generated Facility Information Sheets giving sampling rules for each facility, and Client Face Sheets summarizing demographic data and admission/discharge data for each treatment episode in the DSRS database. They were instructed to telephone their field managers from the site as soon as they had located DSRS sampling materials or otherwise defined sample lists for the index year. When work proceeded smoothly, field managers compiled reports showing the following information:

·Discharge list count before DSRS sampling;

·Total clients identified in DSRS database;

·Guidelines for supplementary sample selection:

— Discharge list count after subtraction of DSRS selected lines, and

— Sampling interval used to select supplementary sample;

·Total locating abstraction instruments completed; and

·Total supplementary abstraction instruments completed.

The figures were entered and updated on a spreadsheet transmitted weekly to the project staff. When discharge lists at a facility appeared to fall outside the upper and lower limits furnished on the Facility Information Sheet, interviewers were instructed to telephone their field managers immediately. The managers would check for problems with the completeness or the accuracy of the listings. At sites that had no supplementary sample, discrepancies were matters for inquiry, provided that the DSRS sampling materials clearly identified the DSRS-selected clients. The field managers had an additional resource in a DSRS sampling report that listed every selected case by line number of the discharge listing. At supplementary sample sites, any significant problems with anticipated limits of index-year discharges were immediately referred to the project sampling team before work proceeded. As appropriate, the sampling team would issue revised sample selection rules.
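The supplementary selection described here amounts to systematic sampling from the discharge list after the DSRS-selected lines have been set aside. As a rough illustration only (the function and variable names below are hypothetical, not from SROS materials), interval-based selection can be sketched as:

```python
import random

def select_supplementary(discharge_lines, dsrs_selected, target_n):
    """Illustrative systematic selection: drop lines already held by DSRS,
    then take every k-th remaining line from a random start."""
    remaining = [ln for ln in discharge_lines if ln not in dsrs_selected]
    if target_n <= 0 or not remaining:
        return []
    interval = max(1, len(remaining) // target_n)  # sampling interval
    start = random.randrange(interval)             # random start line
    return remaining[start::interval][:target_n]

lines = list(range(1, 101))        # 100 index-year discharge lines
dsrs = set(range(1, 101, 5))       # 20 lines already selected by DSRS
supp = select_supplementary(lines, dsrs, 10)
```

With 100 discharge lines and 20 already held by DSRS, the interval works out to 8, yielding a 10-case supplementary sample that never duplicates a DSRS selection.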

The material mailed to the project staff after a site visit included a set of forms documenting the sample selection and abstraction process, including the following:

·SROS Sampling Record, a list of the DSRS client sample by name and facility record number, with an SROS client ID number from the computer-generated Client Face Sheet summarizing DSRS client data. At facilities where DSRS sampling forms were recovered, this SROS form would duplicate the DSRS Sampling Worksheet except for the project-assigned client ID. Otherwise, the SROS Sampling Record was copied from earmarked discharges on a DSRS Listing Form or reflected a DSRS client sample reconstructed by means of data on the Client Face Sheets from a freshly created discharge listing.

·SROS Sampling Worksheet, a list of client names, discharge dates, facility record numbers, and SROS client ID numbers for a selected supplementary sample. Where the supplementary sample was large, an SROS Sampling Worksheet Continuation was attached.

This additional sampling form was left with facility staff where its use had been required, to be kept for six months in case of follow-up inquiry.

·SROS Listing Form, used to make fresh index year discharge lists at facilities where the DSRS Listing Form was not recovered and a facility-generated list was not available.

At the conclusion of Stage 1, a letter of thanks was mailed to facilities with a brief questionnaire on interviewer performance in completing site visit tasks. Most facilities responded, and the feedback on SROS interviewers was very positive. As a final quality control measure, the project office placed follow-up calls to those facilities whose case materials, as submitted, were missing critical items or contained ambiguous information. This final case reconciliation and cleanup corrected inconsistencies, omissions, and errors in client identifying data and ensured that SROS had a complete data record for every respondent reported as a completed abstraction from a site visit.

Results of Facility-Level Data Collection

Facility data collection began in the spring of 1994 and continued into the fall of 1994. Many facilities agreed to early site visits; by mid-June 1994, more than 2,400 record abstractions were complete. Analysis of sampling materials and abstraction instruments from final site visits revealed unanticipated sample attrition due to loss or duplication of individual client records. Some individual records could not be located in facility files and some client selections were duplicates; that is, the same individual had been sampled for more than one treatment episode in the index year. To compensate for attrition of individuals, the project sampling department augmented the client sample by adding 200 supplementary cases for fresh selection at nine of the larger facilities. By early August 1994, site visits had yielded almost 2,800 completed abstractions.

Achieving the final goal of 3,000+ client abstraction cases within the given time frame depended on finding "lost" facilities and reassuring doubtful facilities. Tracking down the stored client files of programs that had gone out of business since 1990 took much field effort. Some facility directors needed reassurance that the confidentiality and privacy of their discharged clients would be honored before they permitted the abstraction of identifying information from their records. A few facilities stipulated that the SROS sample design be approved by their own institutional review processes before agreeing to site visits; this subordinated the data collection schedule to the boards' quarterly meetings. Additional descriptions of SROS design and confidentiality protocols were developed for review by Institutional Review Boards (IRBs) or state agencies. Reassurances about the confidentiality procedures and protocols built into the study by NORC included such items as:

·Rulings from the University of Chicago’s IRB and from legal counsel to the Department of Health and Human Services that provisions in the Code of Federal Regulations authorizing disclosure of records for research purposes apply to the SROS design;

·Safeguards to respondent privacy and confidentiality in all locating inquiries, such that the subject of the study and source of the sample are never described except to properly identified respondents in private settings;

·Training designed to enhance interviewers’ sensitivity to confidentiality and privacy issues;

·A Certificate of Confidentiality obtained from the Department of Health and Human Services, protecting SROS interviewers and other research staff from any efforts to compel them to release data collected in the interview or in any operations connected with the interview;

·Consent statements read to and signed by respondents who agree to participate in the SROS interview, informing them that their data will be released in statistical summaries only and reminding them that they have the right to refuse the interview or to refuse response to specific items in the questionnaire; and

·NORC’s standard procedures to maintain confidentiality of data, including:

— Dissociation of respondent names from all data,

— Removal of information with potential to identify individuals before release of data,

— Restricted circulation of completed data collection instruments and forms to SROS project staff, and

— Maintenance of all files in locked and secure places.

If client sample selection and record abstraction had yielded insufficient cases or had been distributed disproportionately over the treatment modalities, project staff were prepared to augment the facility sample by approaching programs from the backup sample interviewed by DSRS. This step proved unnecessary. At the conclusion of Stage 1 data collection, SROS had completed record abstractions for a total client sample of 3,047 individuals discharged during the index year at 99 DSRS facilities: 22 hospital inpatient, 27 residential, 26 outpatient methadone, and 24 outpatient drug free. A total of 1,706 client abstractions were linked to the DSRS database, a recovery rate of 77 percent of the 2,222 clients abstracted by DSRS. The remaining 1,341 abstractions were from clients freshly selected from index-year discharges at 32 of the larger facilities, making the SROS sample more proportionate to the national number of discharges from treatment.
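The sample composition reported here fits together arithmetically; the short check below simply restates the figures from the text, rounding the recovery rate to the nearest percent:

```python
linked_to_dsrs = 1706     # abstractions matched to the DSRS database
fresh_selections = 1341   # freshly selected from index-year discharge lists
dsrs_clients = 2222       # clients originally abstracted by DSRS

total_sample = linked_to_dsrs + fresh_selections            # 3,047 individuals
recovery_rate = round(100 * linked_to_dsrs / dsrs_clients)  # 77 percent
facilities = 22 + 27 + 26 + 24                              # 99 DSRS facilities
```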

Stage 2: The Client Survey

Stage 2 of SROS consisted of field interviews with a target sample of 3,000 clients who had been discharged from designated drug treatment facilities. Before launching the Main Client Survey, NORC engaged in an extensive period of design and testing. The focus of these preliminary activities was to: (a) develop a coherent and analytically sound client questionnaire, which was done by means of cognitive testing, and (b) field-test protocols for locating, contacting, and interviewing the respondents, which was addressed by a Pilot Test with 90 clients sampled from six facilities.

Stage 2 operations are presented below in the following sections. The initial section gives a synopsis of NORC’s approach to cognitive testing. The Pilot Test is highlighted next. The final section focuses on the Main Client Survey.

Cognitive Testing of Client Questionnaire

The Client Questionnaire underwent cognitive testing with nine respondents discharged during the SROS index year from drug treatments unrelated to the facility sample. In administering the questionnaire, interviewers asked respondents to reflect on the questions and their answers. The results were studied to obtain insight into respondents’ understanding of terms, their strategies for recall, and their confidence in the accuracy of their answers. SROS staff tested respondents’ abilities to anchor memories around a past treatment episode — that is, their ability to recall accurately events before, during, and following a sample treatment episode. The outcome of cognitive testing indicated that collecting five-year retrospective data anchored by a treatment episode was feasible. Respondents could recall their drug treatment in detail and discuss sample episodes and other events confidently, in chronologically correct and consistent ways.

Pilot Test

The Pilot Test, which occurred during a six-week period in late spring 1994, was designed to answer the following questions:

·Can individuals be located for interview four to five years post-discharge from a sampled treatment episode by using the dated information abstracted from 1989–1990 facility records?

·How will clients react to their inclusion in the survey?

·Will those who respond to the Client Questionnaire be able to recall events related to the sampled treatment episode after five years?

·Will respondents be able to distinguish effectively between behavior and life events of 1990 and later, perhaps including additional treatment episodes?

·Will a sufficient percentage of respondents be willing to furnish a urine sample at the conclusion of the interview?

The client sample for the Pilot Test was selected from six DSRS facilities clustered for cost-effective field operations in Michigan/Indiana and Maryland/Delaware. The pilot facilities were representative of treatment modalities, with two inpatient, two outpatient methadone maintenance, one outpatient drug free, and one residential treatment center. Interviewers visited the pilot facilities and obtained the DSRS discharge listings for 1989–1990, on which a special pilot sample (separate from the DSRS client sample) had been earmarked during the SROS pretest. They then used the SROS locating instrument to abstract identifying data for field use. The 90 cases in the pilot sample were distributed so that the inpatient and methadone maintenance facilities contributed 30 cases each, and the drug-free and residential facilities, 15 cases each.

Pilot Test interviewers attended a three-day training in May 1994. The training included several practices with the Client Questionnaire and an associated calendar using mocked-up client situations, augmented by lectures on questionnaire structure and content. Other lecture and discussion sessions covered confidentiality and locating protocols for approaching clients, informed consent forms and procedures, management of locating problems and use of special client locating resources, management of the interview with impaired respondents, ensuring interviewer well-being and safety in approaching a drug-using population, and materials and procedures for obtaining urine specimens.

The SROS Pilot Test was highly successful, establishing the feasibility of locating the respondents and securing cooperation and valid data from them. Fifty interviews were completed, with an additional case completed the week following close of the six-week field period. The total completions exceeded the overall target of 25 to 45 interviews set for the field work, and the number met or exceeded targets set for clients discharged from the individual treatment modalities. Eighteen of the 30 inpatient clients and 18 of the 30 methadone maintenance clients in the pilot sample gave interviews, while 7 of the 15 clients discharged from outpatient drug-free treatment and 8 of the 15 discharged from residential treatment were interviewed.

Methodological concerns about respondents’ ability to recall significant treatment events and behavior over a five-year time span were allayed by the richness of the Pilot Test data, the infrequent missing items, and the coherence between data abstracted from facility records and the responses to the questionnaire. Thirty-six of the pilot respondents who completed interviews gave urine specimens, for a 71 percent success rate. Ten of the 15 who did not provide urine samples were in circumstances inappropriate for the request of a specimen (three were interviewed by telephone and seven were incarcerated). Only five of the pilot respondents refused the request for a urine sample.
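The pilot specimen figures fit together as follows; this is a simple restatement of the counts given in the text:

```python
completed_interviews = 50 + 1  # 50 in the field period plus one late completion
gave_specimen = 36
no_request_possible = 3 + 7    # telephone interviews plus incarcerated clients
refused_specimen = 5

# 36 of 51 completions yielded a specimen, about 71 percent
success_rate = round(100 * gave_specimen / completed_interviews)
```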

Based on the Pilot Test, the OMB approved clearance for the full client survey.

Main Client Survey

The Main Client Survey was scheduled for a nine-month period from June 1995 through March 1996. The interviewing staff for this effort included a number of field staff experienced from SROS Stage 1, augmented by interviewers with experience on other drug outcome studies.

Training Interviewers

The three-day training for the Main Client Survey was closely modeled on the earlier training for the Pilot Test. Several practice runs through the instrument using mocked-up client data and working with associated calendars were augmented with lecture and discussion sessions on confidentiality protocols and procedures, on resources and techniques for locating discharged clients in the community, on approaching and identifying the client sample and gaining trust and cooperation, on working effectively with a disadvantaged drug-using population, and on procedures for obtaining and documenting urine specimens and mailing them to the NIDA-certified laboratory subcontracting to SROS.

Structuring Data Collection--Field Management Staff

A staff of six field managers responsible for monitoring the data collection and reporting weekly production and cost to project staff was coordinated by a field project manager. Each field manager worked with 10 to 12 interviewers to locate and interview clients residing in a geographical region. In general, respondents were scattered in western and midwestern field assignments and more tightly clustered in the East and the South. The initial staffing assumption that most clients would be found in the same general area as the facility that discharged them was not borne out by data collection results; a great deal of locating work, including many transfers of cases between field regions, was required before field goals were met.

Locating Client Respondents

Concerned about the quality of address data abstracted from five-year-old client records, the project staff mounted a prefield location effort in May 1995 to verify and update the latest known addresses from facility records. Two credit bureau databases, rich in information about charge card users, were searched for address data on those for whom the SROS abstraction showed a Social Security Number or a complete name and an address. Locating clerks reviewed the information files returned from the credit bureaus and clarified addresses as necessary with calls to directory assistance. The names and Social Security Numbers of clients in Illinois, Michigan, and New York who were not located through the credit bureau search were submitted to the Departments of Motor Vehicles in those states to be checked against their databases. This prefield inquiry confirmed the original addresses of about 17 percent of clients and yielded new or updated address information for more than 60 percent. In addition, the inquiry found that 106 clients, or three percent of the sample, had died since their selected treatment episodes.

About 18 percent of the sample was classified as unlocatable at the conclusion of prefield client tracking. The project team then introduced the client sample to the study by a letter that described their inclusion in a "health study" sponsored by the U.S. Public Health Service, mailed in envelopes marked Address Correction Requested; Return to Sender. The project staff thus received very early word about clients unknown at their given addresses, along with any forwarding addresses on file at local post offices. Where the letters were delivered, clients had some preparation for personal contact by SROS interviewers.

Early in the field period, the assumption that better than 75 percent of assigned clients lived at known addresses proved too optimistic. Many addresses updated by the credit bureau search were discovered to be obsolete at interviewer contact.

Field protocols for SROS imposed confidentiality safeguards on even the most casual locating inquiries. The interviewers were trained to communicate the minimum in asking neighbors, relatives, professionals, or acquaintances the whereabouts of the respondents they sought. Field staff needed some explanation to attest to their good reasons for inquiring. SROS interviewers were permitted to use the explanation given in the introductory letter to clients — they wished to reach respondents in connection with a health study conducted for the U.S. Public Health Service.

Interviewers were encouraged to distribute a business card with the project name and a toll-free telephone number to locating informants and other persons contacted in the community, requesting that the respondent or anyone who had knowledge of an address contact the project. The project staff expected that as interviewing continued, some general understanding of the ongoing study and its legitimacy would circulate in the clients’ communities. The business card gave respondents easy telephone access to make appointments; it was expected that some of them would be responsive to the modest payment represented by the fee of $15 once they were assured of the legitimacy of the research.

The business card was directed especially at the part of the client sample that lacked ordinary ties to the community — the group that would not leave post office forwarding addresses or that would not use credit cards. The best locating field method for reaching this group, once the interviewer had some general clues about an area where people knew of them or their families, was "hanging out" — or blending into the street or the neighborhood, looking harmless, waiting for someone who might respond to a casual inquiry about the whereabouts of a respondent.

Managing Client Contact and Obtaining Respondent Cooperation

When SROS interviewers had reason to believe they had located a respondent, a protocol guided their explanations of the survey and their requests for cooperation. First, interviewers needed to verify identity. The project guidelines specified that interviewers could begin an explanation if a person answering to the first and last name of an SROS respondent, matching in age, sex, and ethnicity (the demographic data on the client Face Sheet), was located at an address or telephone number associated with that person in the locating abstraction record. The explanation followed a set introductory script to be read in English or Spanish, as appropriate, only when privacy was secured. In the protocol, the study’s sponsorship by the Substance Abuse and Mental Health Services Administration was specified for the first time, with reference to the U.S. Public Health Service guidelines that permitted gathering information about the client’s treatment for substance abuse in the index year. The script went on to describe the research purposes and the confidentiality safeguards for maintaining data and concluded by presenting the SROS General Consent Form. Respondents were asked to sign the form to indicate their informed consent, their understanding that the data would be used for statistical analysis only and that their confidentiality and privacy would be safeguarded, and that their participation was voluntary.

For persons located away from the addresses or telephone numbers in the abstracted database, or for persons who did not appear to match the demographic data in the Face Sheet, further verification was required before interviewers could read the Introductory Script. Requests for date of birth and for mother’s maiden name that could be matched to Face Sheet data were the first filter. If the match remained ambiguous, the interviewer was told to request a hard-copy ID to verify the person’s first and last name. As a last resort, the interviewer might request the person’s Social Security Number when the number was present in the Face Sheet data. If none of these steps produced satisfactory evidence that the person was the client named on the Face Sheet, the interviewer was instructed to break off the explanation without proceeding further. Great tact and judgment were required from interviewers in managing this complex introduction without alarming or antagonizing individuals, while maintaining their privacy against any possibility that confidential personal history would be communicated to a stranger.
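The successive identity filters can be sketched as a simple decision routine. This is an illustrative reconstruction only; the field names and data structure are hypothetical, and the real protocol relied on interviewer judgment rather than code:

```python
def may_read_intro_script(person, face_sheet, at_known_address):
    """Hypothetical sketch of the SROS identity-verification filters.
    Returns True only when the person is sufficiently matched to the
    Face Sheet record to hear the introductory script."""
    def match(*keys):
        # A key matches only when the Face Sheet actually holds a value.
        return all(face_sheet.get(k) is not None and
                   person.get(k) == face_sheet.get(k) for k in keys)

    # At a known address/telephone, name plus demographics is sufficient.
    if at_known_address and match("first", "last", "age", "sex", "ethnicity"):
        return True
    # First filter: date of birth and mother's maiden name.
    if match("dob", "mothers_maiden"):
        return True
    # Second filter: hard-copy ID verifying first and last name.
    if person.get("photo_id_ok") and match("first", "last"):
        return True
    # Last resort: Social Security Number, when present in the Face Sheet.
    if match("ssn"):
        return True
    return False  # break off the contact without proceeding further

face = {"first": "Ann", "last": "Lee", "age": 34, "sex": "F",
        "ethnicity": "White", "dob": "1960-01-02",
        "mothers_maiden": "Ray", "ssn": "000-00-0000"}
ok = may_read_intro_script(
    {"first": "Ann", "last": "Lee", "age": 34, "sex": "F",
     "ethnicity": "White"}, face, at_known_address=True)
no = may_read_intro_script({"first": "Bob"}, face, at_known_address=True)
```

Each later filter is reached only when the earlier ones fail to establish a match, mirroring the escalation from demographic match to hard-copy ID to Social Security Number.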

The great majority of those who heard the explanation of SROS sponsorship and purposes agreed to give an interview. Not quite nine percent of sampled clients were final refusals. Interviewers were instructed to make additional reassurances about confidentiality procedures and to stress the need for nationally representative drug outcome data when they encountered resistance. Very strong refusal conversion efforts were not appropriate for this population given their privacy rights.

Conducting the Interview and Collecting Urine Specimens

Having obtained the respondents’ informed consent and given them a copy of the signed Consent Form, interviewers proceeded to administer the SROS questionnaire. A project calendar, often partly set up in advance of the interview, showed distinctively highlighted time frames for the sample treatment episode, and for the five years preceding and the five years following the index episode. The 12-month period immediately preceding the interview date was likewise distinctively highlighted on the calendar. The first set of questions checked demographic data and gathered fresh data about gang membership and current educational status. The Treatment History section of the questionnaire initiated use of the calendar to confirm or correct data about the sample treatment episode. Respondents were referred to the calendar as interviewers perfected and explained the color coding that identified Time Before, Time During, and Time After the sample episode, with additional color coding for Last Year. Respondents were given the calendar to hold during the course of the interview and were referred to the specific color-coded reference periods at appropriate points in the questionnaire. Once respondents were oriented, the interview protocol proceeded systematically to gather data for the following:

·Treatment experience in the sample episode;

·Treatment experiences, if any, prior to the sample episode, with emphasis on the prior five-year period;

·Treatment experience, if any, after the sample episode;

·Use of main drug or drugs (including alcohol) specified for treatment episodes in the five years before and the five years after the sample episode, with special questions about use in the last year and the last 30 days;

·Use of other drugs (including alcohol) not specified as main drugs in the same time frames;

·Use of needle injection for drugs in the same time frames and during the sample episode;

·Legal history, including arrests and incarcerations and specific illegal activities — ever, and in the same time frames and during the sample episode;

·Marital status and living arrangements — ever, and in the same time frames;

·Health history, including mental health history, physical illness, use of health care services, sexual behavior, and victimization by physical attack or attack with a weapon — ever, and in the same time frames;

·Employment history — ever, and in the same time frames;

·Income sources in the last year; and

·Locating information for followup.

Respondents agreeing to furnish urine samples signed another consent form indicating that their cooperation was voluntary and that they understood the test results would be held confidential and used for research purposes only. Interviewers then provided respondents with a specimen kit, which the respondent used in privacy before returning the closed specimen bottle to the interviewer. After a visual check of the sample, interviewers inserted a documentation slip with client ID and interview date and sealed the kit for immediate mailing to the NIDA-certified laboratory that subcontracted to SROS. More than three-quarters of respondents completing the interview cooperated with the request to furnish a urine specimen.

Identifying Deceased Clients and Verifying Death

A total of 277 clients in the SROS sample died between discharge from the sample treatment episode and the conclusion of client interviews. All reported deaths were eventually validated against the National Death Index (NDI). Abstraction from facility records had turned up evidence of the post-discharge deaths of 29 clients, and credit bureau address checks before Stage 2 fielding brought news of 106 additional deaths. The balance of the deceased were identified or confirmed in the course of contacting and locating sampled clients for the Main Interview, or through NDI computer matches on individuals whom interviewers could not locate.

Field staff documented reported client deaths with a form specifying clearly the source of the information: (1) a relative, with name, address, telephone number, and precise relationship specified; (2) a death certificate obtained from the Vital Statistics Bureau; (3) an obituary published in the local press, copy attached; (4) credit bureau information obtained in the course of a request for address update; or (5) SROS facility records.
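
The verification step described above, matching reported deaths against an index of death records, can be sketched as follows. The record layout and the matching key (name plus date of birth) are illustrative assumptions for the sketch, not the actual NDI format or matching algorithm.

```python
# Hypothetical sketch: confirm reported client deaths against an index of
# death records by matching on name and date of birth. All IDs, names, and
# field layouts are made up for illustration.

def validate_deaths(reported, death_index):
    """Return client IDs of reported deaths confirmed by the index."""
    # Build a lookup keyed on (last name, first name, date of birth).
    index = {(r["last"], r["first"], r["dob"]) for r in death_index}
    confirmed = []
    for case in reported:
        key = (case["last"], case["first"], case["dob"])
        if key in index:
            confirmed.append(case["client_id"])
    return confirmed

reported = [
    {"client_id": "C001", "last": "Doe", "first": "John", "dob": "1950-03-02"},
    {"client_id": "C002", "last": "Roe", "first": "Jane", "dob": "1948-11-17"},
]
death_index = [
    {"last": "Doe", "first": "John", "dob": "1950-03-02"},
]
print(validate_deaths(reported, death_index))  # ['C001']
```

A real match against the NDI uses more identifiers and probabilistic scoring; this sketch only conveys the confirm-against-index idea.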

Results of Data Collection

At the conclusion of the nine-month field period, 68 percent of client cases were completed. This figure consisted of the approximately 59 percent of eligible clients who had given interviews and 9 percent who had been documented as deceased. Most noninterview cases were classified as final unlocatables: approximately 18 percent of the client sample had that final status, while almost 9 percent were final refusals, with very modest numbers classified as final unavailables (just over 1 percent), final refusals by prison (1 percent), and other final nonresponse (less than 1 percent). The course of the fieldwork followed a pattern typical of samples where the major burden of the work is slow, painstaking location inquiry. It took one month to complete 15 percent of assigned cases and six more weeks to double that to 30 percent; thereafter, each 15-percent increment took about two months. Production patterns of this sort reflect the field time that must be invested to nurse each case to the point of interview, whether spent tracking lost respondents or patiently developing the trust of hesitant ones.

Special Issues from the Client Survey

Concern for respondent confidentiality was a prominent part of the Client Survey field protocols, as the description of procedures for client identification makes clear. Project staff felt special concern about confidential management of the Locating Abstraction Record. Because of its field function, it necessarily carried much data identifying the client, and it clearly tied individuals to named facilities where they received drug and/or alcohol treatment. This document was the major resource for interviewers in the course of field-tracking, but taking it into the field where it might be lost or accidentally exposed to view — even the view of respondents — was totally unacceptable. Field protocol accordingly asked that interviewers keep the document securely filed away with their project supplies at home. Interviewers transferred the details they needed to shape inquiry in the community, in cryptic notes if necessary, to a Record of Calls that identified the client by ID number only. The relatively modest client refusal rate suggests that the very cumbersomeness of SROS confidentiality procedures, with prescribed readings of scripts and multiple signed consent forms, reassured respondents rather than discouraging them.

The completed work of Stage 2 interviewers was validated using NORC’s standard procedures. Ten percent of each interviewer’s completed caseload was randomly designated before data entry for a validation telephone call to the respondent from project staff. The validators explained that they were calling in relation to a recently completed health study and confirmed that the SROS interview had been conducted with the sample respondent when and where represented. The validation script had other questions about the elapsed time of the interview and a few data items, like highest educational level, asked to confirm questionnaire content. The calls concluded by asking respondents for any comments they had about the interviewer. If a respondent were to deny cooperating with the interview, or if there were other serious discrepancies between validation responses and interviewer report, the entire completed caseload of the interviewer would be subjected to validation callback. Any cases determined invalid would be set aside from data entry and refielded for completion by other interviewers. However, the SROS validation procedures were completed without uncovering any evidence of misrepresented data.
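
The random designation of validation cases can be sketched as below. The interviewer IDs, case IDs, and the minimum-of-one rule are illustrative assumptions, not NORC's actual procedure.

```python
import random

# Illustrative sketch: designate roughly 10 percent of each interviewer's
# completed caseload for a validation callback. All IDs are hypothetical.

def designate_validation_cases(caseloads, rate=0.10, seed=None):
    """For each interviewer, randomly pick ~rate of completed cases (at least one)."""
    rng = random.Random(seed)
    designated = {}
    for interviewer, cases in caseloads.items():
        k = max(1, round(len(cases) * rate))
        designated[interviewer] = sorted(rng.sample(cases, k))
    return designated

caseloads = {
    "INT-07": [f"C{n:03d}" for n in range(1, 41)],   # 40 completed cases
    "INT-12": [f"C{n:03d}" for n in range(41, 56)],  # 15 completed cases
}
picks = designate_validation_cases(caseloads, seed=1)
print({k: len(v) for k, v in picks.items()})  # {'INT-07': 4, 'INT-12': 2}
```

Designating cases before data entry, as the text describes, ensures the validation sample cannot be influenced by what the questionnaires contain.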

Systems, Data Processing, File Preparation

For both Stage 1 and Stage 2 data collections, interviewers returned their completed documents and forms, including transmittals and edit checklists, to NORC’s receipt control center in Chicago. Clerks there reviewed each abstraction instrument, questionnaire, and associated document for completeness and registered receipt of the case ID using NORC’s Survey Management System (SMS).

Software Systems Overview

The SMS is an integrated software system that tracks all events related to a particular case. SROS required an SMS link between identification numbers at the facility level and client-level IDs. Receipt of every document related to a case was registered by the SMS, which also tracked, by case ID, post-field operations like validation, urine test results, and data entry.
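
A minimal sketch of case tracking in the spirit of the SMS description above: each client case is linked to its facility ID, and every receipt or post-field event is registered against the case ID. The class, IDs, and event names are hypothetical, not NORC's actual system.

```python
# Toy case tracker: links client IDs to facility IDs and logs events by
# case ID, illustrating the facility-to-client linkage described above.

class CaseTracker:
    def __init__(self):
        self.facility_of = {}   # client ID -> facility ID
        self.events = {}        # client ID -> list of (event, detail)

    def register_case(self, client_id, facility_id):
        self.facility_of[client_id] = facility_id
        self.events[client_id] = []

    def log_event(self, client_id, event, detail=""):
        self.events[client_id].append((event, detail))

tracker = CaseTracker()
tracker.register_case("C101", "F018")
tracker.log_event("C101", "questionnaire_received")
tracker.log_event("C101", "urine_test_result", "negative")
tracker.log_event("C101", "data_entry_complete")
print(tracker.facility_of["C101"])   # F018
print(len(tracker.events["C101"]))   # 3
```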

A computer-assisted data entry (CADE) program was used to capture and check the data in completed abstraction instruments, Program Director Interviews, and main Client Survey questionnaires. The CADE system was programmed to check for acceptable values, inter-item consistency, critical item entry, and accurate numerical calculations. CADE entries were verified by randomly rekeying 10 percent of each data entry operator’s cases and automatically comparing for consistency. Data entry supervisors rekeyed the initial set of cases looking for problem items and common errors and passed the information back to the operators. The CADE system generated both case-specific and aggregate error reports. After data entry of all SROS material was complete, the entire database was subjected to a post-capture editing program that ensured that questionnaire skips were followed correctly, ranges were observed, and all sample cases were represented in the data set. Frequencies and other descriptive statistics were used to review the data and ensure the quality of the final files prepared for analysis.
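
The kinds of edit checks attributed to the CADE program (acceptable values, critical item entry, inter-item consistency) can be illustrated as follows. The variable names, ranges, and rules are assumptions for the sketch, not the SROS instrument.

```python
# Hedged sketch of CADE-style edit checks on one keyed record:
# range checks, critical-item presence, and one inter-item consistency
# rule. All item names and limits are illustrative.

RANGES = {"age": (18, 99), "years_of_schooling": (0, 20)}
CRITICAL = ("client_id", "interview_date")

def edit_check(record):
    """Return a list of error messages for one keyed record."""
    errors = []
    for item in CRITICAL:
        if not record.get(item):
            errors.append(f"missing critical item: {item}")
    for item, (lo, hi) in RANGES.items():
        value = record.get(item)
        if value is not None and not lo <= value <= hi:
            errors.append(f"{item} out of range: {value}")
    # Example inter-item consistency rule.
    if record.get("age") is not None and record.get("years_of_schooling") is not None:
        if record["years_of_schooling"] > record["age"]:
            errors.append("schooling exceeds age")
    return errors

rec = {"client_id": "C101", "interview_date": "1995-06-01",
       "age": 17, "years_of_schooling": 12}
print(edit_check(rec))  # ['age out of range: 17']
```

In the production system described above, such checks fired at keying time so operators could correct entries immediately; the post-capture editing pass then re-verified skips and ranges across the whole database.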

NORC Central Office Security and Confidentiality Procedures

NORC maintains a secure facility for data preparation and hard copy instrument storage. The entrance is monitored by a secretary who buzzes in and registers visitors; only employees of the data preparation center can enter at will. Locked filing cabinets are provided for storage of all hard copy forms and instruments. Data preparation personnel are subject to the same confidentiality protocols and give the same confidentiality pledges as other project staff. While keying abstraction instruments and questionnaires, operators kept at their computer terminals only the case materials on which they were immediately working. A login code and a password were required to access the SROS CADE program on the computers. Documents of cases awaiting processing and cases that had completed processing were locked away in the data preparation facility’s library.

File Preparation

Files were prepared for delivery to NORC’s analysis team, for delivery to SAMHSA, and for public use. Analytic files included a client data file with client questionnaire information for all completed cases, with facility names removed for separate delivery. Abstraction data from the SROS abstraction file were appended for all completed and deceased cases, together with weights, provider IDs, and modality stratum, plus abstraction data for the remaining cases.
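
One file-preparation step noted above, removing facility names from the analytic client file while retaining weights and stratum codes, might look like this in outline. All field names here are assumptions, not the actual SROS file layout.

```python
# Illustrative sketch: strip identifying fields from an analytic record
# while keeping analytic variables such as weights and stratum codes.

def prepare_analytic_record(record, drop_fields=("facility_name",)):
    """Return a copy of the record with identifying fields removed."""
    return {k: v for k, v in record.items() if k not in drop_fields}

raw = {"client_id": "C101", "facility_name": "Example Clinic",
       "provider_id": "F018", "modality_stratum": 2, "weight": 137.5}
print(prepare_analytic_record(raw))
# {'client_id': 'C101', 'provider_id': 'F018', 'modality_stratum': 2, 'weight': 137.5}
```

Delivering the removed identifiers separately, as the text describes for facility names, keeps the analytic file usable while limiting disclosure risk.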


This page was last updated on August 15, 2003.