Fast Response Survey System (FRSS): Teachers' Use of Educational Technology in U.S. Public Schools, 2009 (ICPSR 35531)
The Fast Response Survey System (FRSS) was established in 1975 by the National Center for Education Statistics (NCES), United States Department of Education. FRSS is designed to collect issue-oriented data within a relatively short time frame. FRSS collects data from state education agencies, local education agencies, public and private elementary and secondary schools, public school teachers, and public libraries. To ensure minimal burden on respondents, the surveys are generally limited to three pages of questions, with a response burden of about 30 minutes per respondent. Sample sizes are relatively small (usually about 1,000 to 1,500 respondents per survey) so that data collection can be completed quickly. Reported data are weighted to produce national estimates of the sampled education sector. The sample size permits limited breakouts by classification variables. However, as the number of categories within the classification variables increases, the sample size within categories decreases, which results in larger sampling errors for the breakouts by classification variables.
The Teachers' Use of Educational Technology in U.S. Public Schools, 2009 survey provides national estimates on the availability and use of educational technology among teachers in public elementary and secondary schools during 2009. This is one of a set of three surveys (at the district, school, and teacher levels) that collected data on a range of educational technology resources. A stratified multistage sample design was used to select teachers for this study. Data collection was conducted September 2008 through July 2009, and 3,159 eligible teachers completed the survey by web, mail, fax, or telephone.
The survey asked respondents to report information on the use of computers and Internet access in the classroom; availability and use of computing devices, software, and school or district networks (including remote access) by teachers; students' use of educational technology; teachers' preparation to use educational technology for instruction; and technology-related professional development activities. Respondents reported quantities for the following: computers located in the classroom every day, computers that can be brought into the classroom, and computers with Internet access. Data on the availability and frequency of using computers and other technology devices during instructional time were also collected. Respondents reported on students' use of educational technology resources during classes and teachers' use of modes of technology to communicate with parents and students. Additional survey topics included teacher training and preparation to effectively use educational technology for instruction, and teachers' opinions related to statements about their participation in professional development for educational technology. Respondents were also asked for administrative information such as school instructional level, school enrollment size, main teaching assignment, and years of experience.
Series: Fast Response Survey System (FRSS) Series
The public-use data files in this collection are available for access by the general public. Access does not require affiliation with an ICPSR member institution.
These data have been updated since their release as https://doi.org/10.3886/ICPSR35531.v2. If you need access to a previous version of the data (e.g., to replicate results in an article or report), contact ICPSR User Support to find out whether a prior version is available.
United States Department of Education. Institute of Education Sciences. National Center for Education Statistics. Fast Response Survey System (FRSS): Teachers' Use of Educational Technology in U.S. Public Schools, 2009. ICPSR35531-v3. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2016-05-02. https://doi.org/10.3886/ICPSR35531.v3
Persistent URL: https://doi.org/10.3886/ICPSR35531.v3
Scope of Study
Geographic Coverage: United States
Time Period: 2008--2009 (Academic Year)
NCES takes every precaution to ensure that the identity of data subjects cannot be disclosed. All direct identifiers, as well as any characteristics that might lead to identification, are omitted or modified in the dataset to protect the true characteristics of individual cases. Any intentional identification or disclosure of a person or institution violates the assurances of confidentiality given to the providers of the information.
Before using the data, users must read the Data Disclosure Warning section of the User Guide.
Before using the data, users are encouraged to review the Technical Notes presented in the User Guide on Sample and Response Rates; Weighting Procedures and Sampling Errors; Nonsampling Errors, Coding, and Editing; Definitions of Selected Analysis Variables; and Definitions of Terms.
NCES statistical standards and guidelines require a nonresponse bias analysis if the unit response rate at any stage of data collection is less than 85 percent. Therefore, a nonresponse bias analysis was conducted for the survey to inform the nonresponse weight adjustments. The User Guide contains the Nonresponse Bias Analysis Report as well as a summary of the results in the Nonsampling Errors, Coding, and Editing section.
Although item nonresponse for key items was very low, missing data were imputed for the items with a response rate of less than 100 percent. The missing items included both numerical data, such as the number of computers in the classroom every day, and categorical data, such as whether LCD projectors are available for teachers to use in the classroom every day.

The missing data were imputed using a "hot-deck" approach, in which a "donor" teacher supplies the values from which imputations are derived. Under this approach, a donor teacher was identified that matched selected characteristics of the teacher with missing data (the recipient). The matching characteristics included characteristics of the school and district in which the teacher worked: categories of district enrollment size, instructional level of the school, categories of school enrollment size, locale, categories for percent of students in the school eligible for free or reduced-price lunch, the average number of computers per classroom in the school, and whether there were full-time technology staff in the school. In addition, relevant teacher questionnaire items were used to form appropriate imputation groupings.

Once a donor was found, it was used to obtain the imputed values for the teacher with missing data. For categorical items, the imputed value was simply the corresponding value from the donor teacher. For numerical items, an appropriate ratio (e.g., proportion of computers in the classroom every day that have Internet access) was calculated for the donor teacher, and this ratio was applied to available data (e.g., reported number of computers in the classroom every day) for the recipient teacher to obtain the corresponding imputed value. Imputation flags are included in the data.
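The hot-deck logic described above can be sketched as follows. This is a minimal illustration, not the actual FRSS imputation program; the field names (`lcd_projector_available`, `computers_in_classroom`, `computers_with_internet`) and the matching keys are assumptions, not real FRSS variable names.

```python
def hot_deck_impute(recipient, donors, match_keys):
    """Fill a recipient's missing items from the first donor that matches
    the recipient on the selected grouping characteristics (hot deck)."""
    donor = next(
        d for d in donors
        if all(d[k] == recipient[k] for k in match_keys)
    )
    imputed = dict(recipient)

    # Categorical item: the imputed value is simply the donor's value.
    if imputed["lcd_projector_available"] is None:
        imputed["lcd_projector_available"] = donor["lcd_projector_available"]

    # Numerical item: compute a ratio for the donor (e.g., the share of the
    # donor's classroom computers that have Internet access) and apply it
    # to the recipient's reported base quantity.
    if imputed["computers_with_internet"] is None:
        ratio = donor["computers_with_internet"] / donor["computers_in_classroom"]
        imputed["computers_with_internet"] = round(
            ratio * recipient["computers_in_classroom"]
        )
    return imputed
```

In the actual survey, the matching keys included the district and school characteristics listed above (enrollment size categories, instructional level, locale, and so on), together with relevant teacher questionnaire items.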
For confidentiality reasons, NCES did not include question 13 of the survey (grades the teacher currently teaches at the school) in the public-use file.
Data collection for the study was conducted in two stages. The first stage was the collection of teacher sampling lists, which coincided with data collection for the school survey. Materials for the study were mailed to the principal of each sampled school in September 2008. The materials introduced the study and requested that a list of eligible teachers be provided by mail or fax. The package included instructions for preparing the list and a form to be returned with the list of teachers. For confidentiality reasons, this form did not include the name of the survey or the name of the school. It contained a random ID number that allowed authorized staff to identify the school. Telephone follow-up for nonresponse and clarification of information on the lists was initiated in early October 2008 and completed in April 2009.
For the second stage of collection, questionnaires and cover letters for the teacher survey were mailed to sampled teachers at their school addresses. Sampling and mailing were conducted in batches, as teacher lists were collected and processed, beginning in January 2009 and ending in April 2009. Respondents were offered the option of completing the survey by web or mail. Telephone follow-up for survey nonresponse and data clarification was initiated in early February 2009 and completed in July 2009.
A total of 3,159 teachers completed the Teachers' Use of Educational Technology in U.S. Public Schools survey. Of the teachers who completed the survey, 63 percent completed it by web, 33 percent completed it by mail, 4 percent completed it by fax, and 1 percent completed it by telephone.
The sample for the FRSS 2009 teacher survey on educational technology consisted of 4,133 teachers from public schools in the 50 states and the District of Columbia. This survey was one of three related FRSS surveys conducted under a nested design involving a sample of schools, districts that administer the sampled schools, and teachers within the sampled schools. The selection of teachers included two stages.
For the first stage, a nationally representative sample of 2,005 regular public schools in the United States was selected from the 2005-06 NCES Common Core of Data (CCD) Public School Universe file, which was the most current file available at the time of selection. The sampling frame included 85,719 regular schools. Excluded from the sampling frame were schools whose highest grade was prekindergarten or kindergarten; ungraded schools; special education, vocational, and alternative/other schools; schools outside the 50 states and the District of Columbia; and schools with zero or missing enrollment. To select the sample, the public school sampling frame was stratified by level (elementary or secondary/combined), categories of enrollment size, and categories for percent of students eligible for free or reduced-price lunch. To improve the representativeness of the sample, an implicit stratification was induced by sorting the schools within each stratum by type of locale and region prior to sampling. Within each stratum, schools were sampled systematically and with equal probabilities at predetermined rates that varied from stratum to stratum.
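The within-stratum selection described above can be sketched as equal-probability systematic sampling over a sorted frame. This is an illustrative sketch, not the FRSS sampling program: the toy frame, the fixed starting point (in practice the start is drawn at random), and the 2-percent rate are all assumptions.

```python
def systematic_sample(frame, rate, start=None):
    """Equal-probability systematic sampling: step through the frame in
    intervals of k = 1/rate, taking one unit per interval."""
    k = 1 / rate
    if start is None:
        start = k / 2  # fixed here for reproducibility; normally uniform on [0, k)
    picks, pos = [], start
    while pos < len(frame):
        picks.append(frame[int(pos)])
        pos += k
    return picks

# Toy stratum of 200 schools with locale/region attributes (illustrative only).
schools = [{"id": i, "locale": i % 4, "region": i % 3} for i in range(200)]

# Implicit stratification: sorting the stratum by locale and region before
# sampling spreads the systematic picks across those categories.
stratum = sorted(schools, key=lambda s: (s["locale"], s["region"]))
sample = systematic_sample(stratum, rate=0.02)
```

Because the systematic pass walks the sorted list at a fixed interval, consecutive picks fall in different locale/region groups roughly in proportion to each group's size, which is what the "implicit stratification" achieves.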
Of the 2,005 schools in the sample, 56 were found to be ineligible for the survey because they were closed, merged, or did not meet the eligibility requirements for inclusion (e.g., they were special education, vocational, or alternative schools). Of the 1,949 eligible schools in the sample, 1,563 schools provided a teacher sampling list.
For the second stage, a nationally representative sample of teachers was selected from lists provided by participating schools. The sampling frame included full-time teachers teaching at least one regularly scheduled class (other than physical education) in grades K through 12. Excluded from the sampling frame were administrators, counselors, advisors, and social workers (even if they also taught); teachers who taught only physical education; substitute, itinerant, part-time, and preschool teachers; teacher's aides; and unpaid volunteers. An average of two to three teachers was randomly selected from each participating school at rates that varied by instructional level of the school.
Of the 4,133 teachers in the sample, 150 were found to be ineligible for the survey because they did not meet the eligibility requirements for inclusion (e.g., they were physical education, substitute, itinerant, part-time, or preschool teachers). Of the 3,983 eligible teachers in the sample, 3,159 teachers completed the survey.
This data collection contains the following weight variables: TFWT (Full Sample Weight) and TFWT1-TFWT50 (Replicate Weights).
For further details regarding the base weight and replicate weights in this data collection, please refer to the Weighting Procedures and Sampling Errors section of the User Guide.
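As an illustration of how replicate weights are typically used, the sketch below re-estimates a weighted mean under each replicate weight set (TFWT1-TFWT50) and sums the squared deviations from the full-sample estimate (TFWT). The multiplier of 1.0 is an assumption for this sketch; the exact variance formula and factor that apply to these weights are given in the Weighting Procedures and Sampling Errors section of the User Guide.

```python
import math

def weighted_mean(values, weights):
    """Weighted estimate of a mean using a set of survey weights."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def replicate_se(values, full_weights, replicate_weight_sets, multiplier=1.0):
    """Replicate-weight standard error: sum of squared deviations of the
    replicate estimates from the full-sample estimate, scaled by a
    method-specific multiplier, then square-rooted."""
    theta = weighted_mean(values, full_weights)
    squared_devs = (
        (weighted_mean(values, rw) - theta) ** 2
        for rw in replicate_weight_sets
    )
    return math.sqrt(multiplier * sum(squared_devs))
```

In practice `values` would be a survey item, `full_weights` the TFWT column, and `replicate_weight_sets` the 50 TFWT1-TFWT50 columns.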
Description of Variables: The variables cover the survey content summarized above; definitions of selected analysis variables are provided in the User Guide.
For the eligible schools, the response rate for the first stage of the data collection procedure was 80 percent (1,563 schools that provided a teacher sampling list divided by the 1,949 eligible schools in the sample). The weighted list collection response rate was 81 percent.
For the eligible teachers, the response rate for the second stage of the data collection was 79 percent (3,159 responding teachers divided by the 3,983 eligible teachers in the sample). The weighted teacher response rate was 79 percent.
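The unweighted stage rates follow directly from the reported counts, and for a nested design the overall unweighted response rate is the product of the two stage rates. The overall figure below is that standard computation, not a number reported in the source.

```python
list_rate = 1563 / 1949      # stage 1: schools providing a teacher sampling list
teacher_rate = 3159 / 3983   # stage 2: eligible teachers completing the survey
overall_rate = list_rate * teacher_rate  # overall unweighted response rate

print(f"{list_rate:.0%} {teacher_rate:.0%} {overall_rate:.0%}")
```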
Extent of Processing: ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:
- Standardized missing values.
- Created online analysis version with question text.
- Checked for undocumented or out-of-range codes.
Original ICPSR Release: 2015-03-19
- 2016-05-02 The Dataset Lead-In document was updated.
- 2016-05-02 A Dataset Lead-In document was added to the data collection.