School Survey on Crime and Safety (SSOCS), 2006 (ICPSR 25421)

Version Date: Mar 4, 2010

Principal Investigator(s):
United States Department of Education. Institute of Education Sciences. National Center for Education Statistics

https://doi.org/10.3886/ICPSR25421.v1

Version V1


The School Survey on Crime and Safety (SSOCS) is managed by the National Center for Education Statistics (NCES) on behalf of the United States Department of Education (ED). SSOCS collects extensive crime and safety data from principals and school administrators of United States public schools. Data from this collection can be used to examine the relationship between school characteristics and violent and serious violent crimes in primary schools, middle schools, high schools, and combined schools. In addition, data from SSOCS can be used to assess what crime prevention programs, practices, and policies are used by schools. SSOCS has been conducted in school years 1999-2000, 2003-2004, and 2005-2006. A fourth collection is planned for school year 2007-2008. SSOCS:2006 was conducted by the United States Census Bureau. Data collection began on March 17, 2006, when questionnaire packets were mailed to schools, and continued through May 31, 2006. A total of 2,724 public schools submitted usable questionnaires: 715 primary schools, 948 middle schools, 924 high schools, and 137 combined schools.

United States Department of Education. Institute of Education Sciences. National Center for Education Statistics. School Survey on Crime and Safety (SSOCS), 2006. Inter-university Consortium for Political and Social Research [distributor], 2010-03-04. https://doi.org/10.3886/ICPSR25421.v1

Funding Source(s): United States Department of Education. Office of Safe and Drug-Free Schools; United States Department of Justice. Office of Justice Programs. Bureau of Justice Statistics

Geographic Unit: school district

Distributor: Inter-university Consortium for Political and Social Research

Time Period: 2006
Date of Collection: 2006-03-17 -- 2006-05-31
  1. Users interested in obtaining a restricted-use data file containing more detailed information can apply for the data through the Institute of Education Sciences/National Center for Education Statistics (IES/NCES) restricted-use data licensing program.

The SSOCS is the primary source of school-level data on crime and safety for the United States Department of Education. It provides nationwide estimates of crime, discipline, disorder, programs, and policies in public schools. Data on crime, violence, and disorder in the nation's schools are collected to provide policymakers, parents, and educators with the information necessary to identify emerging problems and to gauge the safety of American schools.

SSOCS:2006 was conducted as a mail survey with telephone follow-up. Four months before the onset of data collection, NCES began working with the school districts of sample schools that required prior approval to participate in the survey. On March 10, 2006, advance letters were sent to the administrators of sample schools, giving the date of the first questionnaire mailing and a toll-free number to call with any questions. On March 17, 2006, questionnaires were sent via FedEx directly to the principals of the sample schools, along with a cover letter describing the importance of the survey and a promotional SSOCS pen. Once the questionnaires had been distributed, letters were mailed to chief state school officers and superintendents to inform them that schools within their states and districts, respectively, had been selected for SSOCS:2006. These letters included information about the survey and were accompanied by a promotional SSOCS pen, an informational copy of the questionnaire, and the SSOCS brochure. They were not designed to request permission for schools to participate, but rather served as a vehicle to enhance participation.

During the two weeks following the first questionnaire mailing, a screener telephone operation was conducted to verify that sample schools had received the questionnaire and were, in fact, eligible to participate. One week after the screener ended, a reminder telephone operation began, conducted in two 1-week phases. The primary objective of the reminder operation was to follow up with the principal or school contact to determine the status of the questionnaire; during the second week, however, the interviewer could complete the SSOCS interview over the phone at the respondent's request. Data collection ended on May 31, 2006.

Returned questionnaires were examined for quality and completeness using both manual and computerized edits. If a questionnaire did not meet predetermined levels of completeness, the respondent was contacted again to resolve issues related to the missing data, irrespective of whether the items with missing data were considered "critical." The criteria used to determine completeness are detailed in section 3.1 of the School Survey on Crime and Safety: 2005-2006 Data File User's Manual (Bauer et al. 2007). If a satisfactory resolution could not be reached, imputation was used to resolve data quality issues for questionnaires in which at least 60 percent of all items, 80 percent of critical items, 60 percent of item 16, and 60 percent of item 22 had been completed. Questionnaires that did not meet the imputation criteria were considered incomplete and were excluded from the data file.
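To make the completeness thresholds above concrete, here is a minimal sketch, in Python, of the imputation-eligibility rule as stated (at least 60 percent of all items, 80 percent of critical items, 60 percent of item 16, and 60 percent of item 22). The data structure and field names are illustrative assumptions, not the actual NCES processing code.

```python
# Hypothetical sketch of the SSOCS:2006 imputation-eligibility rule described
# above. The questionnaire representation is an illustrative assumption.

def completion_rate(answered, total):
    """Fraction of items answered, guarding against empty item sets."""
    return answered / total if total else 1.0

def eligible_for_imputation(q):
    """Apply the four completeness thresholds named in the text:
    >= 60% of all items, >= 80% of critical items,
    >= 60% of item 16 sub-items, and >= 60% of item 22 sub-items."""
    return (
        completion_rate(q["items_answered"], q["items_total"]) >= 0.60
        and completion_rate(q["critical_answered"], q["critical_total"]) >= 0.80
        and completion_rate(q["item16_answered"], q["item16_total"]) >= 0.60
        and completion_rate(q["item22_answered"], q["item22_total"]) >= 0.60
    )

# Example: a questionnaire that clears every threshold (counts are invented).
q = {"items_answered": 150, "items_total": 200,      # 75% of all items
     "critical_answered": 18, "critical_total": 20,  # 90% of critical items
     "item16_answered": 14, "item16_total": 20,      # 70% of item 16
     "item22_answered": 13, "item22_total": 20}      # 65% of item 22
print(eligible_for_imputation(q))  # True
```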

The sampling frame for SSOCS:2006 was constructed from the 2003-2004 NCES Common Core of Data (CCD) Public Elementary/Secondary School Universe data file. The CCD is an annual survey system covering all public K-12 schools and school districts. Certain types of schools were excluded from the SSOCS:2006 sampling frame: special education schools, vocational schools, alternative schools (e.g., adult continuing education schools and remedial schools), newly closed schools, home schools, ungraded schools, schools whose highest grade is kindergarten or lower, overseas Department of Defense schools, schools sponsored by the Bureau of Indian Affairs, schools in Puerto Rico, and schools in the United States outlying areas of American Samoa, Guam, the Northern Mariana Islands, and the United States Virgin Islands. Public charter schools were not excluded. The use of the CCD as a sampling frame in SSOCS:2006 deviates from SSOCS:2000 and SSOCS:2004, which both utilized a modified version of the Schools and Staffing Survey (SASS) sampling frame; this deviation was necessary because SSOCS:2006 occurred between SASS collections.

The objectives of the SSOCS sampling design were twofold: to obtain overall cross-sectional and subgroup estimates of important indicators of school crime and safety, and to yield precise estimates of change in these indicators between 1999-2000, 2003-2004, and 2005-2006. To attain these objectives, a stratified sample of 3,565 regular public schools was drawn for SSOCS:2006 using the same general sampling design as in the previous administrations with respect to stratification variables, number of strata, method of sample allocation, and sorting variables used before selection. Adopting the same basic design for all administrations increases the precision of the estimates of change. As in SSOCS:2004, there was no attempt to minimize overlap between the SSOCS:2006 sample and samples for other NCES surveys.

The initial goal of SSOCS:2006 was to collect data from at least 2,550 schools, taking nonresponse into account. One possible method of allocating schools to the different sampling strata would have been to allocate them proportionally to the United States public school population. However, while the majority of United States public schools are primary schools, the majority of school violence is reported in middle and high schools. Proportional allocation would therefore have yielded an inefficient sample design, because the sample would have included more primary schools (in which crime is an infrequent event) than middle or high schools (in which crime is a relatively more frequent event). As a result, a larger proportion of the target sample of 2,550 schools was allocated to middle and high schools. The target sample was allocated to the four instructional levels as follows: 640 primary schools, 895 middle schools, 915 high schools, and 100 combined schools. Schools in SSOCS:2000 and SSOCS:2004 were allocated to instructional levels in a similar manner.

The same variables and categories used to create strata in SSOCS:2000 and SSOCS:2004 were used in SSOCS:2006. The population of schools was stratified (grouped) into four instructional levels (primary, middle, high, and combined), four types of locale (city, urban fringe, town, and rural), and four enrollment size categories (less than 300 students, 300-499 students, 500-999 students, and 1,000 students or more). These variables were chosen because they have been shown to be associated with school crime (Miller 2004). The sample of schools in each instructional level was allocated to each of the 16 cells formed by the cross-classification of the four enrollment size categories and the four locale types. To obtain a reasonable sample size of lower enrollment schools while giving a higher probability of selection to higher enrollment schools, the sample was allocated to each subgroup in proportion to the sum of the square roots of the total student enrollment of each school in that stratum. The effective sample size within each stratum was then inflated to account for nonresponse. Once the final sample sizes were determined for each of the 64 strata, the subgroups were sorted by region and percent minority enrollment, and an initial sample of 3,565 schools was selected. Sorting by these variables before selection has the same effect as stratification with proportional allocation of schools to the strata. For more information on the sample design, see chapter two of the School Survey on Crime and Safety: 2005-2006 Data File User's Manual (Bauer et al. 2007).
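As an illustration of the square-root allocation rule described above, the sketch below distributes a target sample across strata in proportion to the sum of the square roots of school enrollments. The strata and enrollment figures are invented for the example; only the allocation rule comes from the text.

```python
# Illustrative square-root allocation: each stratum's share of the sample is
# proportional to the sum of sqrt(enrollment) over its schools, so larger
# schools raise a stratum's share, but with dampened (square-root) influence.
import math

def sqrt_allocate(strata, total_sample):
    """strata: dict mapping stratum name -> list of school enrollments.
    Returns a rounded allocation (rounding may shift the total by 1-2)."""
    weights = {s: sum(math.sqrt(e) for e in enrollments)
               for s, enrollments in strata.items()}
    total = sum(weights.values())
    return {s: round(total_sample * w / total) for s, w in weights.items()}

# Three hypothetical strata with made-up enrollments.
strata = {
    "small":  [250, 280, 290, 260],
    "medium": [600, 650, 700],
    "large":  [1200, 1500],
}
print(sqrt_allocate(strata, total_sample=100))
# {'small': 30, 'medium': 35, 'large': 34}
```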

Universe: Public elementary and secondary schools in the United States.

Unit of Observation: school

Data Type: survey data

A unit response rate is, at its most basic level, the ratio of surveys completed by eligible respondents to the total count of eligible respondents. In some surveys, this calculation can be rather complicated because it is difficult to distinguish between eligible and ineligible units. For school surveys, however, the Department of Education updates its list of known schools on a fairly regular basis, so estimating eligibility among nonrespondents is relatively straightforward. Unit response rates can be unweighted or weighted and are traditionally reported because they reflect the potential effects of nonsampling error and indicate whether portions of the population are underrepresented due to nonresponse. In order to calculate any of these measures, it is first necessary to know the disposition (outcome) of each sampled case. Table B-1 in the Codebook shows the dispositions of the 3,565 cases initially selected for participation in SSOCS:2006, as well as the unweighted and weighted unit response rates by selected school characteristics. The overall weighted unit response rate was 81 percent.
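The rates described here follow directly from the definition above. The sketch below computes an unweighted and a weighted unit response rate from case dispositions; the disposition labels, weights, and counts are illustrative assumptions, not actual SSOCS values.

```python
# Hedged sketch of unit response rates: completed eligible cases over all
# eligible cases, unweighted and weighted by each case's design weight.

def response_rates(cases):
    """cases: list of (disposition, weight) where disposition is
    'complete', 'eligible_nonrespondent', or 'ineligible'."""
    eligible = [(d, w) for d, w in cases if d != "ineligible"]
    completed = [(d, w) for d, w in eligible if d == "complete"]
    unweighted = len(completed) / len(eligible)
    weighted = sum(w for _, w in completed) / sum(w for _, w in eligible)
    return unweighted, weighted

# Four hypothetical cases; the ineligible school drops out of both rates.
cases = [("complete", 12.0), ("complete", 3.5),
         ("eligible_nonrespondent", 8.0), ("ineligible", 5.0)]
print(response_rates(cases))  # (0.666..., 0.659...)
```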


2010-03-04

2018-02-15 The citation of this study may have changed due to the new version control system that has been implemented. The previous citation was:
  • United States Department of Education. Institute of Education Sciences. National Center for Education Statistics. School Survey on Crime and Safety (SSOCS), 2006. ICPSR25421-v1. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2010-03-04. http://doi.org/10.3886/ICPSR25421.v1

2010-03-04 ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:

  • Standardized missing values.
  • Checked for undocumented or out-of-range codes.

Sample weights allow inferences to be made about the population from which the sample units were drawn. Because of the complex nature of the SSOCS:2006 sample design, weights are necessary to obtain population-based estimates, to minimize bias arising from differences between responding and nonresponding schools, and to calibrate the data to known population characteristics in a way that reduces sampling error. The procedures used to create the SSOCS sampling weights are described below.

An initial (base) weight was first determined within each stratum by calculating the ratio of the number of schools available in the sampling frame to the number of schools selected. To reduce the potential for nonresponse bias, weighting classes were determined by using a statistical algorithm similar to CHAID (chi-square automatic interaction detector) to partition the sample so that schools within a weighting class were homogeneous with respect to the probability of responding. The predictor variables for the analysis were school instructional level, region, enrollment size, percent minority enrollment, student-to-teacher ratio, percentage of students eligible for free or reduced-price lunch, and number of full-time-equivalent teachers. When the number of responding schools in a class was small, that weighting class was combined with another class to avoid the possibility of large weights. After the necessary classes were combined, the base weights were adjusted so that the weighted distribution of the responding schools resembled the initial distribution of the total sample.

The nonresponse-adjusted base weights were then calibrated to agree with known population counts obtained from the sampling frame, to reduce bias in the estimates due to undercoverage. The calibration process, a form of poststratification, separates the sample into a number of classes (poststrata) defined by a cross-classification of variables. The known population counts may be available for the individual cells of the cross-classification or only for certain margins of it; in the latter situation, the calibration proceeds iteratively, one margin at a time, and is often called "raking." Poststratification works well when the noncovered population is similar to the covered population in each poststratum. Thus, to be effective, the variables that define the poststrata must be correlated with the outcome of interest (school crime, in this case), they must be well measured in the survey, and the control totals must be available for the population as a whole. As in SSOCS:2004, these requirements were satisfied in SSOCS:2006 by the two margins set up for the raking ratio adjustment of the weights: (1) instructional level and school enrollment size and (2) instructional level and locale. All three variables -- instructional level, school enrollment size, and locale -- have been shown to be correlated with school crime (Miller 2004).
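The raking step described above can be illustrated with a small iterative proportional fitting routine. The sketch below repeatedly scales weights so that each margin matches its known population total. For simplicity it uses two one-variable margins; the actual SSOCS:2006 raking used the two cross-classified margins named in the text (instructional level by enrollment size, and instructional level by locale), and every name and number here is invented for the example.

```python
# Minimal raking (iterative proportional fitting) sketch. Each pass scales
# the weights so one margin's weighted totals match its known control totals;
# alternating over margins converges when the targets are consistent.

def rake(schools, margins, n_iter=25):
    """schools: list of dicts with a 'weight' key plus the margin variables.
    margins: {variable_name: {category: known_population_total}}."""
    for _ in range(n_iter):
        for var, targets in margins.items():
            # Current weighted total in each category of this margin.
            totals = {c: 0.0 for c in targets}
            for s in schools:
                totals[s[var]] += s["weight"]
            # Scale every weight so this margin hits its control totals.
            for s in schools:
                s["weight"] *= targets[s[var]] / totals[s[var]]
    return schools

# Four hypothetical schools with nonresponse-adjusted base weights.
schools = [
    {"level": "middle", "locale": "city",  "weight": 10.0},
    {"level": "middle", "locale": "rural", "weight": 12.0},
    {"level": "high",   "locale": "city",  "weight": 15.0},
    {"level": "high",   "locale": "rural", "weight": 9.0},
]
# Invented control totals; the two margins agree on the overall total (50).
margins = {
    "level":  {"middle": 30.0, "high": 20.0},
    "locale": {"city": 28.0, "rural": 22.0},
}
rake(schools, margins)
print([round(s["weight"], 2) for s in schools])
```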


Notes

  • The public-use data files in this collection are available for access by the general public. Access does not require affiliation with an ICPSR member institution.