Smallest Geographic Unit:
Date of Collection:
Unit of Observation:
All United States municipal police agencies that had at least 300 sworn officers and were listed in the 2007 National Directory of Law Enforcement Administrators.
administrative records data
Data Collection Notes:
ICPSR masked variables containing identifying information on the police agencies, including street address, city, state, zip code, phone number, fax number, and census information. This information is not available to secondary users of the data, even under restricted access procedures.
The primary objective of this study was to formulate evidence-based lessons on recruitment, retention, and managing workforce profiles in large, United States police departments. Specifically, the research team sought to address the following questions:
- What is the personnel situation that large agencies currently encounter? What approaches have they taken or what problems do they perceive in recruiting, compensating, promoting, and retaining personnel? How do these perceived problems vary by agency characteristics, such as size?
- What affects the supply of police recruits? What economic variables might affect that supply, and how do perceived recruiting problems compare to likely actual ones? How might crime rates affect the number of recruits a police department can attract?
- What challenges might agencies face in career management? How does the profile of their officers by experience and rank compare to an ideal one that can avoid oscillations from very senior to very junior forces? What do agencies do to control oscillations once they start?
The research team conducted a national survey of 146 large police agencies on their practices in recruitment, retention, and developing a data-driven workforce profile. The survey instrument was based on the research team's experience in working with large personnel systems, instruments used in previous police staffing surveys, and discussions with police practitioners. The researchers collected initial feedback on the survey instrument through face-to-face meetings in Pittsburgh and Dallas with sworn and civilian staff having personnel administration responsibilities. A second round of testing involved sending emails to the contact person in the Columbus and Las Vegas departments to briefly explain the study and request assistance with questionnaire development. Once these departments agreed, a paper-and-pencil questionnaire was mailed to the contact person for completion. The contact person was asked to note any comments on problem questions, confusing terms, or insufficient answer choices. After receiving the completed questionnaire, the researchers called the contact person to discuss the responses and comments to the survey. A final face-to-face round of testing was conducted with the departments in New Orleans and Pittsburgh.
To ensure an acceptable response rate, the research team developed a comprehensive nonresponse protocol, provided ample field time so that departments could take time to compile information and respond, and provided significant one-on-one technical assistance to agencies as they completed the survey. Specifically, the principal investigators began by sending each department a full survey packet with the instrument, letters of endorsement from the RAND Corporation and the National Institute of Justice, and a postcard to request contact information for survey follow-up. Initial surveys were mailed on February 27, 2008. Additional copies of the survey were mailed to nonrespondents on April 8, 2008, and again on May 19, 2008. Nonrespondents were also called a week and a half after each mailing to confirm that the packet had been received, encourage completion, and answer any questions. A final reminder was sent to nonrespondents on August 20, 2008, with follow-up calls on September 23 and 24. Altogether, the survey was in the field for 38 weeks.
Respondents were asked to consult their agency's records in order to provide information about their agency's experience with recruiting, hiring, and retaining officers for 2006 and 2007. Of the 146 departments contacted, 107 completed the survey. The police recruitment and retention survey data were supplemented with data on each jurisdiction from the American Community Survey conducted by the United States Census Bureau, the Bureau of Labor Statistics, and the Federal Bureau of Investigation Uniform Crime Reports.
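The supplementation described above amounts to a jurisdiction-level merge of the survey responses with external covariates. A minimal sketch of that kind of merge, using pandas with hypothetical column names and values (not the study's actual variables or files), might look like:

```python
import pandas as pd

# Hypothetical survey responses, keyed by jurisdiction
# (illustrative names, not the study's actual variables).
survey = pd.DataFrame({
    "jurisdiction": ["City A", "City B"],
    "sworn_officers": [450, 1200],
})

# Hypothetical jurisdiction-level covariates assembled from external
# sources (e.g., ACS population, BLS unemployment, UCR crime rates).
covariates = pd.DataFrame({
    "jurisdiction": ["City A", "City B"],
    "unemployment_rate": [4.1, 5.6],
    "violent_crime_rate": [380.0, 610.0],
})

# Left-join so every surveyed department is retained even when a
# covariate value is unavailable for its jurisdiction.
merged = survey.merge(covariates, on="jurisdiction", how="left")
print(merged.shape)  # (2, 4)
```

A left join is the natural choice here: the surveyed departments define the analysis sample, and missing external data should surface as NaN rather than drop a department.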
The sample for this survey came from the 2007 National Directory of Law Enforcement Administrators (NDLEA). The NDLEA is a standard, comprehensive list of police agencies that has been used as the sampling frame for national surveys in previous studies that have passed rigorous peer review. The researchers limited the scope of their study to United States municipal police agencies with at least 300 sworn officers for a total of 146 departments.
Mode of Data Collection:
RAND 2008 Police Recruitment and Retention Survey
American Community Survey, United States Census Bureau
United States Bureau of Labor Statistics, Occupational Employment Statistics
United States Department of Justice, Federal Bureau of Investigation. Uniform Crime Reporting Program Data [United States]
Description of Variables:
The dataset contains a total of 535 variables pertaining to departmental statistics for 2006 and 2007. The variables cover general recruitment and hiring (statistics on hiring and recruitment strategies, incentives, costs, and standards); unions; pay and benefits; promotion possibilities; retirement; attrition rates; and other departmental statistics, including years of continuous service by rank, officers in each department by rank, sex, and race for 2006 and 2007, and authorized versus actual force strength for the same periods. Variables are also included relating to the area's unemployment rate, geographic size, and the number of citizens served.
Of the 146 departments contacted, 107 completed the survey (6 departments refused, and the remainder neither made contact nor provided information), resulting in a 73.3 percent response rate.
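The reported response rate follows directly from the counts above (107 completions out of 146 contacted departments):

```python
completed = 107
contacted = 146

# Response rate as a percentage, rounded to one decimal place.
response_rate = round(100 * completed / contacted, 1)
print(response_rate)  # 73.3
```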
Presence of Common Scales:
A Likert-type scale (no difficulty, some difficulty, much difficulty) was used in some variables.
Extent of Processing: ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats, as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:
- Created variable labels and/or value labels.
- Standardized missing values.
- Checked for undocumented or out-of-range codes.