National Survey of Juvenile Justice Professionals, 2005-2007 [United States] (ICPSR 26381)

Published: Mar 21, 2013

Principal Investigator(s):
Janeen Buck Willison, Urban Institute; Christy Visher, University of Delaware; Daniel P. Mears, Florida State University; Jeffrey A. Butts, University of Chicago

https://doi.org/10.3886/ICPSR26381.v1

Version V1

This study involved a survey of juvenile court judges, chief probation officers, prosecutors, and public defenders to measure their impressions of recent policy changes and the critical needs facing today's juvenile justice system. In addition, the study gathered recommendations for improving the administration and effectiveness of this system. The study's primary objective was to provide policymakers, administrators, and practitioners with actionable information about how to improve the operations and effectiveness of the juvenile justice system, and to examine the role practitioners could play in constructing sound juvenile justice policy. A total of 534 juvenile court judges, chief probation officers, court administrators, prosecutors, and defense attorneys in 44 states and the District of Columbia participated in the Assessing the Policy Options (APO) national practitioner survey. The survey consisted of four major sections: demographics, critical needs, policies and practices, and practitioner recommendations. Critical needs facing the juvenile justice system were measured by asking respondents about the policy priority of 13 issues in their respective jurisdictions; topics ranged from staff training and development to effective juvenile defense counsel to information technology. Respondents were also asked to assess the effectiveness of 17 different policies and practices -- ranging from parental accountability laws to transfer and treatment -- in achieving 6 vital juvenile justice outcomes.

Willison, Janeen Buck, Visher, Christy, Mears, Daniel P., and Butts, Jeffrey A. National Survey of Juvenile Justice Professionals, 2005-2007 [United States]. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2013-03-21. https://doi.org/10.3886/ICPSR26381.v1

Export Citation:

  • RIS (generic format for RefWorks, EndNote, etc.)
  • EndNote

United States Department of Justice. Office of Justice Programs. National Institute of Justice (2005-IJ-CX-0039)

county

A downloadable version of data for this study is available; however, certain identifying information in the downloadable version may have been masked or edited to protect respondent privacy. Additional data not included in the downloadable version are available in a restricted version of this data collection. For more information about the differences between the downloadable data and the restricted data for this study, please refer to the codebook notes section of the PDF codebook. Users interested in obtaining restricted data must complete and sign a Restricted Data Use Agreement, describe the research project and data protection plan, and obtain IRB approval or notice of exemption for their research.

2005-10 -- 2007-12

2005-10 -- 2007-12

The study's primary objective was to provide policymakers, administrators, and practitioners with actionable information about how to improve the operations and effectiveness of the juvenile justice system, and to examine the role practitioners could play in constructing sound juvenile justice policy.

A secondary goal was to provide a forum in which practitioners could directly weigh in on the policies and practices they are frequently mandated to implement.

Between October 2005 and December 2007, a total of 534 juvenile court judges, chief probation officers, court administrators, prosecutors, and defense attorneys in 44 states and the District of Columbia participated in the Assessing the Policy Options (APO) national practitioner survey. The APO National Practitioner Survey was an online, self-administered questionnaire consisting of four major sections. Survey items included measures of critical need, the perceived effectiveness of 17 prominent policies and practices in the juvenile justice system, and a limited set of demographic characteristics. The response format was a five-point Likert-type scale (strongly agree, agree, disagree, strongly disagree, or don't know). The only open-ended survey question captured respondent policy recommendations. Respondents logged on to the survey using a unique username and private password assigned by the Urban Institute; online instructions reminded respondents that participation was voluntary and completely confidential. Most completed the survey in 15 minutes. To increase participation, an identical paper version of the survey was also mailed to respondents along with instructions for completing the survey online. Roughly two-thirds of respondents chose to complete the paper-and-pencil version of the survey rather than the online version.

The initial sampling frame consisted of the nation's 300 most-populated counties, but was later reduced to 285 counties in response to one state's policies governing contact with judges and other court officials. The 285 counties that formed the study's final sampling frame encompassed 45 states and the District of Columbia, and accounted for roughly two-thirds (62 percent) of the United States population. Counties were identified using the most recent United States Census figures available when the project commenced in 2005. Viewed as a distinct population of counties and not representative of the remaining United States counties, the 285-county sampling frame provided a basis for estimating how juvenile justice professionals in the largest counties -- and arguably the counties responsible for processing the vast majority of youthful offenders -- viewed prominent juvenile justice reforms. In short, the study sought to generalize across, not within, counties.

The survey targeted four groups of juvenile justice professionals as uniquely positioned to provide insights about key issues: (1) juvenile court judges; (2) prosecutors; (3) defense attorneys; and (4) chief probation officers and court administrators, referred to collectively as the Court Personnel group. Absent a national "master" list of juvenile justice professionals, researchers consulted multiple sources and employed snowball sampling techniques to assemble respondent lists for each of the four practitioner groups. Ultimately, 1,032 individuals across 285 counties, representing 282 jurisdictions, were identified and invited to participate in the APO national survey. A total of 534 juvenile justice professionals participated in the survey.

Cross-sectional

All juvenile justice practitioners living in 285 of the 300 most-populated counties in the United States between October 2005 and December 2007.

individual

Urban Institute National Survey of Juvenile Justice Professionals.

survey data

The study contains 420 variables and consisted of four major sections: demographics, critical needs, policies and practices, and practitioner recommendations. Critical needs facing the juvenile justice system were measured by asking respondents about the policy priority of 13 issues in their respective jurisdictions; topics ranged from staff training and development to effective juvenile defense counsel to information technology. Respondents were also asked to assess the effectiveness of 17 different policies and practices -- ranging from parental accountability laws to transfer and treatment -- in achieving 6 vital juvenile justice outcomes: less crime in the community, less recidivism by young offenders, appropriate punishment of young offenders, fair treatment of young offenders, efficiency of the justice process, and traditional mission of juvenile justice.

The final sample of 534 juvenile justice professionals resulted in an overall survey response rate of 52 percent. The denominator, in this instance, consisted of the number of viable practitioner contacts obtained across the four target groups (roughly 1,032 individuals), not the number of counties in the sample multiplied by four (285 × 4 = 1,140).
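The response-rate arithmetic above can be verified directly from the figures given in the study description. This is an illustrative check only; the variable names are not drawn from the codebook.

```python
# Verify the reported 52 percent response rate using figures from
# the study description: 534 respondents out of roughly 1,032
# viable practitioner contacts across the four target groups.
respondents = 534
viable_contacts = 1032

response_rate = respondents / viable_contacts
print(f"{response_rate:.0%}")  # 52%

# The naive denominator the description cautions against:
# counties in the sampling frame times the four practitioner groups.
counties = 285
practitioner_groups = 4
print(counties * practitioner_groups)  # 1140
```

Using the county-based denominator would understate the response rate (534 / 1,140 is about 47 percent), which is why the study bases the rate on viable contacts instead.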

Several Likert-type scales were used.

2013-03-21

2013-03-21

2013-03-21 ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:

  • Standardized missing values.
  • Checked for undocumented or out-of-range codes.

Notes

  • These data are part of NACJD's Fast Track Release and are distributed as they were received from the data depositor. The files have been zipped by NACJD for release, but not checked or processed except for the removal of direct identifiers. Users should refer to the accompanying readme file for a brief description of the files available with this collection and consult the investigator(s) if further information is needed.

  • The public-use data files in this collection are available for access by the general public. Access does not require affiliation with an ICPSR member institution.

  • One or more files in this data collection have special restrictions. Restricted data files are not available for direct download from the website; click on the Restricted Data button to learn more.

  • The citation of this study may have changed due to the new version control system that has been implemented.

This dataset is maintained and distributed by the National Archive of Criminal Justice Data (NACJD), the criminal justice archive within ICPSR. NACJD is primarily sponsored by three agencies within the U.S. Department of Justice: the Bureau of Justice Statistics, the National Institute of Justice, and the Office of Juvenile Justice and Delinquency Prevention.