Court Responses to Batterer Program Noncompliance in the United States, 2005-2006 (ICPSR 20346)
Principal Investigator(s): Labriola, Melissa, Center for Court Innovation; Rempel, Michael, Center for Court Innovation; O'Sullivan, Chris S., Volunteer Counseling Services; Frank, Phyllis B., Volunteer Counseling Services; McDowell, Jim, Volunteer Counseling Services; Finkelstein, Rachel, Center for Court Innovation
The purpose of this study was to explore the extent to which criminal courts nationwide advance the goal of accountability by imposing consequences on offenders who are noncompliant with a batterer program mandate. The study also sought to understand the goals that courts, batterer programs, and victim assistance agencies currently ascribe to batterer programs. In March 2005, a preliminary survey was sent to 2,445 batterer programs nationwide found through multiple sources. Preliminary survey results were analyzed, and a final sample of 260 communities or triads (courts, batterer programs, and victim assistance agencies) was selected. Respondents were asked to complete a Web-based survey in May 2006; alternatively, respondents could request a hard-copy version of the survey. The variables in this study encompass community demographic information, the functions that court mandates to batterer programs serve, and the primary focus of the curriculum of batterer programs. Variables specific to batterer programs capture whether the program accepts court-mandated referrals only or volunteers as well, the length and duration of the program, possible reasons for noncompliance, and an approximate program completion rate. Variables related to the interaction between courts and batterer programs capture whether the court receives progress reports from the batterer program and, if so, when and who receives them.
One or more files in this study are not available for download due to special restrictions; consult the restrictions note to learn more. You can apply online for access to the data; a login is required to apply.
A downloadable version of data for this study is available; however, certain identifying information in the downloadable version may have been masked or edited to protect respondent privacy. Additional data not included in the downloadable version are available in a restricted version of this data collection. For more information about the differences between the downloadable data and the restricted data for this study, please refer to the codebook notes section of the PDF codebook. Users interested in obtaining restricted data must complete and sign a Restricted Data Use Agreement, describe the research project and data protection plan, and obtain IRB approval or notice of exemption for their research.
Labriola, Melissa, Michael Rempel, Chris S. O'Sullivan, Phyllis B. Frank, Jim McDowell, and Rachel Finkelstein. Court Responses to Batterer Program Noncompliance in the United States, 2005-2006. ICPSR20346-v1. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2007-11-02. http://doi.org/10.3886/ICPSR20346.v1
Persistent URL: http://doi.org/10.3886/ICPSR20346.v1
This study was funded by:
- United States Department of Justice. Office of Justice Programs. National Institute of Justice (2004-WG-BX-0005)
Scope of Study
Smallest Geographic Unit: none
Geographic Coverage: United States
Unit of Observation: agency
Universe: All criminal courts, batterer programs, and victim assistance agencies in the United States in 2005
Data Types: survey data
Study Purpose: The purpose of this study was to explore the extent to which criminal courts nationwide advance the goal of accountability by imposing consequences on offenders who are noncompliant with a batterer program mandate. The study also sought to understand the goals that courts, batterer programs, and victim assistance agencies currently ascribe to batterer programs. The study's focus was on the systemic response to noncompliance with a court order to a batterer program, rather than the success of batterer programs in preventing recidivism among program completers. The study was motivated by five main questions: (1) Rationale: Why do criminal courts use batterer programs? (2) Policies and practices regarding court mandates to batterer programs: How do criminal courts use batterer programs? (3) Policies and practices regarding enforcement of mandates: Do criminal courts enforce their mandates to batterer programs by imposing jail or other sanctions in response to noncompliance? (4) Cross-agency consistency: Within communities, are the perceptions of batterer programs, criminal courts, and victim assistance agencies in alignment? (5) Impact of contextual characteristics: Does the use and enforcement of batterer program mandates vary across communities with different characteristics? Additional questions included: when and how courts mandate offenders to these programs; what, if any, other types of programs courts mandate for domestic violence offenders (e.g., alcohol treatment, mental health treatment, or parenting); and to what extent courts, batterer programs, and victim assistance agencies concur on their answers to these questions.
Study Design: In order to explore the extent to which criminal courts use batterer programs to hold domestic violence offenders accountable, a preliminary survey was sent to all identified batterer programs. Surveys were mailed to 2,445 sites with self-addressed return envelopes in March 2005. The survey was only one page, printed on the back of a brief cover letter stating the study's purpose in general terms and providing assurances of confidentiality. The preliminary survey was designed to maximize the number of responding programs, to obtain contact information on criminal courts and victim assistance agencies, and to gain basic information that would inform the sampling process. In May 2006, a survey of 260 communities nationwide was implemented. Within each community, a batterer program, a criminal court, and a victim assistance agency were surveyed. All of the programs, courts, and agencies selected for the community survey were sent a letter that stated the survey's purpose, asked recipients to complete the survey online, provided a unique online username and password, and described confidentiality and data security protocols approved by the Center for Court Innovation. (The online format enabled the researchers to program automatic skip-patterns, so that each respondent would only see those questions requiring an answer based upon previous responses.) The letter also provided contact information for respondents who wished to receive and complete a hard copy of the survey. After five weeks, a reminder postcard was sent to nonresponders. Ten weeks later, hard copies of the survey were sent to nonresponders. Simultaneously, the researchers used intensive phone and e-mail follow-ups with those nonresponders for whom there was no accurate contact information. Finally, the director of the National Institute of Justice signed a letter that was sent to all remaining nonresponders (with survey attached) indicating the importance of the study and requesting their participation.
Sample: To identify a geographically diverse sample of mixed community types, a preliminary survey was sent to 2,445 batterer programs nationwide found through multiple sources. Approximately 70 percent were identified with the help of state domestic violence coalitions, over 25 percent were found through Internet searches, and approximately 3 percent came through informal contacts made by an advisory board and co-principal investigator. The sampling approach did not call for a representative sample of all batterer programs or criminal courts nationwide. Rather, a sample was sought that provided a mix of community types (small, medium, and large; urban, suburban, and rural) and was geographically comprehensive, representing every state and region, including those with smaller populations. Researchers also sought to over-represent communities known to deal with high numbers of batterer program mandates relative to their populations, in order to avoid skewing the final sample toward the experiences of batterer programs and courts that process a low number of cases per year. When the preliminary survey of these programs was conducted, 149 letters were returned addressee unknown, and it was discovered that 30 of the agencies were not actually batterer programs. A population of 2,265 batterer programs remained. The final sample selection process involved reviewing the responses to the preliminary survey one state at a time. Selection in each state was driven by the following preference rules: (1) whether the batterer program's referrals originated with the criminal court, (2) whether the responding batterer program was able to identify a local victim assistance agency, (3) whether the responding batterer program received both pre- and postdisposition referrals, (4) the relative volume of cases received, (5) population density, and (6) a limit of three to five communities per state. The sampling process led to the selection of 260 communities, or triads.
This sample included 48 percent of the 543 batterer programs that responded to the preliminary survey. In 28 percent of the communities in the final sample, the batterer program and victim assistance agency were one and the same. The final sample comprised three to five communities in 24 states (48 percent), more than five in 18 states (36 percent), and fewer than three in eight states (16 percent). Nonresponders to the court and victim assistance agency surveys were more likely to be from the Northeast.
Mode of Data Collection: Web-based survey, mail questionnaire
Description of Variables: All respondents were asked to define their community as primarily urban, suburban, or rural. Furthermore, all respondents were asked the same questions concerning the function of court mandates to batterer programs: treatment/rehabilitation, monitoring, accountability, legally appropriate punishment, alternative to incarceration, or other. All respondents were asked how courts respond to noncompliance. Courts and batterer programs were asked how often the court imposed sanctions in response to noncompliance, how often the court imposed each of eight specific sanctions, and whether the court responded consistently to reports of noncompliance. Victim assistance agencies were asked how often the court imposed sanctions and whether the court responded consistently to reports of program noncompliance. Both courts and batterer programs were asked whether state laws, regulations, or standards existed governing the use of the batterer programs and whether batterer programs were certified by the state. Courts and batterer programs were asked how they interact. Questions on the batterer program survey included whether, when, and to whom in the criminal justice system the program reported on compliance. The court survey included questions on whether the court received progress reports from the batterer program, and if so, when and who received them. Courts were asked why they used batterer programs. 
Specific questions asked included: what types of charges lead to batterer program mandates, when these mandates are imposed (pre- and/or postdisposition), whether and how probation is included, approximately how many defendants were mandated per month, whether the court was aware of the specific program in which each defendant enrolls, and which other types of program mandates the court sometimes imposed on domestic violence offenders (alcohol treatment, substance abuse treatment, mental health treatment, or parenting classes) instead of a mandate to a batterer program. Batterer programs were asked whether the program accepted court-mandated referrals only or voluntary referrals as well, the length and duration of the program, possible reasons for noncompliance, the approximate program completion rate, and the primary focus of its curriculum. Victim assistance agencies were asked what they thought the primary focus of the batterer program curriculum should be: addressing participant mental health issues, confronting attitudes towards intimate partners, educating participants about domestic violence, holding participants accountable, or teaching communication and coping skills.
Response Rates: Of the 2,265 programs initially contacted, 543 responded (24 percent). The final sample of 260 communities was then selected from this pool of 543. In each of these 260 communities, surveys were administered to a batterer program, a court, and a victim assistance agency. Repeated follow-up efforts culminated in final response rates of 75 percent for the batterer programs (n = 195), 53 percent for the courts (n = 139), and 62 percent for the victim assistance agencies (n = 162). There was at least one response from 94 percent of the target communities, and at least one response from either the batterer program or the court in 88 percent of the communities.
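The response-rate figures above are simple proportions of the counts reported in this record; the short check below (a sketch using only those reported counts) confirms that the rounded percentages are internally consistent.

```python
# Counts as reported in this study record.
preliminary_contacted = 2265
preliminary_responded = 543

final_communities = 260
final_responses = {
    "batterer programs": 195,
    "courts": 139,
    "victim assistance agencies": 162,
}

# Preliminary survey: 543 of 2,265 responded, i.e., about 24 percent.
print(round(100 * preliminary_responded / preliminary_contacted))  # 24

# Final survey response rates per respondent type, out of 260 communities.
for group, n in final_responses.items():
    print(group, round(100 * n / final_communities))
# batterer programs 75, courts 53, victim assistance agencies 62
```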
Presence of Common Scales: Several Likert-type scales were used.
Extent of Processing: ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:
- Checked for undocumented or out-of-range codes.
Original ICPSR Release: 2007-11-02