Evaluation of the Domestic Violence Homicide Prevention Demonstration Initiative: Collaboration Surveys, 5 U.S. states, 2015-2019 (ICPSR 38133)

Version Date: Aug 16, 2022

Principal Investigator(s):
Joy S. Kaufman, Yale University

https://doi.org/10.3886/ICPSR38133.v1

Version V1


The National Institute of Justice (NIJ) and the Office on Violence Against Women (OVW) have evaluated the implementation process and impact of the U.S. Department of Justice's Domestic Violence Homicide Prevention Demonstration Initiative. The evaluation was conducted by a team of investigators from Yale University and Michigan State University.

The demonstration initiative (DI) included three implementation sites (California, North Carolina, and Illinois). In addition, two comparison, or typically implementing, sites (Michigan and Tennessee) were included in the evaluation. The sites implemented the Lethality Assessment Program (LAP), developed by the Maryland Network Against Domestic Violence.

A web-based survey was used to gather data to assess changes in collaboration within each of the sites. Respondents from agencies that provide support to victims of domestic violence and to offenders reported on their level of collaboration with other named agencies in their networks at two or three time points. Data sets are at the site level. Social network analysis was conducted to assess how each network changed over time.
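Because the data are site-level networks observed at multiple waves, simple structural summaries such as density can be compared across time points. The sketch below is illustrative only: the agency IDs, collaboration levels, and edge-list layout are invented, not taken from the actual data files.

```python
# Hypothetical edge list for one site at one wave: each tuple is
# (reporting agency ID, named agency ID, reported collaboration level).
# Values are illustrative, not drawn from the study data.
edges_t1 = [
    (1, 2, 3),  # Agency 1 reports "Coordination" with Agency 2
    (1, 3, 1),  # Agency 1 reports "Networking" with Agency 3
    (2, 3, 5),  # Agency 2 reports "Collaboration" with Agency 3
]

def density(edges):
    """Directed-network density: observed ties / possible ordered pairs."""
    nodes = {agency for src, dst, _ in edges for agency in (src, dst)}
    n = len(nodes)
    return len(edges) / (n * (n - 1)) if n > 1 else 0.0

density_t1 = density(edges_t1)  # 3 ties among 3 agencies -> 0.5
```

Computing the same measure (or centrality scores) at each wave is one way to quantify whether a site's network became more connected over time; in practice a dedicated library such as networkx would be used for larger networks.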

Kaufman, Joy S. Evaluation of the Domestic Violence Homicide Prevention Demonstration Initiative: Collaboration Surveys, 5 U.S. states, 2015-2019. Inter-university Consortium for Political and Social Research [distributor], 2022-08-16. https://doi.org/10.3886/ICPSR38133.v1

United States Department of Justice. Office of Justice Programs. National Institute of Justice (2013-ZD-CX-0001)

State

Access to these data is restricted. Users interested in obtaining these data must complete a Restricted Data Use Agreement, specify the reasons for the request, and obtain IRB approval or notice of exemption for their research.

Inter-university Consortium for Political and Social Research

2015 -- 2019
  1. Agencies sometimes opened or closed across the implementation years. If a new agency was added, it was assigned a brand-new number; if an agency closed, its number was never reused for a new agency. Agency numbers remained the same across the implementation years: for example, Agency 12 at Time 1 is still Agency 12 at Time 2.

  2. Each respondent first noted which agencies they interacted with; all subsequent questions were asked only about the agencies that respondent selected. For example, if a respondent said they worked only with Agencies 1, 2, and 3, they were not asked any questions about Agencies 4 through 30, and those agencies were automatically assigned the value -1="N/A".
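Given that skip pattern, analysts generally need to treat -1 as missing rather than as a low collaboration score before computing anything. A minimal sketch, using hypothetical variable names (the actual names are defined in the study codebook):

```python
# Hypothetical respondent record: collaboration level reported for each
# named agency, with -1 = "N/A" (agency not selected, questions skipped).
# Variable names and values are illustrative only.
respondent = {"agency_1": 3, "agency_2": -1, "agency_3": 0}

# Recode -1 to None so "not asked" (-1) is not confused with
# "no interaction at all" (0), which is a substantive answer.
cleaned = {k: (None if v == -1 else v) for k, v in respondent.items()}
```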


This evaluation sought to draw critical findings and potential lessons from the combined experience of the sites, including: 1) how the models work in different communities; 2) barriers and facilitators to implementing the models; 3) the experience of victims participating in the programs; and 4) outcomes of model delivery.

The survey asks respondents to identify which of the network agencies they collaborate with, to rate the extent of their agency's interactions with those agencies, and to indicate which agencies they would like to collaborate with more in the future. It also includes a series of questions probing for more information on agency referrals; facilitators of and barriers to collaboration; and the services and supports needed within communities to more effectively support victims of domestic violence.

The survey is web-based and includes a series of questions about the respondent's role in their organization and their agency's collaborations within the network. Surveys were anonymous, could be taken at the respondent's convenience, and took about 15 minutes to complete. Each agency was asked to identify two respondents: an individual in a leadership position and an individual familiar with how the organization collaborates with partners to address domestic violence in the community.

Longitudinal: Panel

Agencies in a community that provide services to community members including Law Enforcement, Domestic Violence Service Providers, Counseling/Therapy, Housing, Employment, Victim Services, etc.

Individual

SITE 1: At Time 1, 13 of the 30 agencies (43%) that were invited responded to the survey. Ten agencies (33%) had one respondent, two agencies (7%) had two respondents, and one agency (3%) had three respondents, for a total of 17 respondents out of the 60 requested (28%). At Time 2, agency participation remained the same: 13 of the 30 agencies (43%) responded. Five agencies (17%) had one respondent, five agencies (17%) had two respondents, two agencies (7%) had three respondents, and one agency (3%) had five respondents, for a total of 26 respondents out of the 60 requested (43%). At Time 3, 29 agencies were invited to participate (due to an agency closure in 2018), and agency participation increased: 18 of the 29 agencies (62%) responded. Twelve agencies (41%) had one respondent, four agencies (14%) had two respondents, and two agencies (7%) had four respondents, for a total of 28 respondents out of the 58 requested (48%).

SITE 2: At Time 1, 27 of the 30 agencies (90%) that were invited responded to the survey. Eight agencies (27%) had one respondent, sixteen agencies (53%) had two respondents, and three agencies (10%) had three respondents, for a total of 49 respondents out of the 60 requested (82%). At Time 2, the overall agency response rate remained the same: 26 of the 29 agencies (90%) responded. Of the responding agencies, thirteen (45%) had one respondent, twelve (41%) had two respondents, and one (3%) had three respondents, for a total of 40 respondents out of the 58 requested (69%).

SITE 3: At Time 1, 28 of the 37 agencies (76%) that were invited responded to the survey. Thirteen agencies (35.1%) had one respondent, 14 agencies (37.8%) had two respondents, and one agency (2.7%) had three respondents, for a total of 44 respondents out of the 74 requested (59%). At Time 2, the overall agency response rate remained about the same: 27 of the 35 agencies (77%) responded. Of the responding agencies, 15 (43%) had one respondent and 12 (34%) had two respondents, for a total of 39 respondents out of the 70 requested (56%).

SITE 4: At Time 1, 13 of the 20 agencies (65%) that were invited responded to the survey. Eight agencies (40%) had one respondent, three agencies (15%) had two respondents, and two agencies (10%) had three respondents, for a total of 20 respondents out of the 40 requested (50%). At Time 2, 16 of the 21 agencies (76%) that were invited responded. Of the responding agencies, two (10%) had one respondent, one (5%) had four respondents, and two (10%) had five respondents, for a total of 16 respondents out of the 42 requested (38%). At Time 3, agency participation decreased: 11 of the 20 agencies (55%) responded. Eight agencies (40%) had one respondent, two agencies (10%) had two respondents, and one agency (5%) had five respondents, for a total of 17 respondents out of the 40 requested (43%).

SITE 5: At Time 1, 18 of the 29 agencies (62%) that were invited responded to the survey. Eleven agencies (38%) had one respondent, four agencies (14%) had two respondents, and three agencies (10%) had three respondents, for a total of 28 respondents out of the 58 requested (48%). At Time 2, agency participation decreased: 13 of the 29 agencies (45%) responded. Ten agencies (35%) had one respondent and three agencies (10%) had two respondents, for a total of 16 respondents out of the 58 requested (28%).

The Yale Evaluation Team created the scales for this survey. Respondents were asked what level of collaboration they had with each agency. Collaboration levels were defined as:

  • None: No Interaction at all
  • Networking: Aware of organization, Loosely defined roles, Little communication, All decisions made independently.
  • Cooperation: Provide Information to each other, Somewhat defined roles, Formal communication, All decisions are made independently.
  • Coordination: Share information and resources, Defined roles, Frequent communication, Some shared decision making.
  • Coalition: Share ideas, Share resources, Frequent and prioritized communication, Advise each other on decision making.
  • Collaboration: Belong to same provider system, Frequent communication characterized by mutual trust, Consensus is reached on all decisions.

Respondents were asked how many referrals they sent to each of these agencies and how many they received from them. Referral levels were defined as:

  • 0 Never
  • 1 A few times per year
  • 2 About once per month
  • 3 A couple times a month
  • 4 Weekly
  • 5 More than 1 time per week
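The two ordinal scales can be represented as simple lookup tables. The referral codes 0-5 are listed above; the 0-5 numeric coding assumed for the collaboration levels below is illustrative, so verify the actual values against the study codebook.

```python
# Assumed 0-5 coding for the collaboration scale (verify against the codebook).
COLLABORATION_LEVELS = {
    0: "None",
    1: "Networking",
    2: "Cooperation",
    3: "Coordination",
    4: "Coalition",
    5: "Collaboration",
}

# Referral frequency codes as listed in the documentation above.
REFERRAL_LEVELS = {
    0: "Never",
    1: "A few times per year",
    2: "About once per month",
    3: "A couple times a month",
    4: "Weekly",
    5: "More than 1 time per week",
}

def label(scale, code):
    """Map a numeric code to its text label; -1 is the 'N/A' skip code."""
    return "N/A" if code == -1 else scale[code]
```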


2022-08-16


Not applicable


Notes

  • The public-use data files in this collection are available for access by the general public. Access does not require affiliation with an ICPSR member institution.

  • ICPSR usually offers files in multiple formats so that researchers can access data and documentation in formats that meet their needs. If you have questions about the accessibility of materials distributed by ICPSR or require further assistance, please visit ICPSR's Accessibility Center.

  • One or more files in this data collection have special restrictions. Restricted data files are not available for direct download from the website; click on the Restricted Data button to learn more.


This dataset is maintained and distributed by the National Archive of Criminal Justice Data (NACJD), the criminal justice archive within ICPSR. NACJD is primarily sponsored by three agencies within the U.S. Department of Justice: the Bureau of Justice Statistics, the National Institute of Justice, and the Office of Juvenile Justice and Delinquency Prevention.