Mamalian, Cynthia A., Nancy G. LaVigne, and Elizabeth Groff. USE OF COMPUTERIZED CRIME MAPPING BY LAW ENFORCEMENT IN THE UNITED STATES, 1997-1998. Conducted by U.S. Department of Justice, National Institute of Justice, Crime Mapping Research Center. ICPSR02878-v3. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [producer and distributor], 2008-04-18. https://doi.org/10.3886/ICPSR02878.v3
Computerized crime mapping technology enables law
enforcement agencies to analyze and correlate data sources to create a
detailed snapshot of crime incidents and related factors within a
community or other geographic area. In particular, it enables agencies
to target resources for more effective crime control strategies,
identify likely suspects, evaluate the results of interventions, and
more efficiently and effectively allocate officers to geographic areas
in which they are needed. While interest in this technology within the
law enforcement community appears to be growing, until recently little
data existed on how widely computerized crime mapping was used, in
what capacity, and what influenced an agency's implementation of a
geographic information system (GIS). As a first step in understanding
law enforcement agencies' use and knowledge of crime mapping, the
Crime Mapping Research Center of the National Institute of Justice
conducted a nationwide crime mapping survey to determine which
agencies were using GIS, how they were using it, and, among agencies
that were not using GIS, the reasons for that choice.
The survey instrument was developed by National
Institute of Justice staff, reviewed by practitioners and researchers
with crime mapping knowledge, and cleared by the Office of Management
and Budget. All agencies with 50 or more sworn officers were contacted
by telephone to obtain the name of the crime analyst or other
appropriate person to whom the survey should be addressed. All other
surveys were sent to the attention of the agency chief. The survey
was mailed in March 1997 to a sample of law enforcement agencies in
the United States. Agencies that did not respond to the first mailing
were sent a second survey. Surveys were accepted until May 1, 1998. It
took responding agencies an estimated average of 33 minutes to answer
the survey, including time to review instructions, gather needed data,
and complete and review the collection of information.
Sampling: Stratified random sampling.
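A stratified random sample draws a simple random sample separately within each stratum, rather than from the population as a whole. The sketch below illustrates the idea; the stratification by sworn-officer count and the sampling fractions are illustrative assumptions, not the study's actual design (which is documented in the codebook).

```python
import random

def stratified_sample(agencies, strata_fractions, seed=0):
    """Draw a simple random sample within each stratum.

    agencies: list of dicts, each with a "stratum" key
    strata_fractions: mapping from stratum label to sampling fraction
    """
    rng = random.Random(seed)
    # Group agencies by stratum.
    by_stratum = {}
    for agency in agencies:
        by_stratum.setdefault(agency["stratum"], []).append(agency)
    # Sample independently within each stratum.
    sample = []
    for stratum, members in by_stratum.items():
        n = round(len(members) * strata_fractions[stratum])
        sample.extend(rng.sample(members, n))
    return sample

# Hypothetical population: 100 large agencies, 300 small ones.
agencies = (
    [{"id": i, "stratum": "large"} for i in range(100)]
    + [{"id": i, "stratum": "small"} for i in range(100, 400)]
)
# Census the large stratum; sample one in three of the small stratum.
picked = stratified_sample(agencies, {"large": 1.0, "small": 1 / 3})
```

A design like this lets a survey cover every large agency while keeping the total mailing manageable, at the cost of needing stratum weights for population-level estimates.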
Description of Variables:
Questions asked of all respondents included type of
agency, population of community, number of personnel, types of crimes
for which the agency kept incident-based records, types of crime
analyses conducted, and whether the agency performed computerized
crime mapping. Those agencies that reported using computerized crime
mapping were asked which staff conducted the mapping, types of
training their staff received in mapping, types of software and
computers used, whether the agency used a global positioning system,
types of data geocoded and mapped, types of spatial analyses performed
and how often, use of hot spot analyses, how mapping results were
used, how maps were maintained, whether the department kept an archive
of geocoded data, what external data sources were used, whether the
agency collaborated with other departments, what types of Department
of Justice training would benefit the agency, what problems the agency
had encountered in implementing mapping, and which external sources
had funded crime mapping at the agency. Departments that reported no
use of computerized crime mapping were asked why that was the case,
whether they used electronic crime data, what types of software they
used, and what types of Department of Justice training would benefit
the agency.
Response Rates: The response rate after two mailings was 72 percent.
Presence of Common Scales:
Several Likert-type scales were used.
Extent of Processing: ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of
disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major
statistical software formats as well as standard codebooks to accompany the data. In addition to
these procedures, ICPSR performed the following processing steps for this data collection:
- Performed consistency checks.
- Standardized missing values.
- Performed recodes and/or calculated derived variables.
- Checked for undocumented or out-of-range codes.