Assessing the Validity of Voice Stress Analysis (VSA) Tools in a Jail Setting in Oklahoma City, Oklahoma, 2006 (ICPSR 20625)

Version Date: Jun 23, 2008

Principal Investigator(s):
Kelly R. Damphousse, University of Oklahoma; Laura Pointon, University of Oklahoma; Deidra Upchurch, KayTen Research and Development; Rebecca K. Moore, Oklahoma Department of Mental Health and Substance Abuse Services

https://doi.org/10.3886/ICPSR20625.v1

Version V1


The purpose of the project was to assess the validity of two Voice Stress Analysis (VSA) tools currently on the market: the Layered Voice Analysis (LVA) and the Computer Voice Stress Analyzer (CVSA). The methodology and sampling protocols for this study were derived from the pre-existing methodology and sampling techniques employed in the National Institute of Justice-funded Arrestee Drug Abuse Monitoring (ADAM) program that operated in Oklahoma County from 1998 to 2004. The researchers interviewed arrestees in the Oklahoma County jail about their recent illicit drug use during the months of February and March 2006. The VSA data collected using each of the software systems in this study were sent to certified examiners from CVSA and LVA for their analysis. After the completion of the interview, the subjects were asked to complete the data collection process by supplying urine specimens. Answers from the 319 respondents were compared to the results of a urinalysis test to determine the extent to which they were being deceptive. Then, their "actual deceptiveness" was compared to the extent to which deception was indicated by the VSA programs. The dataset contains (1) demographic information obtained from the official booking records, (2) responses to survey questions about recent drug use, (3) the results of a urinalysis test on five drugs, (4) variables recording "deception" or "no deception" on each of the drugs, and (5) decisions by novice and expert analysts regarding the indication of deception.

Damphousse, Kelly R., Pointon, Laura, Upchurch, Deidra, and Moore, Rebecca K. Assessing the Validity of Voice Stress Analysis (VSA) Tools in a Jail Setting in Oklahoma City, Oklahoma, 2006. Inter-university Consortium for Political and Social Research [distributor], 2008-06-23. https://doi.org/10.3886/ICPSR20625.v1

Funding: United States Department of Justice. Office of Justice Programs. National Institute of Justice (2005-IJ-CX-0047)

Smallest Geographic Unit: ZIP code

Access to these data is restricted. Users interested in obtaining these data must complete a Restricted Data Use Agreement, specify the reasons for the request, and obtain IRB approval or notice of exemption for their research.


2006-02-28 -- 2006-03-24
  1. Users are encouraged to refer to the final report cited in the "Related Literature" section of this study and the studies that are part of the Arrestee Drug Abuse Monitoring (ADAM) Program/Drug Use Forecasting (DUF) Series for more detailed information regarding the study design.


The purpose of the project was to assess the validity of two Voice Stress Analysis (VSA) tools currently on the market: the Layered Voice Analysis (LVA) and the Computer Voice Stress Analyzer (CVSA). The goal of this project was to test the effectiveness of each VSA device in a jail setting by evaluating the ability of the two VSA instruments to detect deceptive answers about recent drug use among an arrestee population.

The methodology and sampling protocols for this study were derived from the pre-existing methodology and sampling techniques employed in the NIJ-funded Arrestee Drug Abuse Monitoring (ADAM) program that operated in Oklahoma County from 1998 to 2004. The data collection team had access to booking data to help select the sample and collect basic data (e.g., demographics and charge information) for each respondent. The original ADAM instruments were used as the basis for the data collection in this study. The process for completing the interviews differed slightly depending on the type of voice stress analysis system being tested. The Computer Voice Stress Analyzer (CVSA) program required that direct questions that resulted in yes or no answers be asked. The Layered Voice Analysis (LVA) program, on the other hand, required that the questions (and the responses) be more open-ended and "conversational." The interviewers were trained in the operation of the CVSA and LVA systems to ensure that the research protocol matched the use protocol of the Voice Stress Analysis (VSA) devices.

The researchers interviewed arrestees in the Oklahoma County jail about their recent illicit drug use during the months of February and March 2006. The voluntary and confidential interviews were conducted only with arrestees who had been in the detention facility for fewer than 48 hours. The researchers collected data using the CVSA program for the first 12 days and then using the LVA program for the second 12 days. In addition to the recordings made by the computers, the interviewers also coded each subject's response to each question on a paper survey.

The VSA data collected using each of the software systems in this study were sent to certified examiners from CVSA and LVA for their analysis. These examiners were referred to as the "expert" examiners in this study. When the project was completed, the interviewers ("novice" examiners) and the expert examiners began to assess the computer output for each arrestee to determine whether deception was indicated (DI) or not (NDI). Both sets of examiners analyzed the output for each subject to determine whether the software had detected that the subject was being deceptive about recent drug use. In addition, the "novice" analysts also entered into the data a prediction of whether the arrestee would test positive for a particular drug.

After the completion of the interview, the subjects were asked to complete the data collection process by supplying urine specimens. Answers from the 319 respondents were compared to the results of a urinalysis test to determine the extent to which they were being deceptive. Then, their "actual deceptiveness" was compared to the extent to which deception was indicated by the VSA programs.
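The validation logic described above can be sketched in a few lines of Python. This is purely illustrative: the function names and inputs below are hypothetical and do not correspond to variables in the released dataset.

```python
# Illustrative sketch only; names are hypothetical, not the dataset's variables.

def actual_deception(said_used: bool, urine_positive: bool) -> bool:
    """An arrestee is treated as deceptive about a drug if he denied recent use
    but the urinalysis for that drug was positive."""
    return (not said_used) and urine_positive

def vsa_agreement(vsa_indicated: bool, deceptive: bool) -> str:
    """Compare the software's deception call against the urinalysis-based label."""
    if vsa_indicated and deceptive:
        return "true positive"    # deception indicated, and the arrestee was deceptive
    if vsa_indicated:
        return "false positive"   # deception indicated, but the arrestee was truthful
    if deceptive:
        return "false negative"   # no deception indicated, but the arrestee was deceptive
    return "true negative"        # no deception indicated, and the arrestee was truthful

# One hypothetical respondent, one drug:
label = actual_deception(said_used=False, urine_positive=True)
print(vsa_agreement(vsa_indicated=False, deceptive=label))  # -> "false negative"
```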

The sample for this study was collected at the Oklahoma County Detention Center in Oklahoma City, Oklahoma during the months of February and March 2006. The voluntary and confidential interviews were conducted only with arrestees who had been in the detention facility for fewer than 48 hours. Using the Arrestee Drug Abuse Monitoring (ADAM) probability-based sampling plan, the total number of men arrested within Oklahoma County during this period (regardless of charge) composed the sampling frame for the VSA study. This method of sampling was devised to allow the selection of arrestees during the time of day with the highest volume of arrests and to allow the random selection of arrestees who were booked during the remaining hours in that 24-hour day.

The sample selected during the high-volume time was referred to as "flow," and the arrestees from the remaining hours of the day were referred to as "stock." These terms were traditionally used because interviews in the ADAM project were conducted during the peak booking time (flow). Thus, flow samples were composed of people who "flowed" into the jail during the eight-hour period in which the research team was collecting data. The "stock" referred to the sample of people who were booked during the remaining 16 hours, when the research team was not collecting data and booking rates were normally low. The sample was drawn proportionately from both stock and flow throughout each data collection period to reflect the distribution of arrests each day. The site had previously been given a target number of interviews it was expected to complete each day, so the researchers used that figure. In Oklahoma City, the sampling plan required the selection of seven "stock" males and a minimum of five "flow" males per day.
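A minimal sketch of how such a daily stock/flow selection might be carried out is shown below, assuming a simple list of booking records. The record layout, the placement of the eight-hour interviewing window, and the number of daily bookings are assumptions for illustration; only the daily targets (seven stock, at least five flow) come from the description above.

```python
import random

# Hypothetical booking records for one 24-hour day: (booking_id, hour_of_booking).
bookings = [(i, random.randrange(24)) for i in range(120)]

# Assume, for illustration, that the peak eight-hour interviewing shift runs 16:00-24:00.
FLOW_HOURS = range(16, 24)

flow_pool  = [b for b in bookings if b[1] in FLOW_HOURS]       # booked while interviewers were on site
stock_pool = [b for b in bookings if b[1] not in FLOW_HOURS]   # booked during the other 16 hours

# "Stock": seven arrestees drawn at random from the off-shift bookings.
stock_sample = random.sample(stock_pool, min(7, len(stock_pool)))

# "Flow": arrestees approached as they are booked during the shift,
# up to at least the daily minimum of five.
flow_sample = sorted(flow_pool, key=lambda b: b[1])[:5]

print(f"{len(stock_sample)} stock + {len(flow_sample)} flow selected for the day")
```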

The original ADAM program participants included both adult males and females. While the researchers originally intended to collect data from both males and females in the Voice Stress Analysis (VSA) study as well, circumstances at the jail precluded this possibility. Because constraints allowed only one interviewer to work at a time, and because about five times as many male arrestees as female arrestees were booked each day, the research team decided to maximize the sample size by collecting data only from males.

For the VSA project, the researchers randomly selected 356 male arrestees into the sample. Of those, 331 (93 percent) agreed to an interview, and 319 (90 percent of the original sample) also agreed to provide a urine sample. The refusal rate was very similar to that of the original ADAM project, except that a greater proportion of the interviewed arrestees (96 percent) agreed to provide a urine sample. The refusal rate did not vary significantly between the two VSA protocols. Indeed, there were no significant differences between the Computer Voice Stress Analyzer (CVSA) and Layered Voice Analysis (LVA) samples in age, the percent who were deceptive about any drug, or the percent who tested positive for any drug.
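As a quick check, the reported rates follow directly from the counts above (a small Python sketch):

```python
selected    = 356   # male arrestees randomly selected into the sample
interviewed = 331   # agreed to an interview
urine       = 319   # also provided a urine specimen

print(f"interview rate:               {interviewed / selected:.0%}")   # ~93%
print(f"overall response rate:        {urine / selected:.0%}")         # ~90%
print(f"urine rate among interviewed: {urine / interviewed:.0%}")      # ~96%
```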

All men arrested within Oklahoma County between February 27 and March 24, 2006.

Unit of Observation: individual

Data were obtained from the following sources:

  • Booking records

  • Computer Voice Stress Analyzer (CVSA) surveys

  • Layered Voice Analysis (LVA) surveys

  • EMIT immunoassay screening tests (drug tests)

  • Expert and novice CVSA & LVA assessments

The dataset contains (1) demographic information obtained from the official booking records, (2) responses to survey questions about recent drug use, (3) the results of a urinalysis test for five drugs (marijuana, cocaine, heroin, methamphetamine, and PCP), (4) variables recording "deception" or "no deception" for each of the drugs, and (5) decisions by novice and expert analysts regarding the indication of deception.

Of the 356 male arrestees in the sample, a total of 319 arrestees agreed to an interview and also agreed to provide a urine sample, thus yielding a final response rate of 90 percent.

none


2008-06-23

2018-02-15 The citation of this study may have changed due to the new version control system that has been implemented. The previous citation was:
  • Damphousse, Kelly R., Laura Pointon, Deidra Upchurch, and Rebecca K. Moore. ASSESSING THE VALIDITY OF VOICE STRESS ANALYSIS (VSA) TOOLS IN A JAIL SETTING IN OKLAHOMA CITY, OKLAHOMA, 2006. ICPSR20625-v1. Norman, OK: University of Oklahoma [producer], 2007. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2008-06-23. http://doi.org/10.3886/ICPSR20625.v1

2008-06-23 ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:

  • Standardized missing values.
  • Checked for undocumented or out-of-range codes.

Notes

  • The public-use data files in this collection are available for access by the general public. Access does not require affiliation with an ICPSR member institution.

  • One or more files in this data collection have special restrictions. Restricted data files are not available for direct download from the website.