Impact of Information Security in Academic Institutions on Public Safety and Security in the United States, 2005-2006 (ICPSR 21188)

Version Date: Aug 22, 2008

Principal Investigator(s):
Steffani A. Burd, Information Security in Academic Institutions

https://doi.org/10.3886/ICPSR21188.v1

Version V1


Despite the critical information security issues faced by academic institutions, little research has been conducted at the policy, practice, or theoretical levels to address these issues, and few policies or cost-effective controls have been developed. The purpose of this research study was three-fold: (1) to create an empirically-based profile of issues and approaches, (2) to develop a practical road map for policy and practice, and (3) to advance the knowledge, policy, and practice of academic institutions, law enforcement, government, and researchers. The study design incorporated three methods of data collection: a quantitative field survey, qualitative one-on-one interviews, and an empirical assessment of the institutions' network activity.

Survey data collection involved simple random sampling of 600 academic institutions from the Department of Education's National Center for Education Statistics (NCES) Integrated Postsecondary Education Data System (IPEDS) database; recruitment via postcard, telephone, and email; Web-based survey administration; and three follow-ups. Results are contained in Part 1, Quantitative Field Survey Data. Interview data collection involved selecting a sample of 15 institutions through a combination of simple random and convenience sampling; recruitment via telephone and email; and face-to-face or telephone interviews. Results are contained in Part 2, Qualitative One-on-One Interview Data. Network analysis data collection involved convenience sampling of two academic institutions; recruitment via telephone and email; installation of the Higher Education Network Analysis (HENA) tool on participants' systems; and six months of data collection. Results are in Part 3, Subject 1 Network Analysis Data, and Part 4, Subject 2 Network Analysis Data.

The Quantitative Field Survey Data (Part 1) contains 19 variables on characteristics of institutions that participated in the survey component of this study, as well as 263 variables derived from responses to the Information Security in Academic Institutions Survey, which was organized into five sections: Environment, Policy, Information Security Controls, Information Security Challenges, and Resources. The Qualitative One-on-One Interview Data (Part 2) contains qualitative responses to a combination of closed-response and open-response formats. The data are divided into the following seven sections: Environment, Institution's Potential Vulnerability, Institution's Potential Threat, Information Value and Sharing, End Users, Countermeasures, and Insights. Data collected through the empirical analysis of network activity (Part 3 and Part 4) include type and protocol of attack, source and destination information, and geographic location.

Burd, Steffani A. Impact of Information Security in Academic Institutions on Public Safety and Security in the United States, 2005-2006. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2008-08-22. https://doi.org/10.3886/ICPSR21188.v1

United States Department of Justice. Office of Justice Programs. National Institute of Justice (2004-IJ-CX-0045)

Department of Education's region classification

A downloadable version of data for this study is available; however, certain identifying information in the downloadable version may have been masked or edited to protect respondent privacy. Additional data not included in the downloadable version are available in a restricted version of this data collection. For more information about the differences between the downloadable data and the restricted data for this study, please refer to the codebook notes section of the PDF codebook. Users interested in obtaining restricted data must complete and sign a Restricted Data Use Agreement, describe the research project and data protection plan, and obtain IRB approval or notice of exemption for their research.


2005-02 -- 2006-06 (Part 1: October 2005 through December 2005; Part 2: February 2005 through April 2005; Parts 3 and 4: January 2006 through June 2006)
  1. The files in Part 3, Subject 1 Network Analysis Data, and Part 4, Subject 2 Network Analysis Data, are each provided as a WinZip archive containing six files, with each file representing one month of network analysis data for that institution. While network analysis data were originally collected on Subject 1 and Subject 2 from January 1, 2006, to June 30, 2006, the June data in both Part 3 and Part 4 are only available through June 9, 2006.


Academic institutions face unique information security threats and increasingly frequent and severe breaches. Incidents such as information theft, data tampering, viruses, worms, and terrorist activity constitute significant threats to public safety and national security. Despite the critical information security issues faced by academic institutions, little research has been conducted at the policy, practice, or theoretical levels to address these issues, and few policies and cost-effective controls have been developed. The goals of this project were three-fold. The first goal was to create an empirically-based profile of the information security issues and approaches of academic institutions. The second goal was to develop a practical road map for policy and practice at local, state, and national levels to enhance information security in academic institutions. The third goal of this study was to advance the knowledge, policy-making, and practices of individuals and organizations that have an impact on information security in academic institutions.

The study design incorporated three research methods: a quantitative field survey, qualitative one-on-one interviews, and an empirical assessment of the institutions' network activity. The Department of Education's National Center for Education Statistics (NCES) Integrated Postsecondary Education Data System (IPEDS) database provided general contact information for the 600 academic institutions in the sample frame for the survey (Part 1). The 600-institution sample frame actually comprised two separate sample frames of 300 institutions that were treated identically; the only difference between the two was a three-month difference in contact dates. Internet and telephone research identified the specific professional responsible for information security at each of the 600 institutions, along with that person's telephone number, email address, and title. Initial contact with potential participants was via a postcard invitation. This double-sided postcard, outlining the study's objectives, conditions of participation, contents, and expected outcomes, was sent to the professional responsible for information security at each institution. Two weeks later, using a standardized telephone script, the principal investigator, strategic development director, or survey telemarketer called each institution to invite the professional responsible for information security to participate in the study. This telephone invitation was followed by an email invitation that reiterated the study's objectives, contents, intended outcomes, and conditions of participation, and included a link to the online survey. One month later, the principal investigator, strategic development director, or survey telemarketer placed a follow-up telephone call to each potential participant who had not yet completed the survey. As with the initial telephone invitation, this call was followed by an email invitation to participate in the study. Three weeks later, the process was repeated for a second follow-up with potential participants who had not yet completed the survey, and three weeks after that for a third follow-up. A thank-you email was sent to all participants once data collection closed in December 2005.

Recruitment procedures for the qualitative one-on-one interviews (Part 2) involved three steps. First, the principal investigator, strategic development director, or survey telemarketer called the information security professional at each institution to invite him or her to participate in a one-on-one semi-structured interview. This telephone invitation was followed by an email invitation that contained an overview of the study and of the interview process and protocol. If the participant immediately agreed, the interview was scheduled and completed either in person or via telephone. If the participant did not immediately agree, the team member who originally contacted the information security professional called again two weeks later. As with the original invitation, this telephone call was followed by an email invitation to participate in the interview, with an overview of the interview process and protocol.

The procedures involved in performing an empirical analysis of network activity (Part 3 and Part 4) included installing the Higher Education Network Analysis (HENA) tool on participants' computers and executing the network analysis data collection, analysis, and reporting activities. Two academic institutions participated in the network analysis component of this research study for six months (January 1, 2006, to June 30, 2006), although the June data in this collection are available only through June 9, 2006.

The Department of Education's National Center for Education Statistics (NCES) Integrated Postsecondary Education Data System (IPEDS) database was the source of data for the quantitative survey's (Part 1) sample frame. Criteria for inclusion in the sample frame included characteristics of both institutions and individuals. Inclusion criteria for institutions were: (1) Title IV status, as designated by the Department of Education, (2) jurisdiction within the United States, (3) degree-granting status, and (4) nonadministrative office status. The criterion for inclusion of individuals within the institutions was role, e.g., information security professional, information technology director, chief information officer, or other professional responsible for information security within the academic institution.

The survey sample frame was obtained through two rounds of simple random probability sampling from the Department of Education's NCES IPEDS database to create a sample frame representative of the general population of Title IV, degree-granting academic institutions across the United States. The first sample frame size was 300 academic institutions, based on a targeted sample size of 100 institutions and a predicted response rate of 30 percent. The sampling rate was obtained by dividing the desired sample frame size of 300 institutions by the 4,184 institutions in the database, which produced a fraction of approximately 1/14. The first sample frame was then obtained by selecting one out of every 14 institutions from the database. A randomized starting point ensured that selection was governed by chance. Since the IPEDS list was ordered alphabetically by state and institution name, the database was examined to ensure that the sample frame resulting from one random start would not have a recurring pattern that might be systematically different from sample frames resulting from other starts. No reordering of the database or adjustment of selection intervals was required.

A second simple random sample frame was obtained from the NCES IPEDS database three months later because the response rate fell short of the predicted 30 percent. The expected response rate was adjusted from 30 percent to 15 percent, so an additional 300 academic institutions were required in the sample frame to achieve the targeted 100 participating institutions. When an institution selected in the second round matched an institution already selected in the first round, it was skipped and the next institution on the list was selected. Examination of institutional characteristics (e.g., funding, region, total enrollment) yielded no significant differences between the two frames.
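
In other words, the two sampling rounds amount to a one-in-14 systematic draw with a random start, with second-round selections that collide with first-round selections skipped in favor of the next institution on the list. Below is a minimal sketch of that logic in Python, assuming the IPEDS records are held in an ordered list (the function name and parameters are illustrative, not part of the study's tooling):

    import random

    def systematic_sample(institutions, interval=14, exclude=None):
        """Draw a 1-in-`interval` systematic sample from an ordered list.

        A random start keeps selection governed by chance; institutions in
        `exclude` (e.g., the first-round frame) are skipped in favor of the
        next institution on the list, mirroring the second-round procedure.
        """
        exclude = set(exclude or [])
        chosen = []
        start = random.randrange(interval)  # randomized starting point
        for i in range(start, len(institutions), interval):
            j = i
            while j < len(institutions) and institutions[j] in exclude:
                j += 1  # skip an already-selected institution, take the next
            if j < len(institutions) and institutions[j] not in chosen:
                chosen.append(institutions[j])
        return chosen

    # First round: roughly 4,184 / 14, or about 300, institutions selected.
    # Second round, three months later: pass the first-round frame as `exclude`.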

The Department of Education's NCES IPEDS database also provided the basis for the interview (Part 2) sample frame; it was used so that the survey and interview sample frames would be comparable. The desired sample frame size was 20 academic institutions, based on a targeted sample size of 15 institutions and a predicted response rate of 80 percent. Inclusion criteria for institutions and individuals were the same as for the survey. Institutions were first stratified (by funding control and region) and then selected from this list using convenience sampling. This strategy was chosen because the researchers wanted to leverage their personal relationships to ensure honest answers to potentially sensitive questions and to recruit institutions with certain characteristics (e.g., rural, minority-serving, military) for the study.

With regard to sampling for the empirical analysis of network activity (Part 3 and Part 4), participants were identified and recruited through a convenience sampling approach. Specifically, institutions that met the criteria for inclusion in the study and with which members of the research team had professional relationships were identified and approached. InfraGard, a not-for-profit public-private partnership sponsored by the Federal Bureau of Investigation to protect critical infrastructure, proved especially fruitful for recruiting participants; both institutions that participated in the network analysis were approached through InfraGard contacts. Criteria for inclusion in the network analysis component of this research study were: (1) Title IV status, (2) jurisdiction within the United States, (3) degree-granting status, (4) nonadministrative office status, and (5) ability to provide access to the institution's network over 11 months for tool development and data collection.

Academic institutions included in the United States Department of Education's National Center for Education Statistics (NCES) Integrated Postsecondary Education Data System (IPEDS) database in 2005.

institutions

For Part 1, data were obtained from Web-based surveys. For Part 2, data were obtained from face-to-face or telephone interviews. For Part 3 and Part 4, data were obtained from firewall drop logs and Intrusion Detection/Prevention System (IDS/IPS) logs.

The Quantitative Field Survey Data (Part 1) contains 19 variables on characteristics of institutions that participated in the survey component of this study, as well as 263 variables derived from responses to the Information Security in Academic Institutions Survey, which was organized into five sections: Environment, Policy, Information Security Controls, Information Security Challenges, and Resources. The first section, Environment, included 22 variables relating to the number of attacks, the impact of laws and regulations, the results of incidents, and compromises. The second section, Policy, contained 32 variables relating to information security responsibility, policy formality, end user groups, end users agreeing to information policy, the effectiveness of information security policy, and the consequences of violating policy. The third section, Information Security Controls, included 143 variables divided into the following sub-sections: Operational Practices, Incidents and Disaster Management, Awareness and Training, and Technology. The fourth section, Information Security Challenges, included 19 variables relating to culture challenges, end user awareness and knowledge challenges, technology challenges, and structure and system challenges. The fifth section, Resources, included 41 variables divided into the following sub-sections: Strategic Inputs, Budget, and Structure and Roles. An additional section contained six variables that allowed respondents to provide comments or suggestions and their title.
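(As a consistency check, the five section counts plus the additional section sum to the 263 survey-derived variables: 22 + 32 + 143 + 19 + 41 + 6 = 263.)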

The Qualitative One-on-One Interview Data (Part 2) contains qualitative responses to a combination of closed-response and open-response formats. The data were divided into the following seven sections: Environment, Institution's Potential Vulnerability, Institution's Potential Threat, Information Value and Sharing, End Users, Countermeasures, and Insights. Section 1, Environment, contained four questions pertaining to attack trends, emerging technologies, and federal regulations. Section 2, Institution's Potential Vulnerability, included four questions relating to the type and level of vulnerability, the difference in vulnerability between institutions, and the institution's vulnerability in maintaining network security in the upcoming one to three years. Section 3, Institution's Potential Threat, included five questions pertaining to threats to individuals, organizations, and critical infrastructure. Section 4, Information Value and Sharing, consisted of seven questions relating to types of information, sharing information, securing information, and vetting procedures. Section 5, End Users, contained six items pertaining to issues and challenges with end users as well as processing new students' and new faculty members' personal computers. Section 6, Countermeasures, contained 25 questions divided into the following sub-sections: Policy, Awareness and Training, Operational Practices, Technology, and Resources. Section 7, Insights, contained six questions regarding the effectiveness of information security, the priority level that is given to information security, the future challenges of information security, and several other general questions relating to the institution's network security.

Data collected through the empirical analysis of network activity (Part 3 and Part 4) include type and protocol of attack, source and destination information, and geographic location. More specifically, firewall drop log data collected from the university's firewall application include date/time stamps, source and destination IPs, and source and destination ports. Intrusion Detection/Prevention System (IDS/IPS) data include date/time stamps, source and destination IPs, source and destination ports, and alert messages. External machine logs (for attacking and attacked machines) include date/time stamps, source and destination IPs, and source and destination ports.
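
For users working with these files, each record reduces to a common shape: a date/time stamp, source and destination IPs, source and destination ports, and, for IDS/IPS records, an alert message. Below is a minimal sketch of such a record type in Python, assuming a simple comma-separated layout (the field order and format here are illustrative assumptions; consult the codebook for the actual file layout):

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class NetworkEvent:
        """One firewall-drop or IDS/IPS record reduced to its common fields."""
        timestamp: datetime
        src_ip: str
        dst_ip: str
        src_port: int
        dst_port: int
        alert: Optional[str] = None  # populated only for IDS/IPS records

    def parse_event(line: str) -> NetworkEvent:
        # Assumed layout: "2006-01-15 03:22:10,10.0.0.5,192.0.2.7,4431,22[,alert]"
        parts = line.strip().split(",", 5)
        return NetworkEvent(
            timestamp=datetime.strptime(parts[0], "%Y-%m-%d %H:%M:%S"),
            src_ip=parts[1],
            dst_ip=parts[2],
            src_port=int(parts[3]),
            dst_port=int(parts[4]),
            alert=parts[5] if len(parts) > 5 else None,
        )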

For Part 1, 72 information security professionals at academic institutions completed the survey, yielding a response rate of 12 percent. For Part 2, 12 professionals participated in the interviews, an 80 percent response rate. For Part 3 and Part 4, two institutions provided network activity data, yielding a 100 percent response rate.
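These figures follow directly from the sampling described above: 72 of the 600 institutions in the survey sample frame is 12 percent; 12 of the 15 targeted interview institutions is 80 percent; and 2 of the 2 recruited network-analysis institutions is 100 percent.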

Several Likert-type scales were used.


2008-08-21

2018-02-15 The citation of this study may have changed due to the new version control system that has been implemented. The previous citation was:
  • Burd, Steffani A. IMPACT OF INFORMATION SECURITY IN ACADEMIC INSTITUTIONS ON PUBLIC SAFETY AND SECURITY IN THE UNITED STATES, 2005-2006. ICPSR21188-v1. New York, NY: Information Security in Academic Institutions [producer], 2005. Ann Arbor, MI: Inter-university Consortium for Political and Social Research [distributor], 2008-08-22. http://doi.org/10.3886/ICPSR21188.v1

2008-08-21 ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:

  • Standardized missing values.
  • Checked for undocumented or out-of-range codes.

Not applicable.


Notes

  • The public-use data files in this collection are available for access by the general public. Access does not require affiliation with an ICPSR member institution.

  • One or more files in this data collection have special restrictions. Restricted data files are not available for direct download from the website; click on the Restricted Data button to learn more.


This dataset is maintained and distributed by the National Archive of Criminal Justice Data (NACJD), the criminal justice archive within ICPSR. NACJD is primarily sponsored by three agencies within the U.S. Department of Justice: the Bureau of Justice Statistics, the National Institute of Justice, and the Office of Juvenile Justice and Delinquency Prevention.