Academic institutions face unique information security threats and increasingly frequent and severe breaches. Incidents such as information theft, data tampering, viruses, worms, and terrorist activity constitute significant threats to public safety and national security. Despite the critical information security issues faced by academic institutions, little research has been conducted at the policy, practice, or theoretical levels to address these issues, and few policies and cost-effective controls have been developed. The goals of this project were threefold. The first goal was to create an empirically-based profile of the information security issues and approaches of academic institutions. The second goal was to develop a practical road map for policy and practice at local, state, and national levels to enhance information security in academic institutions. The third goal was to advance the knowledge, policy-making, and practices of individuals and organizations that have an impact on information security in academic institutions.
The study design incorporated three research methods: a quantitative field survey, qualitative one-on-one interviews, and an empirical assessment of the institutions' network activity. The Department of Education's National Center for Education Statistics (NCES) Integrated Postsecondary Education Data System (IPEDS) database provided general contact information for the 600 academic institutions in the sample frame for the survey (Part 1). The 600-institution sample frame was actually composed of two separate sample frames of 300 institutions each that were treated identically; the only difference between the two was a three-month difference in contact dates. Through Internet and telephone research, key information on the specific professional responsible for information security at each of the 600 institutions (telephone number, email address, and title) was identified. Initial contact with potential participants was via a postcard invitation. This double-sided postcard, outlining the study's objectives, conditions of participation, contents, and expected outcomes, was sent to the professional responsible for information security at each institution. Two weeks later, using a standardized telephone script, the principal investigator, strategic development director, or survey telemarketer called each institution to invite the professional responsible for information security to participate in the study. This telephone invitation was followed by an email invitation that reiterated the study's objectives, contents, intended outcomes, and conditions of participation, and included a link to the on-line survey. One month later, the principal investigator, strategic development director, or survey telemarketer placed a follow-up telephone call to each potential participant who had not yet completed the survey.
As with the initial telephone invitation, this call was followed by an email invitation to participate in the study. Three weeks later, the process was repeated for the second follow-up with potential participants who had not yet completed the survey. Three weeks after the second follow-up, the process was repeated for the third follow-up with potential participants who had not yet completed the survey. A thank-you email was sent to all participants once data collection was closed in December 2005.
Recruitment procedures for the qualitative one-on-one interviews (Part 2) involved three steps. First, the principal investigator, strategic development director, or survey telemarketer called the information security professional in each institution to invite him/her to participate in a one-on-one semi-structured interview. This telephone invitation was followed by an email invitation that contained an overview of the study and of the interview process and protocol. If the participant immediately agreed to participate, the interview was scheduled and completed either in person or via telephone. If the participant did not immediately agree to participate, the team member who originally contacted the information security professional called again two weeks later. As with the original invitation, the telephone call was followed by an email invitation to participate in the interview and an overview of the interview process and protocol.
The procedures involved in performing an empirical analysis of network activity (Part 3 and Part 4) included installing the Higher Education Network Analysis (HENA) tool on participants' computers and executing the network analysis data collection, analysis, and reporting activities. Two academic institutions participated in the Network Analysis component of this research study for six months (January 1, 2006 - June 9, 2006).
The Department of Education's National Center for Education Statistics (NCES) Integrated Postsecondary Education Data System (IPEDS) database was the source of data for the quantitative survey's (Part 1) sample frame. Criteria for inclusion in the sample frame included characteristics of both institutions and individuals. Inclusion criteria for institutions were: (1) Title IV status, as designated by the Department of Education, (2) jurisdiction within the United States, (3) degree-granting status, and (4) nonadministrative office status. The criterion for inclusion of individuals within the institutions was role, e.g., information security professional, information technology director, chief information officer, or other professional responsible for information security within the academic institution.
The survey sample frame was obtained through two rounds of systematic random sampling from the Department of Education's NCES IPEDS database to create a sample frame representative of the general population of Title IV, degree-granting academic institutions across the United States. The first sample frame size was 300 academic institutions, based on a targeted sample size of 100 institutions and a predicted response rate of 30 percent. The sampling rate was obtained by dividing the desired sample frame size of 300 institutions by the 4,184 institutions in the database, which produced a fraction of approximately 1/14. The first sample frame was then obtained by selecting one out of every 14 institutions from the NCES IPEDS database of 4,184 institutions, beginning at a randomized starting point to ensure a chance-based selection process. Since the IPEDS list was ordered alphabetically by state and institution name, the database was examined to ensure that a sample frame resulting from one random start would not have a recurring pattern systematically different from sample frames resulting from other starts. No reordering of the database or adjustment of selection intervals was required.
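The selection procedure described above (a random start followed by every 14th institution) can be sketched as follows; the institution list is a hypothetical stand-in for the IPEDS database, not actual IPEDS data:

```python
import random

def systematic_sample(institutions, interval):
    """Pick a random start in [0, interval), then take every
    interval-th institution thereafter (systematic sampling)."""
    start = random.randrange(interval)
    return institutions[start::interval]

# Hypothetical list standing in for the 4,184-institution IPEDS database.
ipeds = [f"Institution {i:04d}" for i in range(4184)]
frame = systematic_sample(ipeds, interval=14)
# Depending on the random start, the frame holds 298 or 299 institutions,
# close to the desired frame size of 300.
```

Because the underlying list is ordered alphabetically by state and institution name, any single random start yields a frame spread evenly across that ordering, which is why the study checked for periodic patterns before accepting the design.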
A second sample frame was obtained from the Department of Education's NCES IPEDS database three months later because the response rate was lower than the predicted 30 percent. The expected response rate was adjusted from 30 percent to 15 percent, so an additional 300 academic institutions were required for the sample frame to achieve the targeted 100 participating institutions. When a randomly selected institution duplicated one already drawn in the first sample frame, that institution was skipped and the next institution in the database was selected. Examination of institutional characteristics across the two frames (e.g., funding, region, total enrollment) yielded no significant differences.
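The duplicate-handling rule for the second draw (skip any institution already selected in the first frame and take the next one in the list) might be sketched as follows; the tiny integer database is purely illustrative:

```python
def second_frame(database, first_frame, interval, start):
    """Systematic draw that skips any institution already selected in the
    first frame, taking the next institution in the list instead."""
    taken = set(first_frame)
    frame, idx = [], start
    while idx < len(database):
        pick = idx
        while pick < len(database) and database[pick] in taken:
            pick += 1  # skip duplicates from the first frame
        if pick < len(database):
            frame.append(database[pick])
        idx += interval
    return frame

# Tiny illustrative database: items 0 and 7 were in the first frame,
# so the draw lands on their immediate successors instead.
demo = second_frame(list(range(20)), first_frame=[0, 7], interval=7, start=0)
# demo == [1, 8, 14]
```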
The Department of Education's NCES IPEDS database also provided the basis for the interview (Part 2) sample frame. This database was selected to ensure that the survey and interview sample frames were comparable. The desired sample frame size was 20 academic institutions, based on a targeted sample size of 15 institutions and a predicted response rate of 80 percent. Inclusion criteria for institutions and individuals for the interview component of this study were the same as for the survey. Institutions were first stratified by funding control and region, then selected from the strata using convenience sampling. This strategy was selected because the researchers wanted to leverage their personal relationships to ensure honest answers to potentially sensitive questions and to recruit institutions with certain characteristics (e.g., rural, minority-serving, military) for the study.
With regard to sampling for the empirical analysis of network activity (Part 3 and Part 4), participants were identified and recruited through a convenience sampling approach. Specifically, institutions that met the criteria for inclusion in the study and with which members of the research team had professional relationships were identified and approached. InfraGard, a not-for-profit public-private sector partnership sponsored by the Federal Bureau of Investigation to protect critical infrastructure, proved especially fruitful for recruiting participants: both institutions that participated in the network analysis were approached through InfraGard contacts. Criteria for inclusion in the network analysis component of this research study were: (1) Title IV status, (2) jurisdiction within the United States, (3) degree-granting status, (4) nonadministrative office status, and (5) ability to provide access to the institution's network over 11 months for tool development and data collection.
Mode of Data Collection:
For Part 1, data were obtained from web-based surveys. For Part 2, data were obtained from face-to-face or telephone interviews. For Part 3 and Part 4, data were obtained from firewall drop logs and intrusion detection/prevention system (IDS/IPS) logs.
Description of Variables:
The Quantitative Field Survey Data (Part 1) contains 19 variables on characteristics of institutions that participated in the survey component of this study, as well as 263 variables derived from responses to the Information Security in Academic Institutions Survey, which was organized into five sections: Environment, Policy, Information Security Controls, Information Security Challenges, and Resources. The first section of the survey, Environment, included 22 variables relating to the number of attacks, the impact of laws and regulations, the results of incidents, and compromises. The second section, Policy, contained 32 variables relating to information security responsibility, policy formality, end user groups, end users' agreement to information security policy, the effectiveness of information security policy, and the consequences of violating policy. The third section, Information Security Controls, included 143 variables divided into the following sub-sections: Operational Practices, Incidents and Disaster Management, Awareness and Training, and Technology. The fourth section of the survey, Information Security Challenges, included 19 variables relating to culture challenges, end user awareness and knowledge challenges, technology challenges, and structure and system challenges. The fifth section, Resources, included 41 variables divided into the following sub-sections: Strategic Inputs, Budget, and Structure and Roles. A total of six variables were also included in an additional section that allowed respondents to provide additional comments or suggestions and their title.
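The per-section counts above can be checked against the stated total of 263 survey-derived variables:

```python
# Variables per survey section, as described above.
sections = {
    "Environment": 22,
    "Policy": 32,
    "Information Security Controls": 143,
    "Information Security Challenges": 19,
    "Resources": 41,
    "Additional comments and title": 6,
}
total = sum(sections.values())
# total == 263, matching the stated number of survey-derived variables.
```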
The Qualitative One-on-One Interview Data (Part 2) contains qualitative responses to a combination of closed-response and open-response formats. The data were divided into the following seven sections: Environment, Institution's Potential Vulnerability, Institution's Potential Threat, Information Value and Sharing, End Users, Countermeasures, and Insights. Section 1, Environment, contained four questions pertaining to attack trends, emerging technologies, and federal regulations. Section 2, Institution's Potential Vulnerability, included four questions relating to the type and level of vulnerability, the difference in vulnerability between institutions, and the institution's vulnerability in maintaining network security in the upcoming one to three years. Section 3, Institution's Potential Threat, included five questions pertaining to threats to individuals, organizations, and critical infrastructure. Section 4, Information Value and Sharing, consisted of seven questions relating to types of information, sharing information, securing information, and vetting procedures. Section 5, End Users, contained six items pertaining to issues and challenges with end users as well as processing new students' and new faculty members' personal computers. Section 6, Countermeasures, contained 25 questions divided into the following sub-sections: Policy, Awareness and Training, Operational Practices, Technology, and Resources. Section 7, Insights, contained six questions regarding the effectiveness of information security, the priority level that is given to information security, the future challenges of information security, and several other general questions relating to the institution's network security.
Data collected through the empirical analysis of network activity (Part 3 and Part 4) include type and protocol of attack, source and destination information, and geographic location. More specifically, firewall drop log data collected from the university's firewall application include date/time stamps, IPs (source and destination), and ports (source and destination). Intrusion Detection/Prevention (IDS/IPS) data collected include date/time stamps, IPs (source and destination), ports (source and destination), and alert messages. External machine logs (attacking and attacked machines) include date/time stamps, IPs (source and destination), and ports (source and destination).
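The firewall drop log fields listed above (date/time stamp, source/destination IP, source/destination port) could be represented and parsed as in the sketch below; the line format shown is purely illustrative and not the output of any particular firewall product:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DropRecord:
    timestamp: datetime
    src_ip: str
    src_port: int
    dst_ip: str
    dst_port: int

def parse_drop_line(line):
    """Parse one hypothetical space-delimited drop-log line of the form
    '2006-01-15T03:42:10 198.51.100.7:4312 -> 203.0.113.9:445'."""
    ts, src, _, dst = line.split()
    src_ip, src_port = src.rsplit(":", 1)
    dst_ip, dst_port = dst.rsplit(":", 1)
    return DropRecord(datetime.fromisoformat(ts), src_ip, int(src_port),
                      dst_ip, int(dst_port))

rec = parse_drop_line("2006-01-15T03:42:10 198.51.100.7:4312 -> 203.0.113.9:445")
```

A structured record of this kind is sufficient to support the attack-type, source/destination, and geolocation analyses described for Parts 3 and 4.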
For Part 1, 72 information security professionals in academic institutions completed the survey, yielding a response rate of 12 percent. For Part 2, 12 professionals participated in the interviews, which was an 80 percent response rate. For Part 3 and Part 4, two institutions provided network activity data, yielding a 100 percent response rate.
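The reported rates follow directly from the counts above (72 of the 600 institutions in the survey sample frame, 12 of the 15 targeted interviewees, and 2 of the 2 network-analysis institutions):

```python
# Response rates implied by the counts reported above.
rates = {
    "survey": 72 / 600,         # 0.12 -> 12 percent
    "interview": 12 / 15,       # 0.80 -> 80 percent
    "network_analysis": 2 / 2,  # 1.00 -> 100 percent
}
```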
Presence of Common Scales:
Several Likert-type scales were used.
Extent of Processing: ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats, as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:
Standardized missing values.
Checked for undocumented or out-of-range codes.