Cross-Institutional Investigation of Faculty Publishing in the United States, 2021-2022 (ICPSR 39429)

Version Date: Feb 16, 2026

Principal Investigator(s):
Nicole R. Webber, University of Northern Colorado; Stephanie Wiegand, University of Northern Colorado; Jason A. Cohen, University of Miami; John M. Reynolds, University of Miami; Lisa Ancelet, Texas State University; Arlene V. Salazar, Texas State University

https://doi.org/10.3886/ICPSR39429.v1

Version V1


The objectives of this study were to ascertain the knowledge and attitudes of university faculty in the United States concerning journal publication, and specifically, the phenomenon known as "predatory publishing." The research questions that guided the development of the survey were:

  1. How do university faculty determine where to submit journal article manuscripts for publication?
  2. Are university faculty aware of the terminology, tools, and strategies related to predatory publishing and assessing the quality of journals?
  3. How is a faculty member's awareness or attitudes related to their prior experience in academia?
  4. Does a faculty member's knowledge of predatory journals affect which publishers they publish with and how they view the work of other scholars?

This study was exploratory in nature, and the survey instrument developed for this purpose was not formally validated. The investigators designed the survey through a pilot study and subsequently refined and expanded it to investigate a broader population. The resulting survey consists of 47 closed- and open-ended items with 136 total variables. The questions are organized into five sections: demographics, environment/department culture, history/experience, journal criteria, and predatory publishing.

The dataset resulting from this study consists of 1,167 cases and 152 variables. The target population was faculty of any discipline who worked at a university in the United States and who were required to conduct and publish research as part of their position. The survey was sent to approximately 19,400 faculty at 17 doctoral-granting universities between September 2021 and May 2022.

Webber, Nicole R., Wiegand, Stephanie, Cohen, Jason A., Reynolds, John M., Ancelet, Lisa, and Salazar, Arlene V. Cross-Institutional Investigation of Faculty Publishing in the United States, 2021-2022. Inter-university Consortium for Political and Social Research [distributor], 2026-02-16. https://doi.org/10.3886/ICPSR39429.v1


Time Period: 2021 -- 2022
Date of Collection: 2021-09 -- 2022-05

Survey Design:

The data collection instrument was designed as an online survey to be administered through Qualtrics XM Platform survey software. The survey is an expansion of a pilot study conducted by Webber and Wiegand (2022). Though the instrument has not been validated, it was tested by graduate students, faculty researchers, and librarians who provided feedback for refinement.

The survey consists of 47 questions organized into five sections. Most questions are multiple choice, while some are open-ended. The use of matrix-style questions, multiple-select questions, and other multi-part questions accounts for 136 variables. Sixteen additional variables were added by the survey system or during the cleaning and recoding process, for a total of 152 variables in the resulting data set. Generally, all questions appear to all participants, but a small number are displayed conditionally based on the individual's previous responses. The majority of participants took between 11 and 20 minutes to complete the survey.
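The variable totals described above can be verified with simple arithmetic; a minimal sketch (variable names are illustrative, counts taken from the text):

```python
# Counts reported in the study description
survey_variables = 136   # variables generated by the 47 survey questions
added_variables = 16     # added by the survey system or during cleaning/recoding

# Total variables in the resulting data set
total_variables = survey_variables + added_variables
print(total_variables)   # 152
```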

Participant selection:

Participant selection for the data set occurred in two phases. First, the investigators recruited institutions to participate, and then the survey was sent to a comprehensive list of faculty members at each of those institutions.

Institutions:

Participating institutions were selected from the 2018 Carnegie Classification Update (Indiana University Center for Postsecondary Research, 2018) with the goal of including 10 institutions from each of the three levels of Basic Classification for doctoral-granting institutions (R1: Very High Research Activity, R2: High Research Activity, and D/PU: Doctoral/Professional Universities). Participation was limited to institutions classified as four-year, large (student FTE >10,000), and either public or not-for-profit.

Institutions of the investigators, UM (R1) and TXST (R2), were included as participating institutions by default; however, UNC was excluded due to its role as the site of the pilot study.

The selection methodology called for randomly selecting potential institutions from the three lists of qualifying institutions until the target number of institutions was committed to the project. To achieve this, investigators contacted librarians at potential institutions to act as facilitators in obtaining permissions and deploying the survey. Because this stage of the study coincided with the ongoing COVID-19 pandemic, this approach proved severely limited. To compensate, an open invitation was sent through multiple email listservs seeking facilitators at qualifying institutions. The end result was a convenience sample of participating institutions.

Individuals:

Eligible participants were faculty members of any rank whose official workload included at least some portion dedicated to the production of scholarship. The facilitators at each institution aided in recruiting participants using one of three methods aimed at inviting all faculty to participate, depending on the permissions and functionalities available at their institutions: providing lists of email addresses to be entered directly into the survey platform, providing access to distribution lists, or promoting the survey in newsletters and other mass communications to faculty. No incentives were offered to individual participants.

Due to the varying methods of deployment, it is not possible to report an exact number of faculty who were invited to participate in the study. The survey was sent to approximately 19,400 faculty at 17 doctoral universities between September 2021 and May 2022. The survey was available to respondents for 3-9 weeks depending on holidays and each institution's calendar.

Participating institutions were selected through convenience sampling and stratified sampling using three research levels of Carnegie Classifications. All faculty at each selected institution were invited to participate in the survey.

Time Method: Cross-sectional

Universe: University faculty in the United States who publish research.

Unit of Observation: Individuals

The survey consisted of variables organized into five sections:

  1. Demographics - Determine eligibility and identify the individual's faculty role and some factors indicating their experience.
  2. Environment/Department Culture - Expectations, attitudes, and knowledge regarding publishing based on the individual's encounters with colleagues in their department or field.
  3. History/Experience - The individual's publication history, perceptions, and training.
  4. Criteria - Factors the individual employs to assess the quality or validity of a journal article or publication outlet/publisher.
  5. Predatory Publishing - Any reference to predatory publishing or journals is deliberately omitted until the final section of the survey (including invitation and consent language). This section poses questions in direct relation to predatory publishing, including knowledge, training, attitudes, and direct encounters.

The survey was sent to approximately 19,400 faculty members at 17 universities in the United States. A total of 1,498 responses were received, and 331 cases were removed for having no completed questions beyond the initial Demographics section of the survey. All respondents met the inclusion criteria for faculty who were required to publish research. The remaining 1,167 completed cases result in a response rate of 6 percent.
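The case counts and response rate above follow from simple arithmetic; a minimal sketch (variable names are illustrative, figures taken from the text):

```python
# Figures reported in the study description
invited = 19_400   # approximate number of faculty invited
received = 1_498   # total responses received
removed = 331      # cases with nothing completed beyond Demographics

# Retained cases and approximate response rate
retained = received - removed
response_rate_pct = retained / invited * 100

print(retained)                    # 1167
print(round(response_rate_pct))   # 6
```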


Original Release Date: 2026-02-16

2026-02-16 ICPSR data undergo a confidentiality review and are altered when necessary to limit the risk of disclosure. ICPSR also routinely creates ready-to-go data files along with setups in the major statistical software formats as well as standard codebooks to accompany the data. In addition to these procedures, ICPSR performed the following processing steps for this data collection:

  • Checked for undocumented or out-of-range codes.
