Responsible Conduct of Research (RCR) Instruction Delphi Panel Results
Method
Panel 4 convened RCR and research experts to address the following question:
Within RCR instructional programs, what specific topics should be taught and assessed in the core areas of:
1. Publication practices and responsible authorship, and
2. Peer review?
The panel used a version of the Delphi method to achieve consensus. Panelists were asked to complete three successive online questionnaires. All responses were anonymous.
Round 1 consisted of an open-response format. Participants were directed to a website that asked them to list, in corresponding text boxes, at least five specific topics in each of the core areas under consideration. After all participants had completed Round 1, their responses were condensed, reworded, and organized into topics and subtopics to enhance clarity and prevent redundancy.
Round 2 presented participants with the lists of topics they had generated and asked them to evaluate the importance of teaching each topic in an RCR course. Participants rated the importance of teaching each topic or subtopic on a four-point scale (1 = Unimportant, 2 = Less important, 3 = Important, 4 = Very important) and were also asked to comment on the wording or clarity of each item. Topics receiving a rating of “Important” or “Very Important” from at least two-thirds of participants were deemed to meet the consensus criterion; these topics were revised according to the participants’ comments and presented to participants in the next round. Topics not meeting consensus are nevertheless displayed in the tables below, with their corresponding consensus values and mean scores.
Round 3 built on Round 2: participants were again asked to rate the importance of teaching each item and were additionally asked to rate the importance of assessing each item within an RCR course. Assessment ratings used the same four-point scale as the previous round. For each question (one on teaching, one on assessing), each topic receiving a rating of “Important” or “Very Important” from at least two-thirds of participants was deemed to meet the consensus criterion and is labeled with an asterisk in the tables below. The consensus value and the mean score for each topic and subtopic are also shown.
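To make the consensus criterion concrete, the following is a minimal sketch (not part of the study materials) of how the figures reported in the tables below could be computed for a single topic. The ratings used here are hypothetical; only the four-point scale and the two-thirds threshold come from the method described above.

```python
# Illustrative sketch only: computes the percentage of panelists rating an item
# "Important" (3) or "Very important" (4), the mean score, and whether the
# two-thirds consensus criterion is met. Ratings below are made up.
from statistics import mean

CONSENSUS_THRESHOLD = 2 / 3  # "at least two-thirds of participants"

def summarize(ratings):
    """Return (percent rating item 3 or 4, mean rating, consensus flag)."""
    n_important = sum(1 for r in ratings if r >= 3)
    share = n_important / len(ratings)
    return round(100 * share), round(mean(ratings), 2), share >= CONSENSUS_THRESHOLD

# Hypothetical ratings from 11 panelists for a single topic:
pct, avg, consensus = summarize([4, 4, 3, 3, 3, 4, 3, 2, 4, 4, 4])
print(f"{pct}{'*' if consensus else ''} ({avg:.2f})")  # prints "91* (3.45)"
```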
Results: Panel 4
Core Area III: Publication Practices and Responsible Authorship
Topic (Subtopics indented) | Teaching: % of participants rating item as “important” or “very important” (mean score) | Assessing: % of participants rating item as “important” or “very important” (mean score)
---|---|---
1. The significance of authorship | 91* (3.45) | 55 (2.64) |
a. The benefits of publishing | 40 (2.70) | N/A= |
b. The problems of inappropriate authorship for legitimate authors, illegitimate authors, and science | 91* (3.45) | 73* (3.00) |
2. Authorship assignment | 91* (3.36) | 64 (2.73) |
a. Authorship criteria | 91* (3.55) | 64 (2.91) |
i. Substantial intellectual contribution to study or text | 100* (3.64) | 73* (3.27) |
ii. Familiarity with and approval of the final text | 82* (3.36) | 55 (2.91) |
b. Ideal of transparent contributions | 73* (3.00) | 45 (2.45) |
c. Multiple authors: how to determine senior/first author | 82* (3.36) | 55 (2.73) |
d. Appropriateness of discussing authorship at outset of a project | 91* (3.64) | 64 (3.09) |
e. Acknowledgments: purpose and examples (including faculty contributions to students’ work) | 90* (3.40) | 60 (2.90)
f. Variation of standards and norms across disciplines | 82* (3.00) | 45 (2.27) |
3. Inappropriate authorship practices | 73* (3.36) | 55 (3.00) |
a. Ghost authorship | 64 (3.09) | 55 (2.73) |
b. Forced or “courtesy” authorship, e.g., when students are asked to add authors for political reasons | 73* (3.27) | 55 (2.82) |
4. Dealing with controversies that arise in authorship | 82* (3.36) | 55 (2.73) |
5. Scientific responsibilities of authors | 91* (3.73) | 91* (3.36) |
a. Disclosure of funding sources and other sources of potential bias | 100* (3.82) | 82* (3.36) |
b. Specification of any deviations from standard scientific practices | 91* (3.55) | 82* (3.27) |
c. Full and accurate description of methods, procedures and analytic techniques that allows repetition | 91* (3.64) | 82* (3.27) |
d. Citation of relevant literature without bias | 100* (3.55) | 64 (3.00) |
e. Duty to report findings accurately and completely, including reporting critical or negative findings (even if they are contrary to own research agenda) | 100* (3.73) | 82* (3.45) |
6. Poor publication practices | 91* (3.45) | 73* (2.18) |
a. Plagiarism versus proper citation or paraphrasing | 100* (3.73) | 82* (3.45) |
b. Delay in reporting for commercial reasons | 70* (2.80) | 60 (2.60) |
c. Publication bias | 100* (3.36) | 64 (2.82) |
d. Text recycling; overlapping publication; duplicate and salami publication | 100* (3.55) | 64 (2.82) |
e. Quality standards | 91* (3.27) | 64 (2.73) |
7. Protecting privacy in publication | 60 (3.00) | N/A= |
8. Addressing compliance with ethical standards within articles (e.g., mentioning IRB or IACUC approval, and discussing ethically controversial elements of a study) | 100* (3.18) | 55 (2.64) |
9. Responsible disclosure of scientific information within the popular press | 60 (2.60) | N/A= |
Legend:
* = Item achieved “consensus” by receiving a rating of important or very important from at least two-thirds of participants
= = Not applicable: these items were eliminated after Round 2, so the importance of assessing them was not measured
Core Area IV: Peer Review
Topic (Subtopics indented) | Teaching: % of participants rating item as “important” or “very important” (mean score) | Assessing: % of participants rating item as “important” or “very important” (mean score)
---|---|---
1. The significance of peer review | 100* (3.64) | 73 (3.09) |
a. Peer review as a mechanism for quality assurance in publication and funding | 100* (3.18) | 55 (2.64) |
b. The need for reviewers to be competent and genuine peers | 91* (3.36) | 64 (2.82) |
2. Conflicts of Interest and Peer Reviews | 100* (3.73) | 91* (3.36) |
a. Identifying potential conflicts of interest reviewers may have | 100* (3.73) | 82* (3.18)
b. Managing conflicts of interest by excusing oneself from a review or disclosing and managing the conflict with the assistance of those directing the review | 100* (3.82) | 91* (3.27) |
c. Other sources of peer review bias | 82* (3.09) | 55 (2.64) |
3. Qualities of a good review/reviewer | 82* (3.36) | 55 (2.64) |
a. Respecting confidentiality and intellectual property (e.g., by avoiding use of information and destroying manuscripts after review) | 91* (3.27) | 64 (2.91) |
b. Fairness and objectivity | 91* (3.55) | 70* (3.10) |
c. Collegiality—conveying a respectful and professional tone while offering critical feedback | 80* (3.20) | 40 (2.30) |
d. Timeliness | 82* (3.18) | 45 (2.45) |
e. Providing clear, scientifically competent, and complete reviews | 91* (3.27) | 64 (3.00) |
4. Logistics of peer reviewing | 50 (2.40) | N/A= |
a. Format of written review | 30 (2.20) | N/A= |
b. Peer review process | 60 (2.70) | N/A= |
c. Selection of reviewers | 50 (2.60) | N/A= |
5. Responding to reviewers | 82* (3.18) | 60 (2.70) |
a. Responding to competent reviews: the revision and resubmission process | 60 (2.60) | N/A= |
b. Responding to questionable, biased, or conflicted reviews: the roles of authors (PIs), editors, and scientific review chairs | 91* (3.18) | 64 (2.64) |
c. Inappropriate responses to reviewers and modifications to publications or proposals | 60 (2.50) | N/A= |
6. Reviewer roles in ensuring RCR | 82* (2.91) | 36 (2.27) |
7. Editorial responsibilities | 55 (2.73) | 36 (2.27) |
a. Selecting appropriate reviewers | 55 (2.73) | 36 (2.27) |
b. Attending to matters of RCR (proper authorship, disclosure of bias and conflicts, etc.) | 60 (2.70) | N/A=
c. Respecting rights of rebuttal and mediating disputes | 60 (2.60) | N/A= |
d. Maintaining confidentiality | 64 (3.00) | 45 (2.55) |
Recruitment and Panelist Biosketches
Recruitment:
Recruitment began with (1) a literature search to identify authors actively researching and publishing in RCR, (2) a review of ORI Annual Reports from 2000 through 2005 to identify those who received ORI contracts and grants, and (3) a review of recent research administration and RCR conference programs to identify those who had presented on relevant topics. Lastly, certain subgroups, such as current research students, research administrators, and journal editors, were identified for specific panels. From the resulting list of possible participants, the Project Director, in consultation with ORI, selected those who were both qualified to serve on a particular panel and represented diverse backgrounds. For Panel 4, experts with knowledge of and experience in the related core areas were recruited along with a select group of general RCR experts. Recruitment letters were sent to these individuals, asking them to volunteer without compensation for a total of 1.5 hours (30 minutes for each round) over approximately nine months. Those who declined participation but represented a subgroup of interest were asked to recommend another possible participant. Unless otherwise noted, each of the following individuals participated in at least two of the three rounds.
Panelists:
Kathryn Anderson, Ph.D. is a Professor in the Department of Psychology at Our Lady of the Lake University in San Antonio, Texas. Dr. Anderson received her Ph.D. in 1996 from the University of Missouri at Columbia in Social Psychology. Her research focuses on the social psychology of gender, aggression, and personality. She is also an ad hoc reviewer for Journal of Personality and Social Psychology, Personality and Social Psychology Bulletin, and Journal of Applied Social Psychology. In March 2006, she presented a paper titled “‘Sorry, no studies on child sexual abuse’: Coaching exuberant Psychology undergraduate investigators on research ethics” at the conference “Promoting research integrity in the social and behavioral sciences” at the University of Texas at San Antonio.
Alison Antes, B.S. is a Research Assistant at the University of Oklahoma Center for Applied Social Research, where she is pursuing an M.S. in Industrial and Organizational Psychology. She received her B.S. in Psychology from Indiana State University in 2005. She is the coordinator of an ORI/NIH-sponsored project investigating the effectiveness of current approaches to RCR education. At the ORI Research Conference in December 2006, two of her co-authored papers were accepted for presentation: “Ethical Decision-Making in Research: Exploring the Influence of Personality” and “The Development of Ethical Decision-Making: Early Environmental Predictors of Research Integrity.”
Martin Blume, Ph.D. has been the Editor-in-Chief of the American Physical Society since 1997. He received his Ph.D. in 1959 from Harvard University in Physics. Dr. Blume began his career at Brookhaven National Laboratory as an Associate Physicist in 1962, was appointed Deputy Director in 1984, and served there until 1996. His research interests include scientific communication and publication as well as scientific ethics and misconduct. Dr. Blume sat on the Board of Directors for the Association of Learned and Professional Society Publishers (ALPSP) from 2001 to 2003. Dr. Blume also serves on the Advisory Committee on Dialogue on Science, Ethics and Religion for the American Association for the Advancement of Science. (Round 1 only)
Annette Flanagin, R.N., M.A., F.A.A.N., is Managing Deputy Editor for JAMA (Journal of the American Medical Association) and Director of Editorial Operations, JAMA and Archives Journals. She is a graduate of Georgetown University, with a B.S. in nursing and an M.A. in English Literature. Before joining JAMA in 1988, she served as editor of the Journal of Obstetric, Gynecologic, and Neonatal Nursing (JOGNN). She is also Past President of the Council of Science Editors and serves as the coordinator of the International Congresses on Peer Review in Biomedical Publication. She has co-developed a number of guidelines and policies to guide authors, editors, and publishers in scientific publication and is an author of the AMA Manual of Style: A Guide for Authors and Editors. She participates in research, lectures, and publishes on issues related to scientific publication for authors, peer reviewers, and editors.
C. Kristina Gunsalus, J.D. serves as Special Counsel in the Office of University Counsel and Adjunct Professor in the Colleges of Law and Medicine at the University of Illinois at Urbana-Champaign. She received her J.D. from the University of Illinois College of Law. She is a member of the faculty of the Medical Humanities/Social Sciences program and teaches communication, conflict resolution skills, and ethics. During her career, she has served on several committees, including the Committee on Research Integrity of the Association of American Medical Colleges (AAMC), the Government-University-Industry Research Roundtable Ad Hoc Group on Conflict of Interest, and the United States Commission on Research Integrity.
Nalini Jairath, Ph.D. is Dean of the School of Nursing at The Catholic University of America. She received her Ph.D. in 1990 from the Institute of Medical Sciences, University of Toronto, with a cardiovascular specialty. Over her 24 years as a nurse, she has held positions in academia, research, and clinical settings. Dr. Jairath is also an active reviewer, serving as Chair of Scientific Review Panels for the Tri-service Nursing Research Program in Bethesda, Maryland, as well as a reviewer for journals including Nursing Research, Journal of Advanced Nursing, Journal of Cardiopulmonary Nursing, and Dimensions in Critical Care Nursing. In 2002, she received a grant from ORI for her project, Web-Enhanced Curriculum for Responsible Authorship and Publication Practices.
Faith McLellan, Ph.D. has been North American Senior Editor for The Lancet since 2001. She received her Ph.D. in 1997 from the University of Texas Medical Branch in Medical Humanities. In the past, she has served as a medical writer and editor for two academic departments of anesthesiology, as managing editor of Physicians Information and Education Resource, and as editorial director of Praxis Press. She also serves as International Contributing Editor for the journal Literature and Medicine and as Visiting Consultant Editor for the Department of Anesthesiology and General Intensive Care at the University of Vienna. (Round 1 only)
Sara Rockwell, Ph.D. is Professor in the Department of Therapeutic Radiology and Pharmacology and Director of the Office of Scientific Affairs at Yale University School of Medicine. She received her Ph.D. in 1971 from Stanford University in Biophysics. In addition to her active service to the scientific community at Yale on numerous committees, Dr. Rockwell has served as a member of the ORI Panel on Responsible Conduct of Research Subgroup on Scientific Publications since 2006. In 2003, she received a grant from ORI for her project, Ethics of Peer Review: A Guide for Manuscript Reviewers.
Miguel Roig, Ph.D. is Associate Professor of Psychology at St. John’s University. He received his Ph.D. in 1989 from Rutgers University’s Institute for Cognitive Studies. His published and presented research focuses on academic dishonesty and scientific misconduct, specifically as it relates to the problem of plagiarism and related issues of authorship and publication practices. He has served on the IRB since 2000. In 2003, he received a grant from ORI for his project Avoiding Plagiarism, Self-Plagiarism, and Other Questionable Writing Practices: A Guide to Ethical Writing.
Regina Smith, Ph.D. is Associate Vice President for Research in the Office of the Vice President for Research (OVPR) at the University of Georgia. Dr. Smith received her Ph.D. in 1985 from the University of Alabama in Educational Research. In her current position, she oversees the Offices for Sponsored Programs and Human Subjects Compliance. She also serves as Institutional Integrity Officer with respect to misconduct in science and scholarly activities and as the OVPR contact for conflict of interest issues. Dr. Smith has published articles on the topics of research administration, conflicts of interest, and compliance issues and grants administration. Since 2002, she has been a reviewer of Responsible Conduct of Research proposals for DHHS/ORI.
Dan Vasgird, Ph.D. is Director of the Office for Research Compliance at the University of Nebraska-Lincoln. He received his Ph.D. from Syracuse University in Social Psychology. He served as Director of the Office for the Responsible Conduct of Research at Columbia University from 2002 to 2005 and has received an ORI grant for The Development of RCR Internet-based E-seminars on Collaborative Science and Careers. He has also conducted numerous seminars and workshops in the areas of human research protection, research compliance, research ethics, research integrity, and bioethics education, including “Teaching Research Ethics” at the Indiana University Poynter Center and “Ethical Issues in International Health Research” at the Harvard University School of Public Health.
Min Qi Wang, Ph.D. is a Professor in the Department of Public and Community Health at the University of Maryland. He received his Ph.D. in 1987 from the University of Arizona. He sits on the editorial board of the journal Perspectives: Drugs and Society; serves as research editor and chair of the statistical committee for the American Journal of Health Behavior; serves as an associate editor for Perceptual and Motor Skills and Psychological Reports; and reviews for numerous other journals. In 2004, Dr. Wang received a grant from ORI for his project Computer-based Tool for Peer Review: Evaluating Data Analyses. In 2005 he received another grant from ORI, which has been extended until 2007, for his project Peer Review Tool – Sample Size Determination for Experimental Studies.
Madeline Wong, M.S. is a graduate student in Pharmacological and Physiological Science at the Saint Louis University School of Medicine, expecting her Ph.D. in 2008. She received her master’s degree in Biotechnology in 2001 from the University of Malaya, Malaysia. Her prior experience includes work as a medical laboratory technologist in Malaysia and as a Research Officer at the National Cancer Centre in Singapore, as well as a pre-doctoral fellowship from the American Heart Association.
Acknowledgements:
This project was funded by an RCR Resource Development contract from the Office of Research Integrity. James DuBois served as the project director and Jeff Dueker as the research assistant. Kathleen Wyrwich served as a methodology consultant. Courtney Andrews served as a technology consultant and created the online questionnaires.
This report was submitted to the Office of Research Integrity by the project director on July 12, 2007.