Responsible Conduct of Research (RCR) Instruction Delphi Panel Results
Method
Panel 2 convened RCR and research experts to address the following question:
Within RCR instructional programs, what specific topics should be taught and assessed in the areas of:
1. Data acquisition, management, sharing and ownership; and
2. Research misconduct?
The panel used a version of the Delphi method to achieve consensus. Panelists were asked to complete three successive online questionnaires. All responses were anonymous.
Round 1 used an open-response format. Participants were directed to a website that asked them to list, in corresponding text boxes, at least five specific topics in each of the core areas under consideration. After all participants had completed Round 1, their responses were condensed, reworded, and organized into topics and subtopics to enhance clarity and eliminate redundancy.
Round 2 presented participants with the lists of topics they had generated and asked them to rate the importance of teaching each topic or subtopic in an RCR course on a four-point scale (1 = Unimportant, 2 = Less important, 3 = Important, 4 = Very important). Participants were also invited to comment on the wording or clarity of each item. Topics receiving a rating of “Important” or “Very important” from at least two-thirds of participants were deemed to meet the consensus criterion; these topics were revised according to participants’ comments and carried forward to the next round. Topics that did not meet consensus are nevertheless displayed in the tables below, with their corresponding consensus values and mean scores.
Round 3 asked participants to re-rate the importance of teaching each item and, in addition, to rate the importance of assessing each item within an RCR course. Assessment ratings used the same four-point scale as the previous round. For each question (one on teaching, one on assessing), a topic receiving a rating of “Important” or “Very important” from at least two-thirds of participants was deemed to meet the consensus criterion and is marked with an asterisk in the tables below. The consensus value and mean score for each topic and subtopic are also shown.
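As an illustration only, the short sketch below shows how an item’s consensus value and mean score, as reported in the tables that follow, can be derived from a set of four-point ratings. This is not the panel’s own analysis tooling, and the ratings in the example are hypothetical.

```python
# Minimal sketch, assuming ratings on the panel's four-point scale:
# 1 = Unimportant, 2 = Less important, 3 = Important, 4 = Very important.
# Hypothetical helper; not the analysis code actually used for this report.

def summarize_item(ratings, threshold=2 / 3):
    """Return the percentage rating the item 3 or 4, the mean score,
    and whether the two-thirds consensus criterion is met."""
    n = len(ratings)
    important = sum(1 for r in ratings if r >= 3)  # "Important" or "Very important"
    percent = 100 * important / n
    mean_score = sum(ratings) / n
    return percent, mean_score, (important / n) >= threshold

# Hypothetical ratings from 12 panelists for a single topic.
ratings = [4, 4, 3, 3, 4, 3, 3, 4, 2, 4, 3, 3]
percent, mean_score, consensus = summarize_item(ratings)
print(f"{percent:.0f}{'*' if consensus else ''} ({mean_score:.2f})")  # prints "92* (3.33)"
```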
Results: Panel 2
Core Area I. Data Acquisition, Management, Sharing and Ownership
Topic (subtopics indented) | Teaching: percentage of participants rating item as “important” or “very important” (mean score) | Assessing: percentage of participants rating item as “important” or “very important” (mean score) |
---|---|---|
1. Ethical values behind the scientific standards for data acquisition, management, sharing, and ownership | 92* (3.58) | 75* (2.83) |
a. Confidentiality and privacy | 100* (3.67) | 92* (3.08) |
b. Trustworthiness, honesty, and transparency | 100* (3.75) | 67* (2.92) |
c. Right to property or to prosper from work | 58 (2.67) | N/A= |
d. Scientific collegiality and virtue of sharing | 100* (3.50) | 67* (2.75) |
e. Value of having regulations and standards | 75* (3.25) | 58 (2.75) |
2. Variations in lab practices—legitimate and illegitimate variations | 92* (3.42) | 58 (2.83) |
3. Data acquisition issues | 100* (3.82) | 82* (3.27) |
a. Informed consent or permission to gather or use data | 100* (3.83) | 83* (3.42) |
b. Sampling and data selection | 100* (3.75) | 83* (3.33) |
c. Verifying and cleaning data | 100* (3.67) | 75* (3.17) |
4. Data storage, protection, and archiving | 92* (3.50) | 67* (2.92) |
a. Techniques for entering, storing, and archiving data | 64 (2.82) | N/A= |
b. Data storage longevity (how long to save data and in what format) | 83* (3.17) | 58 (2.67) |
c. Data protection and backup | 92* (3.25) | 67* (2.83) |
d. Unique issues pertaining to special kinds of data, such as tissue, DNA, and photographic data | 92* (3.33) | 50 (2.83) |
5. Data Sharing | 100* (3.50) | 67* (2.92) |
a. How and when data should be shared, advantages and disadvantages | 100* (3.50) | 75* (2.83) |
b. Transferring data | 64 (2.55) | N/A= |
c. Acceptable and unacceptable uses for shared data | 100* (3.45) | 82* (3.00) |
6. Legal aspects of data ownership and rights | 92* (3.58) | 83* (3.25) |
a. Ownership of data, patents, copyrights, and intellectual property | 83* (3.50) | 83* (3.08) |
b. Institutional versus research rights to own and use data | 92* (3.50) | 75* (3.08) |
c. Commercially useful data | 100* (3.58) | 75* (3.17) |
d. Negotiating contracts | 33 (2.50) | N/A= |
7. Data privacy | 100* (3.50) | 67* (3.00) |
a. HIPAA and other privacy rules | 67* (3.50) | 58 (2.83) |
b. HIPAA and other privacy standards | 55 (2.91) | 50 (2.60) |
c. Confidentiality protection techniques | 100* (3.42) | 75* (3.00) |
8. Scientific methodology issues, including research design, objectivity, and bias | 92* (3.67) | 92* (3.33) |
a. Importance of research design | 100* (3.75) | 100* (3.50) |
b. Elements of good scientific design and methodology | 100* (3.75) | 100* (3.42) |
c. Proper use versus abuse of statistics | 100* (3.75) | 100* (3.45) |
d. Challenges to maintaining objectivity in designing research questions, controlling bias | 92* (3.58) | 92* (3.25) |
9. Data reporting | 100* (3.75) | 83* (3.17) |
a. Ethical issues when reporting data in publications | 92* (3.67) | 75* (3.08) |
b. Responsibility to interpret findings appropriately to diverse audience, scientific and otherwise | 100* (3.58) | 75* (2.83) |
10. Special issues related to scientific roles | 82* (3.18) | 64 (2.73) |
a. Obligations of students to supervise their own data collection efforts | 64 (2.91) | N/A= |
b. Roles and relationships among team members | 92* (3.25) | 67* (2.58) |
c. Who has the authority to make data related decisions | 92* (3.25) | 55 (2.55) |
Legend:
* = Item achieved consensus by receiving a rating of “important” or “very important” from at least two-thirds of participants
= = Not applicable; these items were eliminated after Round 2, so the importance of assessing them was not measured
Core Area VIII. Research Misconduct
Topic (subtopics indented) | Teaching: percentage of participants rating item as “important” or “very important” (mean score) | Assessing: percentage of participants rating item as “important” or “very important” (mean score) |
---|---|---|
1. Significance of misconduct | 100* (4.00) | 100* (3.75) |
a. History of scientific misconduct | 82* (3.00) | 42 (2.17) |
b. Incidence rate of misconduct | 58 (2.58) | N/A= |
c. Consequences of misconduct for individuals, laboratories, science, and society | 100* (3.64) | 67* (3.00) |
2. Factors that contribute to scientific misconduct | 100* (3.73) | 75* (3.25) |
a. Effects of laboratory environment | 100* (3.64) | 75* (3.08) |
b. Reward systems in academic and industry settings | 100* (3.45) | 67* (2.83) |
3. Plagiarism | 100* (3.91) | 83* (3.33) |
a. Definition and examples | 100* (3.73) | 92* (3.25) |
b. Case studies with outcomes and punishments | 83* (3.18) | 58 (2.67) |
4. Falsification | 100* (4.00) | 92* (3.50) |
a. Definition and examples | 100* (3.82) | 92* (3.25) |
b. Case studies with outcomes and punishments | 100* (3.60) | 73* (3.00) |
5. Fabrication | 100* (4.00) | 92* (3.50) |
a. Definition and examples | 100* (3.82) | 91* (3.18) |
b. Case studies with outcomes and punishments | 100* (3.55) | 75* (3.00) |
6. Other serious deviations from scientific best practices | 80* (3.22) | 60 (2.70) |
a. Sabotage | 58 (3.00) | N/A= |
b. Questionable research practices (e.g., data manipulation) | 100* (3.55) | 75* (3.00) |
c. Unintentional deviations | 100* (3.45) | 67* (2.92) |
7. Regulations and policies addressing misconduct | 82* (3.40) | 82* (3.18) |
a. The Office of Research Integrity’s role in addressing misconduct | 92* (3.18) | 50 (2.58) |
b. Institutional policies | 92* (3.36) | 67* (2.92) |
8. Responding to observed misconduct | 100* (3.91) | 92* (3.42) |
a. Evidence needed to report misconduct | 100* (3.73) | 64 (3.00) |
b. Whistle blowing, including responsibilities and protections for whistle blowers | 100* (3.82) | 92* (3.25) |
c. Alternatives to whistle blowing with illustrations of good and bad responses | 92* (3.45) | 75* (2.92) |
9. Studying taboo, controversial, or politically sensitive research topics | 83* (3.09) | 50 (2.50) |
Legend:
* = Item achieved consensus by receiving a rating of “important” or “very important” from at least two-thirds of participants
= = Not applicable; these items were eliminated after Round 2, so the importance of assessing them was not measured
Recruitment and Panelist Biosketches
Recruitment:
Recruitment began with (1) a literature search to identify authors actively researching and publishing in RCR, (2) a review of ORI Annual Reports from 2000 through 2005 to identify recipients of ORI contracts and grants, and (3) a review of recent research administration and RCR conference programs to identify those who had presented on relevant topics. Lastly, certain subgroups, such as current research students, research administrators, and journal editors, were identified for specific panels. From the resulting list of possible participants, the Project Director, in consultation with ORI, selected those who were both qualified to serve on a particular panel and represented diverse backgrounds. For Panel 2, experts with knowledge of and experience in the related core areas were recruited along with a select group of general RCR experts. Recruitment letters were sent to these individuals, asking them to volunteer without compensation for a total of 1.5 hours (30 minutes per round) over approximately nine months. Those who declined to participate but represented a subgroup of interest were asked to recommend another possible participant. The consenting participants who completed at least two of the three rounds are listed below.
Panelists:
Tom Champney, Ph.D. is Deputy Chair of the Department of Anatomical Sciences and Adjunct Professor in the Department of Bioethics at St. George’s University in Grenada, West Indies. He received his Ph.D. in Biomedical Research in 1984 from the University of Texas Health Science Center at San Antonio. A dedicated histology teacher, Dr. Champney also currently teaches the course “Scientific Ethics: Responsible Conduct of Research” at St. George’s University. He has served as a proposal reviewer for the Office of Research Integrity (ORI) since 2005.
Allen Goldman, Ph.D. is Professor and Head of the School of Physics and Astronomy at the University of Minnesota, as well as a Distinguished Professor at the Institute of Technology. Dr. Goldman received his Ph.D. in Physics from Stanford University in 1965. His research involves superconductivity, primarily in thin-film configurations, with emphasis on the effects of disorder and dimensional constraints.
Elizabeth Heitman, Ph.D. is Associate Professor of Medical Ethics in the Center for Biomedical Ethics and Society at Vanderbilt University School of Medicine. She holds secondary appointments in the Department of Religious Studies and the Center for Medicine, Health and Society in the College of Arts and Science, where she teaches Responsible Conduct of Research. She received her Ph.D. in Religious Studies in 1988 as the first graduate of the inter-institutional program in clinical ethics offered by Rice University and the University of Texas Medical School at Houston. Her primary research addresses the evaluation of education in the responsible conduct of research and the cultural awareness and professional socialization of students and researchers. She is the principal investigator of an NSF-sponsored study of international science graduate students’ experience of US standards of practice in ethical research.
Michael Kalichman, Ph.D. currently holds several titles including: President of the Responsible Conduct of Research Consortium, Co-Director of the San Diego Center for Ethics in Science and Technology, and Director of the Research Ethics Program at the University of California at San Diego School of Medicine and Office of Graduate Studies and Research. Dr. Kalichman received his Ph.D. in Pharmacology from the University of Toronto in 1980. In the past 15 years, Dr. Kalichman has dedicated himself to RCR, believing that working scientists should set the agendas of RCR training. He has worked with numerous institutions to assess the effectiveness of teaching RCR, and continues to hold local and national RCR leadership positions.
Murali Krishnamurthi, Ph.D. is Director of the Faculty Development and Instructional Design Center and Associate Professor in the Department of Industrial Engineering at Northern Illinois University. He received his Ph.D. in Industrial Engineering from Texas A&M University in 1988. Dr. Krishnamurthi has received several ORI grants for his projects, including Promoting Responsible Peer Review and Publishing Through Interactive eLearning Experience and Responsible Conduct in Data Management in 2005, Active Learning Online on Responsible Mentoring and Collaboration in 2004, and Online Decision Instruction on Data Integrity in 2003, resulting in many online RCR educational modules.
Ross McKinney, M.D. serves as the Vice-Dean of Research at Duke University Medical School. He received his M.D. in 1979 from the University of Rochester Medical School and completed his residency in Pediatrics at Duke University Medical School, specializing in Infectious Diseases. Dr. McKinney’s research focuses on the antiretroviral treatment of HIV-infected children. In addition to his position at Duke University, he serves as a member of the editorial board of the journal Pediatrics and as a member of the steering committee for the Association of American Medical Colleges’ (AAMC) Forum on Conflict of Interest in Academic Societies and the Group on Research and Development (GRAND).
Perry Molinoff, M.D. is a former chair of the Pharmacology Department at the University of Pennsylvania School of Medicine and, through 2006, was Vice Provost for Research at the University of Pennsylvania. Dr. Molinoff received his M.D. from Harvard Medical School in 1967. He is widely published in pharmacology and neuroscience and has dealt directly with issues related to the conduct of research, including human research and the conduct of clinical trials. He has also served on several NIH committees and editorial boards, worked in the pharmaceutical industry, taught classes, and trained Ph.D. and postdoctoral students.
Trent Moreland, M.S. is a student in the Ph.D. program at Saint Louis University. He anticipates receiving his Ph.D. in 2008 in Pharmacological and Physiological Science. He has also received his M.S. in Biology, concentrating in cellular and molecular regulation, from Saint Louis University in 1997. Mr. Moreland worked as a scientist at Pfizer for six years prior to beginning his doctoral studies at Saint Louis University.
David Resnik, Ph.D., J.D. is a bioethicist at the National Institute of Environmental Health Sciences, National Institutes of Health. He received his Ph.D. in Philosophy from the University of North Carolina, Chapel Hill in 1990 and his J.D. from Concord University School of Law in 2003. Dr. Resnik has received grants from the National Institutes of Health and the National Science Foundation for projects relating to Responsible Conduct of Research in Scientific Record-Keeping and Data Management. He also serves on several NIEHS/NIH committees, including the Institutional Review Board for Human Subjects Research (as Vice-Chair), the Research Ethics Committee, the Trans-NIH Bioethics Committee, and the NIH Committee on the Conduct of Science. He has been associate editor of Accountability in Research since 2001.
Adil Shamoo, Ph.D. is the founder and Editor-in-Chief of the journal Accountability in Research. He is currently professor and former chairman of the Department of Biochemistry and Molecular Biology, professor of Epidemiology and Preventive Medicine, and a member of the graduate faculty of Applied Professional Ethics, affiliated with the Center for Biomedical Ethics, at the University of Maryland in Baltimore. Dr. Shamoo received his Ph.D. in 1970 in Biophysics from the City University of New York, and is a certified IRB professional (CIP). Since 1994, Dr. Shamoo has been teaching a graduate course on Responsible Conduct of Research. In 2003, he co-authored a textbook with Dr. David Resnik entitled Responsible Conduct of Research, and has numerous other publications within the field of RCR.
Joan E. Sieber, Ph.D. is Professor Emerita of Psychology at California State University, Senior Research Associate at Simmons College, and a Fellow of the American Psychological Association. During the last 25 years, Dr. Sieber has specialized in empirical research on questions of scientific ethics, culturally sensitive methods of research and intervention, data sharing methodology, and whistle-blowing or alternatives to it. She serves as a site visitor to IRBs seeking accreditation, sat on the Accreditation Council of the Association for the Accreditation of Human Research Protection Programs (AAHRPP), and serves on various editorial boards and review panels. She is Editor-in-Chief of a new international peer-reviewed nonprofit educational journal, the Journal of Empirical Research on Human Research Ethics (JERHRE).
Nicholas H. Steneck, Ph.D. is Professor Emeritus of History in the College of Literature, Science, and the Arts at the University of Michigan and a consultant at the Office of Research Integrity. He joined the University of Michigan faculty after graduate studies in history and the history of science at the University of Wisconsin. He chaired the University of Michigan’s pioneering Task Force on Integrity in Scholarship in 1984 and went on to chair the Public Health Service Advisory Committee on Research Integrity from 1991 to 1993. He has published articles on the history of research misconduct policy, the use of animals in research, classified research and academic freedom, and the role of values in university research. Most recently, he authored the ORI Introduction to the Responsible Conduct of Research (2004).
Sandra Titus, Ph.D. currently serves as the Director of the Intramural Research Program at the Office of Research Integrity, DHHS. She received her Ph.D. in Family Social Science from the University of Minnesota in 1978. At ORI, Dr. Titus establishes the research agenda, hires contractors, and implements and reports on study results. She currently has five projects under way: Incidence of Research Misconduct; Training a PhD: Roles of Advisors, Mentors and Institutions; Research Faculty Awareness of Misconduct Regulations and Perceptions of Institutional Integrity in Handling Cases; Research Integrity Officers Role; and Where was the Mentor: Cases of Graduate Student and Post Doc ORI Misconduct Cases. She also organizes institutional conferences sponsored by ORI and often speaks on research integrity, research misconduct, and mentoring.
Acknowledgements:
This project was funded by an RCR Resource Development contract from the Office of Research Integrity. James DuBois served as the project director and Jeff Dueker as the research assistant. Kathleen Wyrwich served as a methodology consultant. Courtney Andrews served as a technology consultant and created the online questionnaires.
This report was submitted to the Office of Research Integrity by the project director on July 12, 2007.