ORI’s extramural program, “Research on Research Integrity” (RRI), funds empirical research that examines societal, organizational, group, and individual factors that affect research integrity. The program also funds conferences that stimulate multi-disciplinary approaches to promoting research integrity and lead to evidence-based research on research integrity. In 2016, ORI awarded five research grants and five conference grants through the RRI program. These projects are already yielding results that increase our understanding of research integrity. Of the ten projects awarded in 2016, four will be presenting their results at the 5th World Conference on Research Integrity (WCRI), which will be held in Amsterdam from May 28-31, 2017.
Below are abstracts published in the 5th WCRI preliminary book of abstracts (http://www.wcri2017.org/images/Abstract-Book-5th-WCRI-2017.pdf).
Detecting data anomalies on the basis of summary statistics
C.H.J. Hartgerink, M.A.L. Van Assen, J.M. Wicherts
Tilburg University, Tilburg, The Netherlands
Objective: Despite initiatives to increase data sharing, raw data underlying research articles are frequently unavailable (some even argue against it). Consequently, statistical methods to detect data anomalies in raw data are conditional on actually retrieving those raw data. We investigate the performance of methods to detect data anomalies due to data fabrication based solely on summary results typically reported in empirical research articles.
Method: We tested the performance of two statistical methods in 36 genuine and 39 fabricated datasets on the anchoring effect (https://osf.io/b24pq). We inspected the variation in variances in independent conditions and anomalous amounts of high p-values in nonsignificant results. Considering how little is known about how researchers fabricate experimental data, we asked actual researchers to fabricate data for experimental studies instead of simulating datasets.
Results: We noticed that, as a group, fabricated nonsignificant effect sizes resembled genuine nonsignificant effects rather well. For fabricated significant effects, fabricators exaggerated effect sizes drastically (13/39 had r>.9; these were also the largest effect sizes across the board). Upon analyzing the datasets individually for data fabrication, we refined the variance of variances method by altering the assumption from one underlying population variance across all groups to condition-specific variances. This greatly improved the performance of this method (AUC = .42 before, .77 after). Detecting data anomalies in nonsignificant results, based on excessive amounts of high p-values, performed barely better than chance (AUC = .52 and .53 for two different nonsignificant effects).
Conclusion: The results of this study on detecting data anomalies indicate researchers might be better at fabricating nonsignificant effects than fabricating significant effects. The variation of variances method seems fairly good, whereas analyzing high p-values performs at chance level. Moreover, large effect sizes (r>.9) seem like an easy first step in detecting data anomalies in research articles.
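The two ingredients of the abstract above — a variance-of-variances statistic and an AUC for separating genuine from fabricated datasets — can be illustrated with a toy sketch. This is not the authors' exact procedure (their refined method uses condition-specific variance assumptions); the function names and the simulated "fabrication" (rescaling groups to an exactly uniform variance) are illustrative assumptions only.

```python
import random
import statistics

def variance_of_variances(groups):
    """Dispersion of the sample variances across independent conditions.
    Genuine samples show natural sampling variation in their group
    variances; fabricated data may show suspiciously little."""
    return statistics.variance([statistics.variance(g) for g in groups])

def auc(genuine_scores, fabricated_scores):
    """Area under the ROC curve via pairwise ranking: the probability
    that a randomly chosen fabricated dataset scores lower than a
    randomly chosen genuine one (0.5 = chance performance)."""
    wins = ties = 0.0
    for f in fabricated_scores:
        for g in genuine_scores:
            if f < g:
                wins += 1
            elif f == g:
                ties += 1
    return (wins + 0.5 * ties) / (len(genuine_scores) * len(fabricated_scores))

def force_unit_variance(group):
    """Toy 'fabrication' (assumption, not the study's data): rescale a
    group so its sample variance is exactly 1, erasing the natural
    spread among condition variances."""
    m, s = statistics.mean(group), statistics.stdev(group)
    return [(x - m) / s for x in group]

random.seed(42)
# 36 genuine datasets, each with 4 conditions of n = 25 (mirroring the
# abstract's 36 vs. 39 split; the data themselves are synthetic).
genuine = [[[random.gauss(0, 1) for _ in range(25)] for _ in range(4)]
           for _ in range(36)]
fabricated = [[force_unit_variance([random.gauss(0, 1) for _ in range(25)])
               for _ in range(4)] for _ in range(39)]

genuine_scores = [variance_of_variances(d) for d in genuine]
fabricated_scores = [variance_of_variances(d) for d in fabricated]
print(round(auc(genuine_scores, fabricated_scores), 2))
```

Because the toy fabrication removes all variation among group variances, the statistic separates the two sets almost perfectly here; real fabricated data are far messier, which is why the study's reported AUCs range from chance (.52) to moderate (.77).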
Perceptions of Research Misconduct among Faculty Members at America's top 100 Universities: Preliminary Results from a Large Survey-Based Project
M.D. Reisig, K. Holtfreter
Arizona State University, Phoenix, U.S.A.
Objective: This study provides a descriptive assessment of the perceived breadth and seriousness of different forms of research misconduct (i.e., data fabrication, plagiarism, authorship fraud, data falsification, publishing fraud, resource mismanagement, and disobeying institutional authority) among faculty members at America’s top 100 research universities. Results from this ongoing large survey-based project also shed light on the perceived consequences and remedies of research misconduct.
Method: The data collection for this study followed a mixed-mode strategy whereby approximately 500 randomly selected individuals were administered online surveys and 500 randomly selected individuals completed mail questionnaires. The data from both samples were pooled to maximize statistical power.
Results: The findings from the study not only reveal information about scientific misconduct (e.g., prevalence, seriousness, and perceived causes) among a representative sample of active researchers, but also provide a comparative assessment of such factors across the social, natural, and applied sciences.
Conclusion: Researchers need to think creatively about ways of gauging the prevalence, seriousness, and causes of scientific misconduct beyond simple self-report studies conducted in specialized fields of study. Samples consisting of active researchers from all scientific disciplines provide much richer analyses from which to guide formal prevention efforts.
RePAIR Consensus Guidelines: Responsibilities of Publishers, Agencies, Institutions and Researchers in Protecting the Integrity of the Research Record
J. Broccardo1, N. Aubert Bonn2
1 Colorado State University, Fort Collins, U.S.A.
2 Hasselt University, Hasselt, Belgium
There are a multitude of barriers to the effective handling of misconduct-related retractions, with confusion regarding the roles and responsibilities of key stakeholders. In order to work through these barriers, we have drafted the RePAIR Consensus Guidelines. This document is the result of discussions that emerged from the 2016 conference Keeping the Pool Clean: Prevention and Handling of Misconduct Related Retractions (funded by U.S. DHHS grant #ORIIR150014 and Colorado State University). A subgroup of attendees met to draft a guidance document outlining the roles and responsibilities of key stakeholders in the retraction process. International input will be sought prior to obtaining signatories through an open comment period on the Committee on Publication Ethics website and elsewhere. The draft output of this session is the RePAIR Consensus Guidelines. This document defines the respective roles and responsibilities of publishers, agencies, institutions, and researchers in protecting the integrity of the research record when questions regarding misconduct arise.
The signatories of this document will support the adoption of these guidelines, with key roles summarized as follows: 1) publishers should post a clear policy on misconduct, investigate and communicate credible allegations, and issue freely available retraction notices explaining the reason for retraction; 2) existing agencies should perform thorough, timely, and impartial oversight and/or investigation of credible allegations according to individual agency policy; 3) institutions should designate a Research Integrity Officer or equivalent, establish confidential and visible channels for reporting and investigating misconduct, and communicate with journals/agencies when misconduct is suspected and when retractions are warranted; and 4) researchers should create an environment conducive to the ethical conduct of research, address possible ethical breaches as appropriate, employ rigorous experimental and analytical methods/analyses, and maintain, review, and share primary data. Barriers to open communication are acknowledged and include privacy laws, lack of regulatory and/or research integrity oversight in some countries/institutions, lack of reporting options or whistleblower protections, and unwillingness/inability to share primary data. Widespread support from key signatories agreeing to adopt these guidelines will promulgate best practices in the handling of misconduct-related retractions. Such actions are critical to maintaining the transparency and integrity of the research record.
Conclusions: Inter-American Encounter on Scientific Honesty
S.G.L. Litewka1, E. Heitman2, R.H. Hall3, J. Linares4, P. Ostrosky4
1 University of Miami, Miami, U.S.A.
2 Vanderbilt University Medical Center, Nashville, TN, U.S.A.
3 Autonomous University of Queretaro, Queretaro, Mexico
4 National Autonomous University of Mexico, Mexico City, Mexico
Under a grant from the U.S. Office of Research Integrity (ORI), we are working with colleagues at the National Autonomous University of Mexico (UNAM) and the Autonomous University of Queretaro (UAQ) to host a conference on the essential academic policies and related infrastructure necessary to support institutional integrity in Mexican universities. Starting with a set of international surveys about institutional infrastructure for research and a review of policy documents and relevant statements from their home institutions and national agencies, a pre-conference working group from several universities across Mexico will identify the strengths and needs of Mexican policy and practice in the following thematic areas: 1) Defining, preventing, and responding to research misconduct; 2) Standards of authorship and responsible publication practices; 3) Conflicts of interest and their management; 4) Data collection, management, ownership, and sharing; 5) Collaborative research and divergent international policies; and 6) Developing a policy and curriculum on research integrity and responsible conduct of research. The conference, scheduled for mid-January 2017 at UNAM, will engage 150+ additional participants to critique and expand upon the working group’s provisional definition, typology, and estimated scope of misconduct relevant to Mexican universities. Drawing on participants’ experience and insights, the conference will propose a framework for new policies, practices, and infrastructure necessary to promote and sustain scientific integrity across Mexican universities and support Mexican researchers in international collaborations. Results: The pre-conference activities and conference will result in a series of conclusions about the state of research integrity in Mexico, a parallel series of recommendations and priorities for institutional action on research integrity, and a draft policy framework applicable to the country’s academic research institutions.
Conclusion: Mexico has a vast number of academic research institutions but limited policy and infrastructure to support research integrity. The conference will shed light on existing perceptions in Mexico about misconduct and prompt new engagement by Mexican universities in the promotion of institutional research integrity.
Best wishes to the grant awardees. I'm sorry to say that I won't see you in Amsterdam.