When a scientist is credibly accused of serious scientific misconduct, one or more institutions (often, but not always, the accused’s employer) launch a formal investigation. How long do those investigations typically take? Here’s some data I compiled.
I did some quick googling for information on institutional investigations into the top 10 researchers on the Retraction Watch leaderboard of most retractions of all time. I also googled for information on a few other cases, chosen either because they’re famous (e.g., Jan Hendrik Schön) or because they’re from ecology and evolution (e.g., Oona Lönnstedt). All the institutional investigations I looked at concluded that the researcher in question had committed scientific misconduct. But they varied in how long they took to reach that conclusion:
Yoshitaka Fujii: An anesthesia journal began investigating Fujii’s work in 2010, I think in collaboration with other journals. The journal hired a statistical consultant as part of the investigation, and released a report on 168 of Fujii’s papers in Mar. 2012. Then the Japanese Society of Anesthesiologists began investigating >200 of Fujii’s papers in Apr. 2012, and reported in June 2012 that most were fabricated. So that’s a ~2-year investigation, followed by a ~3-month investigation.
Joachim Boldt: Boldt was suspended from a German hospital on 10 Nov. 2010. The hospital investigation concluded in Aug. 2012. A 1-year, 9-month investigation.
Yoshihiro Sato: 4 institutional investigations began in 2017; 3 were complete by March 2018. I couldn’t find information on how long the 4th investigation took. 3 investigations of, oh, approximately 1 year each.
Jun Iwamoto: see Yoshihiro Sato above. Iwamoto was Sato’s collaborator, so I didn’t double-count those investigations.
Ali Nazari: A whistleblower alerted Swinburne University to anomalies in many of Nazari’s papers in 2017. Swinburne completed its investigation by late Oct. 2019. A ~2-year investigation.
Diederik Stapel: Tilburg University suspended him for data fabrication in Sept. 2011 and announced an investigation. The investigating committee released an interim report the following month; the final report came out in Nov. 2012. 1 month (interim report), ~14 months (final report).
Yuhji Saitoh: see Yoshitaka Fujii above. Saitoh was a frequent co-author of Fujii’s, so I didn’t double-count those investigations.
Adrian Maxim: Couldn’t find any information.
Peter Chen: An investigation by ETH Zurich began sometime around March 2009 (no earlier). The investigative report was completed on 15 July 2009. ~4 months
Scott Reuben: A routine audit at Baystate Medical Center turned up irregularities in May 2008. In Mar. 2009, Baystate announced that Reuben had admitted faking data. 10 months.
Jan Hendrik Schön: The Bell Labs investigation began in May 2002. The report was released Sept. 25, 2002. ~4 months.
Fazlul Sarkar: This one’s a special case. Sarkar resigned from Wayne State in May 2014 to take up a position at Mississippi. Suspicions had already been raised about his work on PubPeer, but as best I can tell Wayne State hadn’t started formally investigating at that point (UPDATE: I seem to be wrong about whether Wayne State had started investigating by then; see the comments). On June 7, 2014, Mississippi notified Sarkar that it had received allegations of misconduct from PubPeer. Later that month, Mississippi notified Sarkar that the job offer was rescinded. Sarkar subsequently attempted to unresign from Wayne State. Wayne State refused to give him his old position back, and instead reappointed him only to a 1-year position. So I guess you could say that Mississippi completed an investigation in less than a month. But really, it’s probably better to say that there wasn’t any formal investigation comparable to the others on this list, at least not in May–June 2014.
Oona Lönnstedt: Sweden’s Central Ethical Review Board began investigating Lönnstedt (and Peter Eklöv) for data fabrication in a Science paper sometime in 2016, and finished in Apr. 2017. Subsequently, James Cook University began investigating Lönnstedt in late 2019, and publicly announced the results on Aug. 13, 2020. A ~1-year investigation, followed by a ~9-month investigation.
Anders Pape Møller: The Danish Committees on Scientific Dishonesty began investigating after receiving a complaint on 29 Mar. 2001. On 25 Sept. 2002 they released their report. A 1-year, 6-month investigation.
So if you’re scoring at home: the mean and median times for these 14 investigations are both ~1 year. That’s true whether you count the time to the interim report on Stapel or the time to the final report. The range is 3 months to 2 years (counting the final report on Stapel rather than the interim report).
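If you want to check that arithmetic, here’s a minimal sketch in Python. The durations (in months) are my own rough approximations read off the list above, and the labels are just mine for bookkeeping:

```python
# Rough investigation durations in months, read off the list above.
# These are approximations, not precise figures.
from statistics import mean, median

durations_months = {
    "Fujii (journal)": 24,
    "Fujii (JSA)": 3,
    "Boldt": 21,
    "Sato #1": 12,
    "Sato #2": 12,
    "Sato #3": 12,
    "Nazari": 24,
    "Stapel (final report)": 14,  # use 1 to count only the interim report
    "Chen": 4,
    "Reuben": 10,
    "Schön": 4,
    "Lönnstedt (review board)": 12,
    "Lönnstedt (JCU)": 9,
    "Møller": 18,
}

vals = list(durations_months.values())
print(f"n = {len(vals)}")                         # 14
print(f"mean:   {mean(vals):.1f} months")         # ~12.8
print(f"median: {median(vals):.1f} months")       # 12.0
print(f"range:  {min(vals)}-{max(vals)} months")  # 3-24
```

Swapping in the 1-month interim figure for Stapel drops the mean to ~11.9 months and leaves the median at 12, so the ~1 year headline number holds either way.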
Note that two of the investigations that were shorter than 1 year were preceded by longer investigations into the same individual. That might be a blip, or it might be a sign that the completion of one investigation accelerates subsequent investigations by other institutions.
No doubt there are many factors that might explain variation in investigation length. These investigations were conducted by different institutions in different countries. They were initially prompted by different things. The investigations covered widely varying numbers of papers and other scientific works. Etc.
I think that these data provide some useful context for the ongoing McMaster University investigation into Jonathan Pruitt. That investigation has been running for approximately 1 year at this point, so it’s still well within the range of timespans over which broadly similar investigations have been completed in the past. Of course, that doesn’t tell you whether the McMaster investigation is proceeding fast “enough”; that’s a different conversation. But it does suggest that the factors determining the length of the McMaster investigation mostly aren’t specific to that investigation. So if you want these sorts of investigations to be shorter, you should probably focus most of your attention on systemic factors that affect investigation length.