In a recent post I summarized the scholarly literature on retractions, including but not limited to retractions for misconduct. In this post I’ll focus on ecology and evolution specifically. How many papers have ever been retracted from ecology and evolution journals? For what reasons? Are retractions from EEB journals similar to, or different from, retractions from other fields, such as biomedicine?
To answer those questions, I searched the Retraction Watch database for retractions and expressions of concern from a long list of ecology and evolution journals (namely, every journal with “ecology” and/or “evolution” and/or “evolutionary” in the title, plus every other EEB journal I could think of).* The database is large but not comprehensive, so I added in one retraction I know of from an EEB journal that’s not in the database. I included partial retractions for misconduct. They’re classified as “corrections” in the database, but I consider them “retractions”. I ignored four retractions due to publishers inadvertently publishing the same paper twice.
Here’s what I found:
- 56 retractions or expressions of concern from EEB journals. I don’t know how many papers EEB journals have published, but 56 has to be some vanishingly tiny fraction of all EEB papers.
- The oldest now-retracted paper from an EEB journal was published in 1998. At least, the oldest one I know of. That’s Anders Pape Møller’s retraction from Oikos, which isn’t in the database. The oldest EEB retractions in the database are from 2003. So even allowing for the possibility that a few older retractions are missing from the database, retractions in ecology and evolution basically weren’t a thing before I started my PhD in 1995. The timing also coincides with the internet becoming a thing. So clearly, the advent of EEB retractions is either my fault, or the internet’s fault. 🙂 #posthocergopropterhoc
- The time from publication to retraction is getting shorter for EEB journals. EEB papers published in 2010 or earlier that ended up getting retracted (or receiving an expression of concern) took an average of over 4 years to do so. It’s down to about 1.5 years for EEB papers published from 2011 onwards, and down further to an average of about 0.5 years for papers published from 2017 onwards. It’s not just EEB; the time to retraction is dropping in all fields. It’s one big reason why the frequency of retractions is growing. Many journals these days have retraction policies and use them. And these days, the raw data for many papers gets put online, so that if it’s going to be scrutinized it can be scrutinized quickly. So papers that end up getting retracted get retracted much faster these days. Another way to say the same thing: the large majority of retractions from EEB journals have happened within the last few years; it’s just that some of them were retractions of old papers. (UPDATE: as commenter Andrew McAdam notes, the mean time to retraction for papers published in the last couple of years might well increase in future, just because as of this writing it’s impossible for (say) a 2019 paper to have gone >1 year from publication to retraction. Just eyeballing the data, it doesn’t look to me like this constraint fully accounts for the drop in average time from publication to retraction. For instance, the 8 retracted EEB journal papers published in 2014 could have times from publication to retraction of anywhere from 0 to 6 years. But they all went 2 years or less from publication to retraction. It’s a similar story for other recent-ish years: observed times to retraction are bunched up near the low end of the range of mathematically possible values. Further back, that’s not the case; times from publication to retraction are spread more uniformly throughout the range of mathematically possible values.
But the mathematical constraint is worth being aware of. /end update)
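The censoring constraint discussed in the update above can be sketched in a few lines of code. This is purely illustrative arithmetic, not the actual Retraction Watch data; the "as of" year (2020) and the publication years below are assumptions chosen to match the examples in the text:

```python
# Sketch of the right-censoring constraint on time to retraction.
# A paper published in year Y can show a publication-to-retraction lag
# of at most (AS_OF - Y) years, simply because no more time has elapsed.
# AS_OF and the example years are illustrative assumptions, not real data.

AS_OF = 2020  # assumed "as of this writing" year


def max_observable_lag(pub_year, as_of=AS_OF):
    """Longest publication-to-retraction lag that could be observed so far."""
    return as_of - pub_year


for pub_year in (2003, 2010, 2014, 2019):
    print(f"{pub_year}: 0 to {max_observable_lag(pub_year)} years possible")
```

So a 2014 paper could show anywhere from 0 to 6 years from publication to retraction, while a 2019 paper is capped at about 1 year; the point in the update is that recent retractions cluster near the bottom of these ranges even when the ranges are wide.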
- 46% (26 of 56) of retractions or expressions of concern were for some form of author misconduct, or possible/likely misconduct. That breaks down as nine cases of duplicate publication, nine cases of data fabrication, four cases of plagiarism, three cases of possible/likely data fabrication, and one case of fake peer review. That 46% isn’t too far from the percentage of all retractions across all fields that are due to misconduct, once you allow for the small sample size of EEB retractions. So I stand corrected: in an old post, I speculated that retractions for misconduct were especially rare in EEB compared to other fields. That speculation was based on anecdotes. Now that we have more comprehensive data, I don’t see any reason to think that misconduct, or retraction for misconduct, is notably rarer (or notably more common) in EEB than in other fields.
- 36% of retractions or expressions of concern in EEB journals were for unintentional author error in data collection, data analysis, or mathematical derivation. That percentage isn’t too far from the percentage for all retractions from all fields that are due to author error.
- 18% of retractions or expressions of concern in EEB journals were for some other reason, or for an unspecified reason. That includes a couple of disputes over whether all authors had given permission to publish, a case in which authors could not accurately state where their fossils were housed, a dispute over whether the authors had legal permission to collect their samples, a retraction for “copyright reasons”, and several cases for which no reason for retraction was given.
- Retractions for duplicate publication and plagiarism from EEB journals mostly happen only in certain specific contexts. All but one of the duplicate publications were in obscure EEB journals. The remaining one was a case of someone republishing, in an English-language journal, a paper previously published in an obscure non-English-language journal, without first getting the English-language journal’s permission. All but one of the cases of duplicate publication involved authors based in low- or middle-income countries, many of which lack strong research integrity policies, and some of which have, or used to have, policies of rewarding publications directly with cash payments. And all four cases of plagiarism involved authors from low- or middle-income countries. Obviously, the sample sizes here are small, so you wouldn’t want to overgeneralize from them. But these results for EEB journals broadly line up with data from other fields.
- Retractions for unintentional author error from EEB journals mostly only happen in certain specific contexts. Namely, they happen almost exclusively in internationally-leading, high-impact EEB journals. My interpretation of this is not that authors and editors for leading EEB journals are sloppier than others. Rather, just the opposite: leading EEB journals, and the authors who publish in them, care a lot about getting things right. Papers in leading EEB journals probably also get more closely scrutinized by more readers than do papers in obscure or highly specialized journals. So the rare serious errors in papers in leading EEB journals are more likely to be discovered than are serious errors in more obscure journals, and are more likely to lead to retraction when retraction is warranted.
- Most of the twelve cases of fabrication or possible/likely fabrication in EEB journals came from a few repeat offenders. Five were from plant physiologists Harsh Bais and Jorge Vivanco. Bais fabricated data in numerous papers as Vivanco’s postdoc, and Vivanco covered it up after he found out. Two were from a single Russian group. All authors but one asked for the two papers to be retracted after a Russian university investigation found that the remaining author had fabricated data. Two were from Jesus Lemus and Javier Grande, who may have doctored samples and invented a co-author. And it’s possible that at least two of the three remaining cases of fabrication or possible fabrication were from repeat offenders too. Anders Pape Møller only has one retraction for misconduct, but there are reasons to think other papers of his were products of misconduct (see links here). And the outcome of the Jonathan Pruitt case remains to be determined officially, but he is likely to end up with numerous retractions from EEB journals. Some ecologists who’ve looked closely at his data have publicly stated their personal view that he fabricated data (see links here**). Broadly speaking, these results for EEB journals line up with data from other fields. A disproportionately large fraction of all retractions, and an even larger fraction of all retractions for misconduct, come from a very small number of serial offenders.
- These data give no reason for blanket distrust of specific EEB journals. Yes, some EEB journals have more retractions than others. But no EEB journal has more than five. And the EEB journal with the most, Journal of Chemical Ecology, is in the top spot only because it was a favored outlet of serial fraudsters Bais & Vivanco. Three of their retractions were from J Chem Ecol.
*It’s harder to search for retractions of EEB papers from general biology or general science journals, so I didn’t. I’m prepared to put X amount of work into a blog post, where X is >0 but not large. 🙂
**I have my own views on the Pruitt case, which I’m choosing not to share publicly at this time. But I don’t think anyone who’s chosen to state their views publicly should be chided for doing so. There are cogent reasons to publicly express a view (if you have one), and cogent reasons not to.