I continue to keep a close eye on developments in the various ongoing, high-profile cases of apparent data fabrication in ecology. Retraction Watch has the news that Science has issued an Expression of Concern for Dixson et al. 2014. This was a high-profile paper claiming to demonstrate strong deleterious effects of ocean acidification on fish behavior. The EoC is being issued in part thanks to Science’s own investigative reporting, which helped uncover evidence of apparent data fabrication.
Relatedly, Jeff Clements and colleagues just published a major new meta-analysis of the effects of ocean acidification on fish behavior, revealing an absolutely massive decline effect. That is, early studies reported big effects, but subsequent studies have found basically squat. Further, those early studies reporting big effects are all by the same lab group, of which Danielle Dixson was a member. Drop the studies from that one lab group, and you’re left with studies that mostly report small or zero effects. Speaking as someone who just co-authored a paper that looks systematically for decline effects in 466 ecological meta-analyses, and mostly fails to find them (Costello & Fox, in press at Ecology), I can tell you that the decline effect in Clements et al. is enormous. I couldn’t find anything close to a comparable decline effect anywhere else in ecology. Nor do any of the other, weaker decline effects I found have such a strong association with the work of one lab group. Clements et al. is a great paper. It’s very thorough; they check, and reject, a bunch of alternative explanations for their results. Even if you’re not a behavioral ecologist, you should read it.
Lots to unpack from Clements et al.! And I look forward to Costello & Fox.
Here’s a recent example of a “change” in effects. It’s not the same as a decline effect, partly because study designs seem to improve from the 2000s onwards.
Thanks for sharing that.
Can’t we un-obfuscate “decline effect”?!
I mean, really, all that is happening is that people are going back and re-evaluating past results, as we all should as part of the scientific process. It’s a bit sad that it is necessary to sensationalize every step of the process to get page views.
Even when that paper was published it was obviously problematic. I had used it in a class on methodology as an example of bad BE research practice and to underscore the importance of critical thinking. Just because Science published it doesn’t certify that the authors have done good work. And getting an article published, in general, is not certification that a particular article is true, or even well done.
Hi Jennifer,
With respect, I’m pretty confident my (and Clements et al.’s) use of the term “decline effect” is standard. It does *not* just mean “recent studies find somewhat different results than older studies”. For instance, see here: https://www.cell.com/trends/ecology-evolution/fulltext/S0169-5347(19)30158-2?rss=yes
I confess I’m amused by the suggestion that a blog that’s been largely inactive for months is sensationalizing the rare posts it still publishes in order to draw traffic. “Don’t post much” is about the worst possible strategy for drawing page views. 🙂 You will be reassured to know that this post is not in fact drawing many page views. It’s drawn a couple of hundred since it was published over two days ago. That’s less than half the traffic that even the most anodyne post of ours used to draw on just the first day, back when this blog was active. Nor has this post been shared widely on social media; it’s gotten just a couple of retweets.
“Even when that paper was published it was obviously problematic.”
Yes, I’m aware of that.