Shameless self-promotion alert: my lab’s new Ecology paper shows that the truth does not “wear off” in ecological research (at least, not usually)

Back in 2010, Jonah Lehrer wrote a big New Yorker feature called “The Truth Wears Off”. In it, he called attention to anecdotal observations that many of the effects and phenomena scientists study seem to shrink over time. Lehrer’s article popularized the term “decline effect” to summarize this pattern. Recently, some striking examples of the decline effect have been reported in ecology, such as the declining effect of ocean acidification on fish behavior. Further back, Jennions & Møller (2002) found that decline effects were ubiquitous in the (relatively few) ecological and evolutionary meta-analyses that had been published at the time.

Outstanding undergraduate Laura Costello and I decided to revisit the prevalence of decline effects in ecological research, using my quite comprehensive compilation of all the data from 466 ecological meta-analyses. We’re very excited that the paper is now online at Ecology. You should click through and read it (of course, I would say that!). But the tl;dr version is that the only common decline effect in ecology is in the decline effect itself. The truth no longer “wears off” in ecology, if it ever did. Decline effects might’ve been ubiquitous in ecological meta-analyses back in the 1990s, but they aren’t any more. Only ~3-5% of ecological meta-analyses exhibit a true decline in mean effect size over time (as distinct from regression to the mean, which happens even if effect sizes are published in random order over time). Read the paper if you’re curious about our speculations as to why decline effects are now rare in ecology.
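If the distinction between a true decline effect and regression to the mean isn’t intuitive, here’s a minimal simulation sketch in Python. This is my illustration, not an analysis from the paper; the effect size distribution and parameter values are made up. It shows that the magnitude of the cumulative mean effect size tends to fall over time even when effect sizes are published in random order:

```python
import numpy as np

rng = np.random.default_rng(42)

n_meta, n_studies = 10_000, 50
true_mean, sd = 0.2, 1.0   # made-up values, purely for illustration

# Effect sizes drawn iid, i.e. "published" in random order over time.
effects = rng.normal(true_mean, sd, size=(n_meta, n_studies))

# Cumulative mean effect size after each successive study.
cum_means = np.cumsum(effects, axis=1) / np.arange(1, n_studies + 1)

# Average magnitude of the cumulative estimate after 1, 5, 10, and 50
# studies: it shrinks toward |true_mean| even though nothing about the
# studies changes over time. That's regression to the mean, not a true
# decline in the underlying effect.
print(np.abs(cum_means).mean(axis=0)[[0, 4, 9, 49]])
```

Early on, the cumulative estimate is dominated by a few noisy studies, so it tends to be too extreme; adding more studies pulls it back toward the truth, which looks like a “decline” whenever the first few studies happened to overshoot.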

This is the third paper of mine that grew out of a blog post, which is my tissue-thin justification for sharing news of the paper in a blog post. 🙂

I think we need a “shrink ray” for estimated mean effect sizes in ecological meta-analyses, but I’m not sure how to build one. Can you help?

I’m guessing that most readers of this blog will be familiar with the concept of shrinkage estimation. But if not, here’s an example to give you the idea. Imagine you’re trying to estimate the true winning percentage of each team in a professional soccer league–the percentage of games each team would win if, hypothetically, it played each of the other teams many, many times. But it’s early in the season, and so each team has only played each of the other teams once. You could take each team’s observed winning percentage as an estimate of its unknown true winning percentage. But those estimates come from small samples of games, and the better team doesn’t necessarily win every game because chance events play a role. So observed winning percentages after just a few games are imprecise estimates of those unknown true winning percentages. Meaning that the variance among teams in their observed winning percentages surely overestimates the variance among teams in their unknown true winning percentages. In all likelihood, the team with the highest observed winning percentage so far is not only good, it’s also gotten lucky. It’s good, but likely not as good as its observed winning percentage suggests. And in all likelihood, the team with the lowest observed winning percentage so far is not only bad, it’s also gotten unlucky. It’s bad, but likely not as bad as its observed winning percentage suggests. Put another way, as the teams play more games, they’re likely to regress to the mean. So in the aggregate, you can improve your estimates of the teams’ true winning percentages if you shrink the observed winning percentages towards 50% (the average winning percentage). You make the bias-variance trade-off work in your favor by biasing all of your estimates towards the mean, in order to reduce their variance. There are ways to work out exactly how much shrinkage is optimal.
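For the statistically curious, here’s a minimal sketch in Python of one standard way to work that out: method-of-moments empirical Bayes shrinkage. Everything here (league size, number of games, the beta distribution of true winning percentages) is made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

n_teams, n_games = 20, 5            # early season: only a few games each
true_p = rng.beta(8, 8, n_teams)    # hypothetical true winning percentages

wins = rng.binomial(n_games, true_p)
obs_p = wins / n_games              # raw observed winning percentages

# How much of the observed spread among teams is real signal, and how
# much is binomial sampling noise from playing so few games?
target = 0.5                                   # league-average winning pct
sampling_var = target * (1 - target) / n_games
signal_var = max(obs_p.var(ddof=1) - sampling_var, 0.0)
shrink = signal_var / (signal_var + sampling_var)   # in [0, 1]

shrunk_p = target + shrink * (obs_p - target)

# On average, the shrunken estimates sit closer to the truth.
print("raw MSE:   ", np.mean((obs_p - true_p) ** 2))
print("shrunk MSE:", np.mean((shrunk_p - true_p) ** 2))
```

The shrinkage factor is the estimated fraction of the observed variance among teams that’s real signal rather than sampling noise: the fewer the games, the noisier the observed percentages, and the harder they get pulled towards 50%.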

I think we need shrinkage estimation for mean effect sizes in ecological meta-analyses. That is, I think many ecological meta-analyses provide very imprecise estimates of the unknown “true” mean effect size, so that, in aggregate, those estimated mean effect sizes would be improved if they were shrunk towards the mean. Here, see for yourself:

Continue reading

Science issues Expression of Concern for Dixson et al. 2014, a major behavioral ecology paper for which Science’s own reporters uncovered evidence of data fabrication

I continue to keep a close eye on developments in the various ongoing, high profile cases of apparent data fabrication in ecology. Retraction Watch has the news that Science has issued an Expression of Concern for Dixson et al. 2014. This was a high profile paper claiming to demonstrate strong deleterious effects of ocean acidification on fish behavior. The EoC is being issued in part thanks to Science’s own investigative reporting, which helped uncover evidence of apparent data fabrication.

Relatedly, Jeff Clements and colleagues just published a major new meta-analysis of effects of ocean acidification on fish behavior, revealing an absolutely massive decline effect. That is, early studies reported big effects, but subsequent studies have found basically squat. Further, those early studies reporting big effects are all by the same lab group, of which Danielle Dixson was a member. Drop the studies from that one lab group, and you’re left with studies that mostly report small or zero effects. Speaking as someone who just co-authored a paper that looks systematically for decline effects in 466 ecological meta-analyses, and mostly fails to find them (Costello & Fox, in press at Ecology), I can tell you that the decline effect in Clements et al. is enormous. I couldn’t find anything close to a comparable decline effect anywhere else in ecology. Nor do any of the other, weaker decline effects I found have such a strong association with the work of one lab group. Clements et al. is a great paper. It’s very thorough; they check, and reject, a bunch of alternative explanations for their results. Even if you’re not a behavioral ecologist, you should read it.

Prominent botanist Steven Newmaster seems to have faked and exaggerated his work for many years, according to a major investigation by Science (UPDATED)

Kudos to then-grad student Ken Thompson for speaking up about the anomalies that kicked off this investigation, and kudos to Science for pursuing the investigation so well.

I thought #pruittdata was jaw-dropping, and honestly I’m not sure even it measures up to this story. Jesus.

Hopefully Science’s investigation will make it hard for Guelph to sweep the matter under the rug, as they seem to be trying to do. I guess we’ll see. Doesn’t it depend how shameless the Guelph administration is? I mean, what if they just don’t care about people (including many senior members of their own staff) criticizing them in public?

I hadn’t realized that Science’s outstanding investigative journalism team is funded in part by a donation from emeritus biology prof Daniel Pinkel. That suggests a thought: maybe one way to improve investigations into scientific misconduct is to fund more investigative journalists. Obviously, that only works for high profile cases, but it’s something.

UPDATE: You can donate here to support Science’s investigative journalism. I just did. I hope you’ll consider it.

#pruittdata latest: another one bites the dust

Earlier this winter, former leading behavioral ecologist Jonathan Pruitt was placed on paid administrative leave by McMaster University, with no access to grant funds or students, and was suspended from his Canada 150 Chair. Now he’s lost yet another paper. Pruitt & Krauel 2010 J Evol Biol has been retracted. The retraction is due in large part to outstanding data forensics by Erik Postma. You should click through and read the whole thing.

I’m pretty sure this is the oldest paper of Pruitt’s to be retracted so far. According to Wikipedia, Pruitt now has 17 retractions (the 16 listed on Wikipedia as of this writing, plus the new one that was just announced) and 6 Expressions of Concern. Protip: don’t do science in a way that eventually results in your Wikipedia page looking like his…

See also Am Nat EiC Dan Bolnick’s brief but interesting remarks on the important role that journals continue to play in enforcing scientific integrity. Of course, some journals play that role more effectively than others. Not sure what some of the journals that put placeholder EoCs on Pruitt’s papers are waiting for. Looking at you, Nature and PNAS…

The Dead of Winter

I link to this every year. Key quote:

A society—a civilization, if you like—is a hard thing to hold together. If you live in an agrarian society, and you have only stone, wood, and bone for tools, and you are on the western edge of Europe, few times are harder than the dead of Winter. The days are at their shortest, the sun is far away, and the Malthusian edge is right in front of you. It’s no wonder so many religious festivals take place around the solstice. Here were a people, more than five millennia ago, able not only to pull through the Winter successfully, but able also to build something like a huge timepiece to remind themselves that they were going to make it.

Last minute Zoom seminar/discussion announcement: Tim Parker on reliability of ecological knowledge, today at noon Mountain Time!

Should’ve posted this earlier, but hopefully better late than never: I’m hosting a Zoom seminar at noon today by Tim Parker on the reliability of ecological knowledge and how we can improve it. Tim’s thought a lot about the “replication crisis”, whether it might apply to ecology, and what could be done about it. And also about openness, transparency, and best practices in scientific research. Tim’s going to talk for about 45 minutes, and there will be an extended discussion afterwards. All are welcome, hope you can join us. Zoom link and a brief advert below.

Speaker: Tim Parker from Whitman College in Washington State

Title: What undermines reliability in ecology and evolutionary biology and how can we improve?

Join Zoom Meeting: https://ucalgary.zoom.us/j/98959651766

Meeting ID: 989 5965 1766
Passcode: 919471

Context: In a variety of empirical disciplines, the last decade has seen a growing awareness that results may be less reliable than we might hope for or expect. Although we lack precise estimates of the reliability of results in ecology and evolutionary biology, there is evidence from these disciplines of a variety of practices that undermine empirical reliability. This presentation will review this evidence and discuss proposals to increase the reliability of research in ecology and evolutionary biology.

Background: Tim Parker is a behavioral ecologist with a long-standing interest in the reliability of results in ecology and evolutionary biology. He began investigating this issue more than 10 years ago, and has co-authored multiple reviews, opinion pieces, and empirical papers on this topic. He is a professor of biology and environmental studies at Whitman College in Washington State, and he is a founding member of SORTEE – the Society for Open, Reliable, and Transparent Ecology and Evolutionary Biology.

Two openings for ecology grad students in Jeremy Fox’s lab at the University of Calgary for Sept. 2022 or Jan. 2023

I am seeking two graduate students (MSc or PhD) to start in Sept. 2022 or Jan. 2023.

For background on my lab, visit my lab website. Briefly, my own work mostly involves modeling and experiments on population and community dynamics using laboratory-based microbial model systems. But some of my students have worked in other systems, including alpine plants, plant-pollinator interactions, bean beetles, fossil bivalves, and meta-analysis of data from the literature. So whether you want to join one of the ongoing lines of research in my lab, or have your own ideas, please do get in touch.

See my research page for more on ongoing research in my lab. Current lines of research include:

  • Spatial synchrony of population cycles. Why do population cycles and spatial synchrony often (but not always!) seem to go hand-in-hand?
  • Higher order interactions and species coexistence. Are communities more than just the sum of their parts, and if so, what implications does that have for species coexistence?
  • Studying ecologists, not just ecology. Using a huge database of effect sizes from 466 ecological meta-analyses to quantify what ecologists know about ecology (all of ecology!), and how fast they’re learning it.

The Department of Biological Sciences at the University of Calgary has a strong group of over a dozen ecologists and evolutionary biologists, with strength in depth in evolutionary ecology, population ecology, aquatic ecology, and other areas. The department has two field stations in the mountains, next-generation sequencing facilities, access to various high-performance computing clusters, and everything else you’d expect from a big, well-equipped research university.

Grad students in the department are guaranteed a minimum of $23,000/year through a mixture of TAships, RAships, and other sources like fellowships. In practice, students in my lab make more than the departmentally-guaranteed minimum.

Calgary is a city of about 1.3 million people, a 45-minute drive from the Canadian Rockies, with all the opportunities for field work and recreation that implies. I mean, look at these mountains!

If you’re interested, please email me ASAP (jefox@ucalgary.ca). Doesn’t have to be a super-long email. Just tell me a bit about your background, interests, and long-term goals, and about what specifically attracts you to my lab, and/or Calgary more broadly. Please also include a cv, transcripts (unofficial is fine), and contact details for three references.

#pruittdata latest: Did Jonathan Pruitt just quietly lose his Canada 150 Chair? (UPDATE: yes he has–temporarily, pending “further notice” from McMaster University)

McMaster University recently completed its investigation into serious accusations of scientific misconduct against prominent behavioral ecologist Jonathan Pruitt. My understanding is that Canadian universities have to notify federal scientific funding agencies of the outcomes of completed misconduct investigations, so that the funding agencies can take appropriate action. It looks like action may have been taken last week? Specifically, it looks like Pruitt may have lost his Canada 150 Research Chair:

Canada 150 Research Chairs are a small number of very prestigious (and well-funded) research chairs that the Canadian government funded to mark 150 years since Canadian federation. I just checked myself, and can confirm that Pruitt is no longer listed on the Canadian government’s webpage listing the Chairholders.

To which, if Pruitt has indeed lost his Canada 150 Research Chair, why has there been no public announcement? What useful purpose could possibly be served by lack of transparency around actions taken in response to a completed investigation? Conversely, is there not a public interest not only in institutions taking appropriate remedial and disciplinary actions when such actions are required, but in their being seen to do so?

UPDATE: Science’s news team followed up Nick’s tweet and got NSERC (the Canadian federal funding agency that administers the Canada 150 Chairs program) to issue a brief statement:

Click through for the full statement (it was short enough to quote in two tweets…), but the key points are that (i) McMaster did indeed inform NSERC that Pruitt has been placed on administrative leave, and (ii) NSERC has accordingly “temporarily suspended” Canada 150 Chair payments to Pruitt pending further notice from McMaster.

To which, I’m glad we now have at least a brief statement from NSERC. But why did we only get it in response to a question from a reporter from the world’s leading scientific journal? Obviously it would be silly for institutions to issue press releases about every little action they take. But personally I feel like the public interest and newsworthiness of this case rises to the level where it would be appropriate for the institutions concerned to issue public statements, without reporters having to ask for them. After all, that’s why there’ve been news stories about this case in Science that then got picked up in general media outlets like the Toronto Star and the CBC. Am I just being naive here?

And I remain mystified why McMaster still needs to provide “further notice” to NSERC. McMaster’s been investigating for 22 months. How can the investigation possibly still be at some sort of interim stage?

/end update