Jeremy Fox seeking 1-2 graduate students to start in Fall 2023 or Winter 2024

I am seeking 1-2 graduate students (M.Sc. or Ph.D.) to start in Fall 2023 or Winter 2024. My lab does fundamental research in population and community ecology. We also have a new line of metascience research: science about science. See here for more on my lab.

I’m at the ESA-CSEE joint meeting in Montreal right now. If you’re interested and are at the meeting as well, please reach out! jefox@ucalgary.ca

Institutional investigation finds star marine ecologist Danielle Dixson guilty of serial data fabrication

Science news story here.

I’m struck by both the similarities to, and the differences from, the Pruitt case.

An incomplete list of similarities:

-repeated data fabrication across numerous papers over many years, often taking the form of duplicated sequences of observations indicative of copying and pasting data

-current and former trainees of the accused were crucial to the investigation, going above and beyond to reveal the truth.

An incomplete list of contrasts:

-What gave Dixson away, in part, was the physical impossibility of her methods. It just wasn’t physically possible for her to have collected the data she claimed to have collected, in the time frame she claimed to have collected it, using the methods she claimed to have used. In contrast, I’m not aware of the Methods section of any of Pruitt’s papers describing a physical impossibility.

-Pruitt had no public defenders of any consequence, save for his own lawyers. In contrast, Dixson has–indeed, continues to have!–very vocal public defenders, including her own doctoral and postdoctoral supervisors and other prominent marine ecologists. Those defenders have defended Dixson not by addressing the specifics of the allegations against her (e.g., “Here’s why duplicated data X in paper Y don’t actually indicate fabrication”), but rather by (i) imagining that the whistleblowers have bad motives and attacking them for those purported bad motives, and (ii) talking about how hard-working, dedicated, and smart Dixson is. It’s immensely to the credit of Pruitt’s many former friends, trainees, and collaborators that all of them followed the evidence where it led.

-The University of Delaware’s institutional investigation into Dixson was much faster than McMaster University’s investigation into Pruitt.

I don’t know what larger lessons to draw from these similarities and differences, or even if any larger lessons should be drawn. I just find them striking.

#pruittdata latest (and last?): Jonathan Pruitt resigns from McMaster University (UPDATED)

Science news article here.

In the unlikely event that you have no idea what this is about, start here and say goodbye to your day.

I may blog about this later, or maybe not.

UPDATE: Nature has a new piece on the ongoing consequences of the Pruitt case for Pruitt’s trainees and collaborators. The linked piece illustrates that institutional investigations of scientific misconduct and other bad behavior aren’t designed to give closure to the main victims of misconduct (here, Pruitt’s current and former trainees and collaborators). I wish I had good ideas about how to change that, but I don’t. The piece also contains a bit of news that’s surprising to me–McMaster is going to continue the formal hearing process that surely would’ve resulted in Pruitt being fired, even though Pruitt has already resigned. The linked piece also has some new details on Pruitt himself, in case you care (personally, I don’t). Apparently he’s a high school science teacher at a Catholic school in Florida now. If you feel the urge to joke sarcastically about what he’ll do if he catches a student cheating on a test, well, you’re not alone. And, hilariously, Nature claims that it’s still investigating Pruitt’s Nature paper. That paper has yet to be retracted (it carries an expression of concern), despite overwhelming evidence of data fabrication. Yeah, sure you’re still investigating. /end update

Shameless self-promotion alert: my lab’s new Ecology paper shows that the truth does not “wear off” in ecological research (at least, not usually)

Back in 2010, Jonah Lehrer wrote a big New Yorker feature called “The Truth Wears Off”. In it, he drew attention to anecdotal observations that many of the effects and phenomena scientists study seem to shrink over time. Lehrer’s article popularized the term “decline effect” to summarize this pattern. Recently, some striking examples of the decline effect have been reported in ecology, such as the declining effect of ocean acidification on fish behavior. Further back, Jennions & Møller (2002) found that decline effects were ubiquitous in the (relatively few) ecological and evolutionary meta-analyses that had been published at the time.

Outstanding undergraduate Laura Costello and I decided to revisit the prevalence of decline effects in ecological research, using my quite comprehensive compilation of all the data from 466 ecological meta-analyses. We’re very excited that the paper is now online at Ecology. You should click through and read it (of course, I would say that!). But the tl;dr version is that the only common decline effect in ecology is in the decline effect itself. The truth no longer “wears off” in ecology, if it ever did. Decline effects might’ve been ubiquitous in ecological meta-analyses back in the 1990s, but they aren’t any more. Only ~3-5% of ecological meta-analyses exhibit a true decline in mean effect size over time (as distinct from regression to the mean, which happens even if effect sizes are published in random order over time). Read the paper if you’re curious about our speculations as to why decline effects are now rare in ecology.
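To illustrate the distinction drawn in that parenthetical, here’s a toy simulation of my own (a sketch, not the analysis in Costello & Fox): if effect-size estimates have a constant true mean and are published in random order, a regression of effect size on publication order shows no trend on average, yet later studies still regress back toward the mean whenever the first published estimate happened to land high.

```python
import numpy as np

# Toy simulation (invented numbers, not data from the paper): many simulated
# "meta-analyses", each with a constant true mean effect and estimates
# published in random order, i.e., no true decline over time.
rng = np.random.default_rng(1)
n_meta, n_studies = 5000, 25
true_mean, sampling_sd = 0.3, 0.2
order = np.arange(n_studies)

slopes, drops_after_high_start = [], []
for _ in range(n_meta):
    effects = rng.normal(true_mean, sampling_sd, n_studies)  # random order
    slopes.append(np.polyfit(order, effects, 1)[0])          # time trend
    if effects[0] > true_mean:                                # lucky first study
        drops_after_high_start.append(effects[0] - effects[1:].mean())

# No true decline: the average time trend is ~0. But conditional on a high
# first estimate, later studies sit lower on average -- regression to the mean.
print("mean slope of effect size on publication order:", np.mean(slopes))
print("mean drop from a high first estimate to later studies:",
      np.mean(drops_after_high_start))
```

The point of the sketch is just that a drop relative to an extreme early estimate is not, by itself, evidence of a true decline in the mean effect size over time.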

This is the third paper of mine that grew out of a blog post, which is my tissue-thin justification for sharing news of the paper in a blog post. 🙂

I think we need a “shrink ray” for estimated mean effect sizes in ecological meta-analyses, but I’m not sure how to build one. Can you help?

I’m guessing that most readers of this blog will be familiar with the concept of shrinkage estimation. But if not, here’s an example to give you the idea. Imagine you’re trying to estimate the true winning percentage of each team in a professional soccer league–the percentage of games each team would win if, hypothetically, it played each of the other teams many, many times. But it’s early in the season, and so each team has only played each of the other teams once. You could take each team’s observed winning percentage as an estimate of its unknown true winning percentage. But those estimates come from small samples of games, and the better team doesn’t necessarily win every game because chance events play a role. So observed winning percentages after just a few games are imprecise estimates of those unknown true winning percentages. Meaning that the variance among teams in their observed winning percentages surely overestimates the variance among teams in their unknown true winning percentages. In all likelihood, the team with the highest observed winning percentage so far is not only good, it’s also gotten lucky. It’s good, but likely not as good as its observed winning percentage suggests. And in all likelihood, the team with the lowest observed winning percentage so far is not only bad, it’s also gotten unlucky. It’s bad, but likely not as bad as its observed winning percentage suggests. Put another way, as the teams play more games, they’re likely to regress to the mean. So in the aggregate, you can improve your estimates of the teams’ true winning percentages if you shrink the observed winning percentages towards 50% (the average winning percentage). You make the bias-variance trade-off work in your favor by biasing all of your estimates towards the mean, in order to reduce their variance. There are ways to work out exactly how much shrinkage is optimal.
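To make that concrete, here’s a minimal numerical sketch (invented numbers, nothing from the post) of one simple way to do the shrinkage, using a method-of-moments estimate of the between-team variance:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical league, numbers invented for illustration: each team's true
# winning percentage sits near 0.5, but we only observe its record after
# 19 games (one game against each of the 19 other teams).
n_teams, n_games = 20, 19
true_p = rng.normal(0.5, 0.08, n_teams)              # unknown "true" winning pcts
observed_p = rng.binomial(n_games, true_p) / n_games

# Method-of-moments shrinkage: the variance of the observed percentages
# overstates the variance of the true percentages by roughly the average
# binomial sampling variance, so subtract that off and shrink accordingly.
grand_mean = observed_p.mean()
sampling_var = np.mean(observed_p * (1 - observed_p) / n_games)
between_var = max(observed_p.var(ddof=1) - sampling_var, 0.0)
weight = sampling_var / (sampling_var + between_var)  # how hard to shrink
shrunk_p = weight * grand_mean + (1 - weight) * observed_p

# In aggregate, the shrunken estimates should land closer to the truth.
def rmse(est):
    return np.sqrt(np.mean((est - true_p) ** 2))

print("RMSE of raw winning percentages:     ", round(rmse(observed_p), 4))
print("RMSE of shrunken winning percentages:", round(rmse(shrunk_p), 4))
```

The same bias-variance logic carries over to estimated mean effect sizes, with each meta-analysis playing the role of a team.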

I think we need shrinkage estimation for mean effect sizes in ecological meta-analyses. That is, I think many ecological meta-analyses provide very imprecise estimates of the unknown “true” mean effect size. So that, in aggregate, those estimated mean effect sizes would be improved if they were shrunk towards the mean. Here, see for yourself:

Continue reading
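The rest of that post isn’t excerpted here, but to give a flavor of what a “shrink ray” might look like mechanically, here’s one textbook-style possibility (my own sketch with invented numbers, not the author’s proposal): treat each meta-analysis’s estimated mean effect size and its standard error as the data, estimate the between-meta-analysis variance, and compute empirical-Bayes shrunken estimates.

```python
import numpy as np

# Invented toy inputs: each meta-analysis reports an estimated mean effect
# size with a standard error. (In reality the 466 meta-analyses use different
# effect-size metrics, a complication this sketch ignores.)
est = np.array([0.80, 0.10, -0.30, 0.55, 0.05])   # estimated mean effect sizes
se = np.array([0.40, 0.05, 0.35, 0.25, 0.10])     # their standard errors

w = 1 / se**2
grand_mean = np.sum(w * est) / np.sum(w)

# DerSimonian-Laird-style estimate of the between-meta-analysis variance tau^2
q = np.sum(w * (est - grand_mean) ** 2)
tau2 = max((q - (len(est) - 1)) / (w.sum() - (w**2).sum() / w.sum()), 0.0)

# Empirical-Bayes shrinkage: noisier estimates get pulled harder toward the mean
shrink = se**2 / (se**2 + tau2)
shrunk = shrink * grand_mean + (1 - shrink) * est
print("grand mean:", round(float(grand_mean), 3))
print("shrunken estimates:", np.round(shrunk, 3))
```

Whether that is actually the right way to build the shrink ray (for instance, how to handle the fact that different meta-analyses estimate different quantities on different scales) is exactly the open question the post is asking about.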

Science issues Expression of Concern for Dixson et al. 2014, a major behavioral ecology paper for which Science’s own reporters uncovered evidence of data fabrication

I continue to keep a close eye on developments in the various ongoing, high profile cases of apparent data fabrication in ecology. Retraction Watch has the news that Science has issued an Expression of Concern for Dixson et al. 2014. This was a high profile paper claiming to demonstrate strong deleterious effects of ocean acidification on fish behavior. The EoC is being issued in part thanks to Science’s own investigative reporting, which helped uncover evidence of apparent data fabrication.

Relatedly, Jeff Clements and colleagues just published a major new meta-analysis of effects of ocean acidification on fish behavior, revealing an absolutely massive decline effect. That is, early studies reported big effects, but subsequent studies have found basically squat. Further, those early studies reporting big effects are all by the same lab group, of which Danielle Dixson was a member. Drop the studies from that one lab group, and you’re left with studies that mostly report small or zero effects. Speaking as someone who just co-authored a paper that looks systematically for decline effects in 466 ecological meta-analyses, and mostly fails to find them (Costello & Fox, in press at Ecology), I can tell you that the decline effect in Clements et al. is enormous. I couldn’t find anything close to a comparable decline effect anywhere else in ecology. Nor do any of the other, weaker decline effects I found have such a strong association with the work of one lab group. Clements et al. is a great paper. It’s very thorough; they check, and reject, a bunch of alternative explanations for their results. Even if you’re not a behavioral ecologist, you should read it.

Prominent botanist Steven Newmaster seems to have faked and exaggerated his work for many years, according to a major investigation by Science (UPDATED)

Kudos to then-grad student Ken Thompson for speaking up about the anomalies that kicked off this investigation, and kudos to Science for pursuing the investigation so well.

I thought #pruittdata was jaw-dropping, and honestly I’m not sure it measures up to this story. Jesus.

Hopefully Science’s investigation will make it hard for Guelph to sweep the matter under the rug, as they seem to be trying to do. I guess we’ll see. Doesn’t it depend how shameless the Guelph administration is? I mean, what if they just don’t care about people (including many senior members of their own staff) criticizing them in public?

I hadn’t realized that Science’s outstanding investigative journalism team is funded in part by a donation from emeritus biology prof Daniel Pinkel. That suggests a thought: maybe one way to improve investigations into scientific misconduct is to fund more investigative journalists. Obviously, that only works for high profile cases, but it’s something.

UPDATE: You can donate here to support Science’s investigative journalism. I just did. I hope you’ll consider it.

#pruittdata latest: another one bites the dust

Earlier this winter, former leading behavioral ecologist Jonathan Pruitt was placed on paid administrative leave by McMaster University, with no access to grant funds or students, and was suspended from his Canada 150 Chair. Now he’s lost yet another paper. Pruitt & Krauel 2010 J Evol Biol has been retracted. The retraction is due in large part to outstanding data forensics by Erik Postma. You should click through and read the whole thing.

I’m pretty sure this is the oldest paper of Pruitt’s to be retracted so far. According to Wikipedia, Pruitt now has 17 retractions (the 16 listed on Wikipedia as of this writing, plus the new one that was just announced) and 6 Expressions of Concern. Protip: don’t do science in a way that eventually results in your Wikipedia page looking like his…

See also Am Nat EiC Dan Bolnick’s brief but interesting remarks on the important role that journals continue to play in enforcing scientific integrity. Of course, some journals play that role more effectively than others. Not sure what some of the journals that put placeholder EoCs on Pruitt’s papers are waiting for. Looking at you, Nature and PNAS…

The Dead of Winter

I link to this every year. Key quote:

A society—a civilization, if you like—is a hard thing to hold together. If you live in an agrarian society, and you have only stone, wood, and bone for tools, and you are on the western edge of Europe, few times are harder than the dead of Winter. The days are at their shortest, the sun is far away, and the Malthusian edge is right in front of you. It’s no wonder so many religious festivals take place around the solstice. Here were a people, more than five millennia ago, able not only to pull through the Winter successfully, but able also to build something like a huge timepiece to remind themselves that they were going to make it.