Recent ecological meta-analyses don’t cover longer time periods than older ones. Which is kind of interesting.

Via my fairly comprehensive database of 476 ecological meta-analyses: here’s the year range covered by each meta-analysis (publication year of the oldest paper included in the meta-analysis up to the publication year of the most recent paper), as a function of the meta-analysis publication year. Note that some data points are actually several identical data points, because I was too lazy to jitter the plot:
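
(For anyone reproducing a plot like this: here's a minimal jitter sketch. The data below are made up purely for illustration; only the jittering step is the point. Pure standard-library Python, no plotting dependency.)

```python
import random

random.seed(0)

# Made-up illustration data: x = meta-analysis publication year,
# y = year range covered. Exact duplicate (x, y) pairs stack when plotted raw.
points = [(random.randint(1995, 2020), random.randint(10, 30))
          for _ in range(476)]

def jittered(pts, width=0.3):
    """Nudge each point by small uniform noise so identical points separate."""
    return [(x + random.uniform(-width, width),
             y + random.uniform(-width, width))
            for x, y in pts]

plot_ready = jittered(points)
# Pass plot_ready to your favorite scatter-plot routine.
```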

As you can see, there’s no trend. More recent meta-analyses do not cover longer time spans than older meta-analyses.* Ecological meta-analyses typically include papers published over a 10-30 year period, and always have.

Which I think is kind of interesting. After all, more recent meta-analyses have a longer time span of ecology they could draw from, but they don’t. Why not?

I suspect it’s because relevant papers typically only go back 10-30 years. That is, I suspect that it typically takes 10-30 years for ecologists to publish “enough” papers to make a meta-analysis worth conducting (or what meta-analysts see as “enough” papers). Here’s a graph of the number of papers (“studies”) included in ecological meta-analyses, as a function of meta-analysis publication year:

As you can see, the typical ecological meta-analysis includes 10-30 papers or so, and always has. Once there are “enough” papers, somebody’s going to publish a meta-analysis of them.**

If that speculation is right, then we can infer the typical rate at which ecologists collectively publish papers on “meta-analysis-sized” topics. It’s typically 1-2 papers per year on average, over the first 1-3 decades of research on the topic. Which sounds like a pretty low publication rate when you put it like that!***
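
(To spell out the arithmetic behind that inference, here's a back-of-the-envelope sketch. It uses only the rounded ranges quoted above, not values from my database.)

```python
# Implied per-topic publication rate: papers included in a typical
# meta-analysis, divided by the span of years those papers cover.
def implied_rate(n_papers, year_span):
    return n_papers / year_span

# Typical ranges from the post: 10-30 papers spanning 10-30 years.
slowest = implied_rate(10, 30)  # 10 papers over 30 years
typical = implied_rate(20, 20)  # midpoints of both ranges
fastest = implied_rate(30, 10)  # 30 papers over 10 years
```

The midpoint case works out to about 1 paper per year, consistent with the 1-2 papers/year figure above; the extreme combinations bracket it at roughly 0.3 to 3 papers per year.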

An alternative possibility is that meta-analysts tend to overlook older papers. But I find that possibility hard to square with the data. Why would all meta-analysts, regardless of when they were working, tend to overlook papers that were more than 10-30 years old at the time? I mean, it’s not as if Web of Science’s coverage only goes back to 30 years before whatever the current year is!

Does any of this surprise you? Interest you? Bore the socks off you? Or what? Looking forward to your comments, as always.

*And in case you were wondering: the most recent paper in a meta-analysis generally was published 0-5 years before the meta-analysis. That gap hasn’t grown, or shrunk, over time.

**Note that many papers include multiple effect sizes, so most meta-analyses include more effect sizes than papers. The median ecological meta-analysis includes about 60 effect sizes, and that median hasn’t really moved much over time.

***It would be fun to keep pursuing this line of thought. Can you infer something about whether the breadth of “meta-analysis-sized” topics is increasing or decreasing over time? And the annual rate at which ecologists publish papers keeps increasing. Can we tell from the available data if that increase is due to ecologists publishing on a growing range of topics, as opposed to publishing more papers/year on any given topic?

11 thoughts on “Recent ecological meta-analyses don’t cover longer time periods than older ones. Which is kind of interesting.”

  1. At the grad program where I was a postdoc, PhD students are required to perform a review or meta-analysis during their PhD, so I evaluated some proposed and finished reviews. One thing I noticed is that they often deliberately restrict the search and include only recent papers, for instance searching only for publications more recent than 1990 (so, about 30 years). My impression is that a commonly stated reason for this is that otherwise there is too much material to go through, and also that older papers are harder to access. So I tend to have more confidence in your alternative explanation.
    Next time I’m evaluating a PhD project that includes meta-analyses I’ll probably say “hey, did you read those posts by Jeremy? He shows that we do not have enough effect sizes, so please don’t restrict your search to the last 30 years!”

    • Hmm. All I can say is, published meta-analyses rarely say “the search was restricted to the last X years”.

      And yes, you should tell your students not to restrict their searches in that way! 🙂

  2. I applaud your astute insights, Jeremy. I would never have thought to examine trends as you have. I do believe there is a distinct bias for “What have you done for me lately?”. Meaning, we’ve had it ingrained into our grey matter that recent science is good while old science is flawed. That bias could account for what you’ve observed.

    Another aspect might relate to terminology & semantics. I once considered repeating a floristic study published in 1898… Ultimately, I decided against it because so many of the species identified in that monograph had changed their taxonomic status multiple times. There were also several instances where we could not discern with confidence whether those taxonomic alterations would identify the same species as described in 1898 v. 2009. These kinds of issues might steer investigators away from older publications.

    • “I do believe there is a distinct bias for “What have you done for me lately?”. Meaning, we’ve had it ingrained into our grey matter that recent science is good while old science is flawed. That bias could account for what you’ve observed.”

      Hmm. I see where you’re coming from. But when you read recent ecological meta-analyses, they usually describe exactly how they searched for candidate studies to include. And those search methods and inclusion criteria never include a time cut-off. No meta-analysis ever restricts its search to papers published within the last X years.

      Re: the issue with terminology, you might be onto something there. Perhaps there are many studies >30 years old from which a contemporary meta-analyst could extract effect sizes, but those old studies used different terminology than we use these days. So those old studies don’t come up in the meta-analyst’s search results. Or the meta-analyst has a quick look at the old studies and discards them as “not relevant”, because they don’t use current terminology. If that’s right, then it implies that ecologists “put old wine in new bottles” by changing their terminology every 10-30 years or so. I honestly have no idea if that’s part of what’s going on, or how you could tell that it was. Any ideas on how to test that?

      • You made me think on it, Jeremy! Perhaps one way to test for it would be to distribute a database of 100 years of publications for a particular topic. Then, ask participants to select those they deem worthy of a meta-analysis.

        If there were a bias for more recent studies, then you could conduct another test by changing the dates on some of the papers in the database (if it were ethical). If participants excluded essentially the same papers as the initial group had, then you could conclude that year of publication was not a factor, but rather something inherent to the papers themselves.

        From there, I’d hire a statistician to design a study to determine the likely factor(s) for the bias.

  3. I would like to see another analysis about the socioeconomic and demographic background of scientists doing ‘older’ ecological studies versus more recent ones. My prediction is that ‘older’ scientists could afford to do longer studies because they were more privileged (richer, whiter, mostly male) scientists who were probably independently wealthy. This allowed them a freedom to do longer ‘riskier’ experiments that are no longer tenable for ecologists who come from less privileged backgrounds who know they need to publish or perish.

    • Thank you for your comments, but with respect I find them puzzling. I’m afraid that some aspects of your comments are way off-base.

      Few ecologists have ever been independently wealthy. Certainly not the ecologists who did the vast majority of primary studies that have ended up in ecological meta-analyses. I just went back and checked my compilation of meta-analyses, and 96% of all effect sizes in my compilation were published in 1990 or later. Take it from someone who started grad school in 1995: ecological research was NOT dominated by independently wealthy scientists *in the 1990s*, never mind later than that! And “publish or perish” was a thing in ecology for decades before 1990! (“Publish or perish” probably started to become a thing in N. American ecology in the late 1960s, and really took off in the early 1980s.)

      It certainly is true that academic ecological research in North America and Europe has long been dominated by white men from middle class or affluent backgrounds. But with respect, where did you get the idea that it was ever dominated by independently wealthy people whose wealth gave them independence from the “publish or perish” system?

      Not clear what you mean by study “riskiness”. So I’m not clear why you think longer-term studies are “riskier” than shorter-term ones.

      It would be interesting to compile data on study duration and how it has changed over time. Just offhand, I doubt that the median ecological study duration is appreciably shorter these days than it was several decades ago. Various aspects of ecological research have changed not at all or only very slowly over time (Carmel et al. 2013). I’m not sure why study duration or “riskiness” would have changed radically even as many other aspects of ecological research changed little.

    • If anything, Ben, most faculty are constrained not by money, but by time. They are obliged to teach, serve on departmental committees, apply for grants, peer review, research & publish. There are only so many hours in the day…

      If anything, those ‘older’ scientists who “could afford to do longer studies because they were more privileged” would likely opt to do little to no research, assuming they have tenure and can wield the kind of power you believe they possess. Meaning, any person with limitless authority & wealth would be more inclined to work less, not more.

      In terms of the length of studies, it is rare to receive funding for any given project for more than 5 years. While it is true that a wealthy person could bankroll their own research program, it is unlikely to happen because that person would essentially return their salary to their institution.

      As for long term studies, ecology is one of very few disciplines where they can be done with little to no funding. Often, researchers can access long term data gathered by state & federal governments, free of charge. They can also access long term data from a plethora of free online databases. Lastly, many ecological research programs have few costs involved, especially field programs. In this scenario, faculty have access to institutional infrastructure like transportation, student labor, core labs, and so on, which are provided to them free of charge.

  4. Pingback: Poll results: here’s what (some) ecologists think about retracting old and superseded papers | Dynamic Ecology
