Data on the life histories of ecological research programs (and their meta-analyses)

Scientific research programs have life histories. They’re born with the publication of the first paper on the topic. They grow to adulthood as interest in the topic grows. They age as interest in the topic fades. They die once the last paper on the topic is published.

Previously, I’ve posted data on the life histories of some of the most influential ideas in ecological history, as measured by citation data. But those life histories presumably are atypical. What’s the life history of a typical ecological research program? What’s the typical growth rate? The typical age at maturity? The typical lifespan?

The lives of many ecological research programs are summarized by meta-analyses. Or at least, they’re summarized in part, since meta-analyses can be published before a research program dies. When are meta-analyses typically published in the life of an ecological research program? Are meta-analyses like obituaries: summaries of research programs that are now dead? Are they like retirement ceremonies: summaries of aging research programs whose most productive days are behind them? Or are they like high school yearbooks: summaries of young research programs that have most of their lives to look forward to?

Here are the data!

To get data on this, I went back to my massive compilation of ecological meta-analyses. For each of 466 meta-analyses, I have a record of when each primary research paper contributing to the meta-analysis was published. I went back to a haphazardly-selected subset of about 80 meta-analyses, and counted up how many primary research papers in the meta-analysis were published each year.

That is, I’m defining “ecological research program” as “a set of primary research papers that was summarized in a meta-analysis”, and I’m quantifying the life of the research program in terms of primary research papers. A research program is born with publication of the first paper that gets included in the subsequent meta-analysis. It matures (or if you prefer, peaks) in the year in which the most primary research papers are published (or the last such year, if there’s a tie). In quantifying research interest with raw publication counts, rather than counts scaled relative to the total number of ecology papers published each year, I’m quantifying the absolute amount of interest in a research program, not interest relative to interest in all other ecological topics.

Note that I didn’t quantify age at maturity for any research program for which no more than two primary research papers were ever published in any single year; you can’t quantify age at maturity for a research program that never develops much interest. Figuring out when a research program died would be difficult with these data, and in any case it’s clear from the data that most research programs aren’t dead yet when their meta-analyses are published. So I didn’t bother trying to quantify research program lifespan.
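
For concreteness, here’s a minimal sketch of those life-history metrics as code. This is my own illustrative reconstruction, not the author’s actual analysis; the function name, the toy data, and the cutoff parameter are all assumptions based on the rules described above (birth = first publication year, maturity = last year achieving the max annual count, excluded if no year reaches 3+ papers).

```python
# Hypothetical sketch of the life-history metrics described in the post.
# pub_years: publication year of each primary study in one meta-analysis.
from collections import Counter

def life_history(pub_years, min_peak=3):
    """Return (birth year, age at maturity, peak annual count).

    Age at maturity is the number of years from birth to the *last*
    year achieving the max annual count (the post's tie-breaking rule).
    It is None if no single year reaches min_peak papers, mirroring the
    post's exclusion of programs that never exceeded 2 publications/year.
    """
    counts = Counter(pub_years)          # papers published per year
    birth = min(counts)                  # year of the first paper
    peak_count = max(counts.values())    # max papers in any one year
    if peak_count < min_peak:
        return birth, None, peak_count
    # last year achieving the max, per the tie-breaking rule
    peak_year = max(yr for yr, n in counts.items() if n == peak_count)
    return birth, peak_year - birth, peak_count

# Toy example (made-up data, not from the actual compilation):
years = [1995, 1998, 1998, 2001, 2001, 2001, 2004, 2004, 2004, 2006]
birth, age_at_maturity, peak = life_history(years)
```

In the toy example, 2001 and 2004 tie at 3 papers apiece, so the program “matures” in 2004, nine years after its 1995 birth.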

To anyone who wants to quibble with how I defined “research programs” or how I quantified their life histories: relax. This is just a fun, conversation-starter blog post. Live a little. 🙂

Here are illustrative results for four typical-ish meta-analyses:

Blue: a meta-analysis from Winfree et al. 2009 Ecol. Orange: a meta-analysis from Daskin & Pringle 2016 JAE. Grey: a meta-analysis from Whitlock 2014 J Ecol. Green: a meta-analysis from Eziz et al. 2017 Ecol Evol.

You’ll notice several things. First, research programs generally start with little interest, as measured by annual number of publications; interest grows in fits and starts over time. Second (and this is something I’ve talked about before), interest in most research programs never gets all that high. The median research program in this compilation of ~80 maxes out at just 6 publications per year! (And remember, that’s actually an overestimate for ecological research programs as a whole, because I’m ignoring data from research programs that never exceeded 2 publications/year.) Third, the max (or the last occurrence of the max) usually comes close to, but not at, the very end of the time series. Specifically, the median peak occurs 2.5 years before the end of the time series (the mode is 1 year and the mean is 3.7 years; zero years is rare). Since the typical time series is a bit more than 20 years long, that means the typical ecological research program peaks about 20 years after it begins. Though that’s surely a bit of an underestimate of median age at maturity, because these data are truncated by the date of meta-analysis publication. Presumably, if those meta-analysis authors had waited longer before doing their meta-analyses, some of these research programs would’ve reached later, higher peaks. So the age at maturity of the typical ecological research program is a bit more than 20 years.

Of course, the max provides a noisy estimate of the peak of research interest. Probably, the “peak” for many research programs would be better described as a plateau. That’s hard to quantify. But just eyeballing all of the 80ish time series, it looks to me like, in cases where there is a reasonably distinct peak or plateau of research interest, there’s almost always just the one, and it doesn’t usually last more than 5-10 years before decline sets in. And while I haven’t shown graphs of this, in those cases where the time series includes what looks like a near-complete decline phase, the post-plateau decline phase is shorter and steeper than the pre-plateau increase phase. Ecological research programs decline faster than they grow.

Finally, ecological meta-analyses typically end their data compilation at the peak of a research program, or else a few years after the peak. Although the data truncation issue means that a minority of ecological meta-analyses may have been published well before the research program would’ve peaked in the absence of the meta-analysis.*

*I say “would have peaked in the absence of the meta-analysis” because it’s possible that publication of a meta-analysis changes a research program’s trajectory in some cases. Either by encouraging or discouraging further studies of the topic.

7 thoughts on “Data on the life histories of ecological research programs (and their meta-analyses)”

  1. So one possible interpretation of this is that the exponential growth of the scientific literature swamps individual research programs. At a minimum, that seems to me like a null hypothesis that needs rejecting. Sort of a neutral theory of research programs: one could start from the assumption that every paper published in ecology is effectively randomly assigned to one of the research programs (aka species) out there, and that there is a random process of birth and death of research programs (species).

    On the other hand, your study of citations shows a clear peak and decline. But that is a bit different (as you say) in that it tracks references to a single paper. To be consistent with the last paragraph would require that research programs don’t necessarily fade away; they just gradually transition to anchoring themselves in new papers, which seems very possible to me.

    All of which raises for me a compelling question. Do research programs truly ever die? Or do they just change names and anchor citations but continue to collect the same kinds of data as older research? Optimal foraging is an interesting example. I think most people would say optimal foraging has faded (and some, like me, would say that’s too bad). But there is still an awful lot of data being collected on diet choice, habitat choice, etc., both in the field and in the lab. Reminds me of my first day of graduate school, when my cohort was in a formal setting, each of us briefly introducing ourselves to each other, and about half my cohort said “I study species interactions”, and I remember wondering why they weren’t just saying they were community ecologists. So, second hypothesis: research programs never die, they just relabel themselves. Which is how you can get exponential growth over decades of papers having data suitable for being lumped under one question for meta-analysis.

    • And re-reading your post, it is clear that you interpret a drop at the end, while I do not. It seems to me those drops span only 1 to 2 years and represent a delay in discovering recent papers (and/or papers published while the meta-analysis is in prep/publication). In short, I see exponential growth with a “pull of the past” bias towards finding papers for a meta-analysis.

      • Yes, you’re absolutely right that, in some cases, the apparent peak near the end of the time series is probably an artifact of papers being published while a meta-analysis was in prep/in review/in press. Hard to say exactly how common that is. I did have a bit of a look at when the meta-analyses were published, relative to when the last study in the meta-analysis was published, and didn’t find any relationship with the timing of the peak. It’s not that peaks 1 year before the end of the time series are all associated with meta-analyses that were published 1-2 years after the final study included in the meta-analysis. So I dunno. You could well be right that it’s most common for meta-analyses to be published while research interest in the topic is still on the rise, rather than just after interest has peaked.

    • Interesting hypothesis that research programs never die, they just get relabeled. I think there’s some truth to that, but I’m not sure how much. Take your optimal foraging example: yes, lots of people still study diet choice and habitat choice. But how much of that work asks whether those choices are optimal?

    • “So one possible interpretation of this is that the exponential growth of scientific literature swamps individual research programs. ”

      Yes, agreed, I tried to make that clear in the post but I guess I didn’t do a great job of it. That’s what I was trying to get at when I noted that I’m measuring absolute growth in research interest/activity associated with a given topic, not interest/activity relative to all ecological topics. If overall research activity (as measured by publications) is growing, then research interest in most topics likely is growing. But I’m not sure how useful it would be to try to get at this with a null model. Research interest in most of these topics is so low overall, and so noisy (zero pubs one year, four pubs the next, back down to 1 pub the year after that…), that it’s not going to be very useful to try to fit growth curves. You won’t be able to distinguish alternative candidate growth curve models (linear, exponential, etc.). And you won’t get very precise parameter estimates for whichever growth model you choose as your null model. So you won’t be able to say much about which research topics experienced greater-than-average growth in interest over time. I mean, you’d be able to say a bit–but not much more than you could say just by eyeballing the time series.

      • I agree with Brian about standardizing for overall publication effort per year (a pet peeve of mine with citation analyses). It seems as if your question is an attempt to represent a research program as a proportion of overall research effort in ecology or some larger subdiscipline (eco-physiology, community ecology), which is pretty easy to do by searching a few general keywords to identify the denominator. Otherwise, the main signal is likely to be an overall-effort signal, which has grown exponentially over time.

  2. Interesting material!

    Actually, it makes sense to me that once enough studies have accumulated for a meta-analysis (say, 15-20), people would say to themselves, “well, adding one more to that isn’t going to make a breakthrough, and most likely won’t change the answer much.” In a way, meta-analyses do kill a topic!

    But as Brian suggests, what you’d do next is refocus or shift the angle on the question somehow.
