A (crude) statistical profile of the research productivity of recently-hired N. American ecology asst. profs (UPDATED with additional data)

For the past three years I’ve compiled various bits of data on as many newly-hired tenure-track asst. professors of ecology (and allied fields) in N. America as I possibly can. Here are the data on the Google Scholar h-indices of new hires from the 2016-17 and 2017-18 job seasons. I found Google Scholar pages for 268 people, and checked their h-indices at or around the time they were hired (fall 2017 for the 2016-17 hires, summer and early fall 2018 for the 2017-18 hires). (Aside: back in 2015-16 I did the same for a haphazard subsample of the new hires; those data are summarized briefly here.)

This post has been edited to better incorporate the 2017-18 data, which reinforce the conclusions of the previous version of the post based only on 2015-16 and 2016-17 data.

I’ll emphasize right up front that the h index is an extremely crude summary measure of research productivity. Perhaps its biggest limitation is that it gives an individual the same credit for being sole author of a paper as for being one middle author among many. For this reason and others, I highly doubt that most faculty searches actually involve looking at applicants’ h indices, though some searches might look at other things that are loosely correlated with applicants’ h indices (e.g., whether the applicant has papers in leading journals). My only goal in this post is to give a very rough sense of what level of research productivity is required to be competitive for a tenure-track faculty position in ecology at different sorts of N. American institutions.
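(For anyone who hasn’t computed one: your h index is the largest number h such that you have h papers with at least h citations each. Here’s a minimal Python sketch with made-up citation counts, just to make the definition concrete; note that the calculation uses nothing but one citation count per paper, which is exactly why sole-authored and middle-authored papers count the same.)

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Made-up citation counts for illustration only.
print(h_index([52, 30, 14, 9, 7, 4, 2, 1, 0]))  # -> 5
```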

Note as well that some recent hires don’t have Google Scholar pages. That’s especially true for recent hires at smaller and less research-intensive institutions. People without Google Scholar pages likely tend to have lower h indices than people with Google Scholar pages. And as you’ll see, recent hires at less research-intensive institutions tend to have lower h indices than recent hires at more research-intensive institutions. So my data likely are an upwardly-biased estimate of the typical h indices of recent ecology hires.

Here, for posterity, is a boxplot for just the 124 2016-17 hires, broken down by Carnegie class. Institution types are in increasing order of research intensiveness. But you probably don’t want to look at that; you probably want to look at the updated graph below, with two years’ worth of data.

[Figure: boxplot of Google Scholar h-indices of the 124 2016-17 hires, by Carnegie class of the hiring institution.]

Class “BC” is actually all bachelor’s-only institutions lumped together. The sample size for M3 institutions is one, so you should probably just ignore that box.

UPDATE: Below are the combined data for 2016-17 and 2017-18, a total of 267 new hires. “Other” includes one hire at a TC institution, and several hires at institutions that lack an official Carnegie classification (mostly but not entirely Canadian institutions) and that I wasn’t sure how to classify myself.

[Figure: boxplot of Google Scholar h-indices of the combined 2016-17 and 2017-18 hires, by Carnegie class of the hiring institution.]
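(Aside for the quantitatively inclined: if you want to make this sort of plot yourself, from my spreadsheet or your own compilation, a rough Python sketch follows. The file name, column names, and ordering of Carnegie classes are my own placeholders and assumptions, not necessarily how any actual spreadsheet is laid out.)

```python
import pandas as pd
import matplotlib.pyplot as plt

# Placeholder file and column names; adjust to match whatever data you have.
df = pd.read_csv("new_hires_2016_2018.csv")

# Assumed ordering from least to most research-intensive, with the catch-all last.
order = ["BC", "M3", "M2", "M1", "R3", "R2", "R1", "Other"]
groups = [df.loc[df["carnegie_class"] == c, "h_index"].dropna() for c in order]

plt.boxplot(groups, labels=order)
plt.xlabel("Carnegie class of hiring institution")
plt.ylabel("Google Scholar h index at time of hiring")
plt.show()
```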

A few comments:

  • There’s a lot of variation in research productivity among recently-hired tenure-track N. American ecology faculty. Which means that, once you’ve published a handful of papers, you’re competitive for a faculty position (depending of course on how good those papers were, the rest of your cv, your reference letters, your fit to the position, who else applies, etc.).
  • Don’t be shocked by how high some of these h-indices are, and don’t leap to any broad conclusions about that. Some of the really high h-indices in this dataset are from people working at the interface of ecology and other fields like physiology in which publication practices, and thus typical h index values, differ from those in ecology. And don’t leap to the conclusion that many people in this dataset must’ve spent many years as postdocs or soft money professors to accumulate such high h indices. See here for data on how long recent hires spent as postdocs. 3-4 years is typical; spending >5 years is rare (though not as rare as I thought when I first wrote this post; the post has been edited to reflect the data in that last link).
  • There’s lots of overlap in the distributions of h-indices for recent hires at different types of institution. It is not the case that research universities only hire people with sky-high h indices, or that bachelor’s institutions only hire people with low h indices.
  • There is a clear trend for more research-intensive institutions to hire ecologists with higher h indices, on average. The Spearman rank correlation between institutional research intensiveness (Carnegie class) and h index of 2016-17 tenure-track ecology asst. prof. hires is 0.36, which is highly significant (P<10^-4). UPDATE: I haven’t done this calculation for the combined dataset, but I can tell you the correlation would remain positive and highly statistically significant. (If you want to redo the calculation yourself, there’s a minimal code sketch just after this list.) The distributions for different institution types overlap a lot, but it’s mostly overlap of the upper halves of the distributions for less research-intensive institutions with the lower halves of the distributions for the more research-intensive institutions. No doubt that’s a function of both where different people choose to apply, and how different institutions choose among their applicants.
  • The previous bullet is relevant to a recent claim that Canada’s NSERC Discovery Grant program is biased against applicants from smaller and less research-intensive institutions (Murray et al. 2016). Murray et al. inferred bias against applicants from small institutions from several lines of evidence. One was that first-time applicants from small institutions have lower success rates and receive smaller grants on average than those from large institutions. Murray et al. say that this indicates bias against new investigators from small institutions because they assume that:

[G]iven the contemporary job market and the glut of PhDs [30, 31], it is unlikely that small schools systematically hire weaker researchers to fill tenure-track positions, or that elite ECRs [early career researchers] accept offers only from large institutions.

NSERC Discovery Grant applications are evaluated in part on the applicant’s track record of research productivity over the preceding 6 years (it’s 1/3 of the grant score). Tenure-track ecologists recently hired by more research-intensive N. American institutions do have stronger research track records on average than those recently hired by less research-intensive institutions, at least by one admittedly-crude measure. I presume the same would be true if you looked at other fields, or restricted attention to Canada, though of course I don’t have data on that. The faculty job market does indeed have many PhDs chasing few faculty jobs–but those PhDs have widely-varying research track records, and the variation in their track records is correlated with the research intensiveness of the hiring institution. This doesn’t mean that anybody’s a “weak” researcher; researchers’ track records vary for all sorts of reasons. I emphasize that the data here only speak to one argument of Murray et al. My point here is deliberately narrow.
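As promised above, here is a minimal sketch of how one could redo that Spearman rank correlation in Python with scipy. The ordinal coding of Carnegie classes and the toy numbers are mine, purely for illustration; the real calculation would of course use the actual hires.

```python
from scipy.stats import spearmanr

# Assumed ordinal coding of research intensiveness (my coding, not Carnegie's).
rank_of = {"BC": 1, "M3": 2, "M2": 3, "M1": 4, "R3": 5, "R2": 6, "R1": 7}

# Toy data for illustration only; not the real hires.
carnegie_class = ["BC", "M1", "R1", "R2", "R1", "M2", "R3", "BC", "R1", "M1"]
h_indices = [3, 6, 12, 9, 15, 5, 8, 4, 11, 7]

rho, p = spearmanr([rank_of[c] for c in carnegie_class], h_indices)
print(f"Spearman rho = {rho:.2f}, P = {p:.4f}")
```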

17 thoughts on “A (crude) statistical profile of the research productivity of recently-hired N. American ecology asst. profs (UPDATED with additional data)”

  1. Hmm.. I feel like h-index just isn’t a good way of getting at research productivity relative to hiring choices. As you note, it gives equal weight regardless of authorship position, which I don’t think is how hiring committees weight papers on CVs (i.e. they tend to value the number and quality of first author papers more highly).
    Have you tried also analyzing this data using one of the other h-index type measures that weight or normalize by total number of authors, etc? Would be an interesting comparison.

    • “Have you tried also analyzing this data using one of the other h-index type measures that weight or normalize by total number of authors, etc? ”

      No, because that would be work. 🙂

      I think if you wanted to improve on the crudity of this analysis, you’d want to look at the journals people are publishing in. You’re right that hiring committees tend not to give much weight to being a middle author on a many-author paper. But I suspect, but don’t know, that hires at more and less research-intensive institutions are more likely to be differentiated by where they tend to publish than by the number of papers they’ve first-authored or their average # of co-authors per paper.
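      (For anyone who does feel like doing that work: one of the simplest author-normalized variants, roughly Schreiber’s hm index, counts each paper as 1/(number of authors) on the “rank” side of the usual h-index rule. A quick sketch in Python, assuming you have a citation count and an author count for each paper:)

      ```python
      def hm_index(papers):
          """Author-normalized h-index variant (roughly Schreiber's hm index):
          each paper contributes 1/(number of authors) to the effective rank."""
          ranked = sorted(papers, key=lambda p: p[0], reverse=True)  # by citations
          hm = eff_rank = 0.0
          for citations, n_authors in ranked:
              eff_rank += 1.0 / n_authors
              if citations >= eff_rank:
                  hm = eff_rank
              else:
                  break
          return hm

      # Made-up example: identical citation counts, sole-authored vs. 8-author papers.
      solo = [(40, 1), (25, 1), (10, 1), (6, 1), (3, 1)]
      team = [(40, 8), (25, 8), (10, 8), (6, 8), (3, 8)]
      print(hm_index(solo), hm_index(team))  # 4.0 vs. 0.625; the team record scores far lower
      ```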

  2. The R1 vs. non-R1 graph is interesting, but based on the second graph there really doesn’t seem to be a difference between R1 and R2 (or even R3)… So any kind of ‘R’ university selects for a slightly higher h index, not just the R1, as suggested by the first graph.

    • Yes. The first graph is there because I wanted to show all the data I had. But as I said in the post, last year I didn’t classify hiring institutions beyond R1 vs. non-R1, and I’m too lazy to go back and do so.

      Possibly, if one had multiple years of finely-classified data, smaller differences among R1 vs R2 vs R3 institutions might show up. Or not.

      The data from both years are online; there’s a link to the Google Docs spreadsheet in that previous post I linked to. So someone more motivated than me could go back and sort the 2015-16 data into all the Carnegie categories.

  3. Question: knowing very little about the Canadian system, is the “small schools” category in Murray et al. more like R3 schools, or more like M2-3 and B schools? If the former, your analysis supports their assumption. If the latter, it doesn’t (of course with the caveat of crudeness).

    I’m also curious about Murray et al. defining ECRs as “researchers in the first two years of their first postsecondary position”. Not exactly sure what that means, but if it means 2 years post-PhD then that is very early. Potentially far earlier than the average hired ecologist in your data. How old (years post-PhD) is the typical N. American tenure-track hire?

    • Yes, Carnegie doesn’t classify Canadian institutions, so there’s no exact mapping. I can tell you that Canada doesn’t really have many 4-year institutions equivalent to US liberal arts colleges, so there won’t be many (or any?) NSERC DG applicants from schools that lack any graduate programs whatsoever.

      Murray et al. mean people in the first two years of their first faculty position–so the first two years of eligibility to apply for an NSERC Discovery Grant.

  4. The thing with the h-index (and most metrics, unfortunately) is that it gives a paper the same weight regardless of the role an author played in it.
    An h of 10 obtained from many papers with large author lists (e.g. from a working group), with you as a middle author on most of them, is pretty different from an h of 10 built on 10 first-author papers…

  5. Pingback: When did newly-hired N. American tenure-track ecology faculty get their PhDs? | Dynamic Ecology

  6. Pingback: What proportion of recently-hired tenure-track N. American asst. professors of ecology have Nature/Science/PNAS papers? | Dynamic Ecology

  7. Pingback: How many first-authored papers in “leading” journals does an ecologist need to be hired as a tenure-track asst. prof at an R1 university? Not nearly as many as most ecologists think. | Dynamic Ecology

  8. Pingback: Useful links related to tenure track job searches in ecology (last update Nov. 2018) | Dynamic Ecology

  9. Pingback: When, and why, the ecology faculty job market first got so competitive | Dynamic Ecology

  10. Pingback: How productive a researcher do you have to be to be competitive for a TT ecology faculty position in the US or Canada? Here are the data. | Dynamic Ecology
