In the course of my study of the gender balance of recently hired tenure-track asst. professors of ecology (and allied fields) in N. America, I also compiled data on the Google Scholar h indices of the new hires. I did the same last year, for a haphazard selection of the new hires; those data are summarized briefly here. Here’s a summary of the combined dataset of all 218 recent hires who have Google Scholar pages, along with a few comments.
I’ll emphasize right up front that the h index is an extremely crude summary measure of research productivity. Perhaps its biggest limitation is that it gives an individual the same credit for a sole-authored paper as for being one middle author among many. For this reason and others, I highly doubt that most faculty searches actually involve looking at applicants’ h indices, though some searches might look at other things that are loosely correlated with applicants’ h indices (e.g., whether the applicant has papers in leading journals). My only goal in this post is to give a very rough sense of what level of research productivity is required to be competitive for a tenure-track faculty position in ecology at different sorts of N. American institutions.
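For readers unfamiliar with how the h index is actually calculated, here’s a minimal sketch (my own illustration; the citation counts are made up, not drawn from the hiring data):

```python
def h_index(citations):
    """Return the h index: the largest h such that the researcher
    has at least h papers each cited at least h times."""
    cites = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cites, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Five papers with these citation counts give h = 3:
# three papers are each cited at least 3 times.
print(h_index([10, 8, 5, 2, 1]))  # prints 3
```

Note that each paper counts the same regardless of the researcher’s authorship position, which is exactly the limitation noted above.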
Note as well that many recent hires don’t have Google Scholar pages. That’s especially true for recent hires at smaller and less research-intensive institutions. People without Google Scholar pages likely tend to have lower h indices than people with Google Scholar pages. And as you’ll see, recent hires at less research-intensive institutions tend to have lower h indices than recent hires at more research-intensive institutions. So my data likely are an upwardly-biased estimate of the typical h indices of recent ecology hires.
Here’s a boxplot of the h indices of all 218 recent hires (as of Nov. 2016 for last year’s hires, as of Sept. 2017 for this year’s), split by institution type (100 R1 hires, 118 non-R1 hires):
Next is a boxplot for just the 124 2016-17 hires, broken down by Carnegie class (I didn’t break last year’s data down by Carnegie class beyond R1 vs. non-R1, and I’m too lazy to go back and do so). Institution types are in increasing order of research intensiveness:
A few comments:
- There’s a lot of variation in research productivity among recently-hired tenure-track N. American ecology faculty. That means that, once you’ve published a handful of papers, you’re competitive for a faculty position (depending of course on how good those papers were, the rest of your cv, your reference letters, your fit to the position, who else applies, etc.).
- Don’t be shocked by how high some of these h indices are, and don’t leap to any broad conclusions about that. Some of the really high h indices in this dataset are from people working at the interface of ecology and other fields like physiology, in which publication practices, and thus typical h index values, differ from those in ecology. And don’t leap to the conclusion that many people in this dataset must’ve spent many years as postdocs or soft-money professors to accumulate such high h indices. I didn’t compile data on this, but as I was IDing recent hires I found only a very few who’d spent 5 or more years as postdocs or in soft-money faculty positions. People who spent 1-2 years as postdocs before getting hired were common, including at R1s. (UPDATE: See here for data on how long 2016-17 hires spent as postdocs. 3-4 years is typical; spending 5 years or more is rare but not as rare as I thought when I first wrote this post.)
- There’s lots of overlap in the distributions of h-indices for recent hires at different types of institution. It is not the case that R1 universities only hire people with sky-high h indices, or that bachelor’s institutions only hire people with low h indices.
- There is a clear trend for more research-intensive institutions to hire ecologists with higher h indices, on average. The Spearman rank correlation between institutional research intensiveness (Carnegie class) and h index of 2016-17 tenure-track ecology asst. prof. hires is 0.36, which is highly significant (P<10^-4). The distributions for different institution types overlap a lot, but it’s mostly overlap of the upper halves of the distributions for less research-intensive institutions with the lower halves of the distributions for the more research-intensive institutions. I identified only 5 hires at bachelor’s or master’s institutions with h indices as high as that of the average hire at a PhD-granting institution in 2016-17. No doubt that’s a function of both where different people choose to apply, and how different institutions choose among their applicants.
- The previous bullet is relevant to a recent claim that Canada’s NSERC Discovery Grant program is biased against applicants from smaller and less research-intensive institutions (Murray et al. 2016). Murray et al. inferred bias against applicants from small institutions from several lines of evidence. One was that first-time applicants from small institutions have lower success rates and receive smaller grants on average than those from large institutions. Murray et al. say that this indicates bias against new investigators from small institutions because they assume that:
[G]iven the contemporary job market and the glut of PhDs [30, 31], it is unlikely that small schools systematically hire weaker researchers to fill tenure-track positions, or that elite ECRs [early career researchers] accept offers only from large institutions.
NSERC Discovery Grant applications are evaluated in part on the applicant’s track record of research productivity over the preceding 6 years (it’s 1/3 of the grant score). Tenure-track ecologists recently hired by more research-intensive N. American institutions do have stronger research track records on average than those recently hired by less research-intensive institutions, at least by one admittedly-crude measure. I presume the same would be true if you looked at other fields, or restricted attention to Canada, though of course I don’t have data on that. The faculty job market does indeed have many PhDs chasing few faculty jobs, but those PhDs have widely varying research track records, and the variation in their track records is correlated with the research intensiveness of the hiring institution. This doesn’t mean that anybody’s a “weak” researcher; researchers’ track records vary for all sorts of reasons. I emphasize that the data here only speak to one argument of Murray et al. My point here is deliberately narrow.
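For the curious, the Spearman rank correlation reported above is just the Pearson correlation of the two variables’ ranks, with tied values sharing the average of the ranks they occupy. Here’s a self-contained sketch; the data below are made up purely for illustration (Carnegie class coded ordinally by research intensiveness), not the actual hiring data:

```python
def average_ranks(xs):
    """Rank values 1..n, giving tied values the average of the
    ranks they occupy (the convention Spearman's rho uses)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(xs) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(xs, ys):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = average_ranks(xs), average_ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Hypothetical example: 1 = bachelor's ... 5 = R1.
carnegie = [1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
h_vals   = [3, 6, 4, 8, 5, 9, 7, 12, 10, 15]
print(f"rho = {spearman_rho(carnegie, h_vals):.2f}")  # rho = 0.79
```

The significance test behind the reported P<10^-4 isn’t shown here; that came from the full 2016-17 dataset, not this toy example.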