Also this week: more academic “quit lit”, how to write the introduction to your next paper in one handy chart, and more.
I’m breaking my blogging break in excitement over Lego announcing that they will soon have a female scientist series! (Less good news on the women-in-science front came from ESA last week, where only 1 of the 12 new fellows is a woman.)
Here’s a piece from the Chronicle of Higher Education on the need for gay mentors in modern academia. (ht: Alex Bond)
Here’s a post from Lada Adamic who was, until recently, in the University of Michigan School of Information. The post, “Why I left my academic job”, is long, but broken into subsections and interesting. (I also enjoyed her post of parenting tips from a lazy parent – I use many of the same strategies!) (ht: Titus Brown)
And, finally, here’s an interesting citizen science project related to fireflies. The project involves using an app to help measure firefly populations around the world. (ht: Skip van Bloem)
Caroline Tucker of The EEB and Flow wins the internet this week with her flowchart for how to write the introduction to an ecology paper.
Dan Kahan on why survey data on how many people “believe” in evolution don’t tell you squat about scientific literacy. Professed “belief” in evolution doesn’t predict people’s knowledge of basic evolutionary facts and concepts, or their ability to acquire such knowledge. Nor is it a good index of “science literacy”. Asking people whether they believe that humans descended from other species (or whatever) basically is just a way of asking people about their religious identities. (ht Ed Yong)
Turns out that you can predict whether someone will eventually become a PI (well, for a crude-but-probably-fairly-workable operational definition of “PI”) by looking at just a few variables early in their career: how many papers they’ve published, the impact factors of the journals in which they publish, how often their papers are cited relative to the average for the journals in which they appeared, where they went to grad school, and (sadly) their gender. I’m linking to this because it’d probably look like I was living in a cave if I didn’t (it was all over the science intertubes this week), but I’m not sure why this is news. Is anyone surprised that people who publish lots of highly-cited papers in top journals are more likely to get faculty jobs? And for anyone who thinks that these data somehow prove that scientists are not judged by the quality of their work, and that search committees just lazily look at journal impact factors, publication counts, and citation data to decide who to hire: um, no. Lots of things are correlated with who gets hired, for all sorts of reasons (some of them perfectly good reasons). But those reasons often do not include “search committees directly evaluating candidates on the basis of those things”. Correlation is not causation, people! (Aside: the authors have put up a web app that lets you calculate your own odds of becoming a PI. But because the authors’ work is based on the PubMed database, it misses many ecology publications, and so the probabilities that their web app spits out probably don’t apply to ecologists.) (ht @hormiga, and Jan via email)
Against the internet. Interesting, and more thoughtful than this sort of thing usually is. (ht Marginal Revolution)
Hoisted from the comments:
In the comments on my post on the greatest ecology and evolution dissertations ever, Meg passes on a great story about David Sloan Wilson’s dissertation:
David Sloan Wilson’s thesis is legendary at MSU and especially at KBS. It is held up as the model of quality not quantity (or, at least, it was when I was there). Don Hall, who was his dissertation advisor, said that it was initially rejected by the grad school because it was too short to bind, and a bound copy of the dissertation was required; Don asked them how many blank pages he should send over to them to include with it so they could bind it.
Jeremy, as a follow-up to your link on predicting who becomes a PI, there was another “we can predict who will become a star down the road” paper not too long ago, also in Bioscience (Laurance et al., Predicting Publication Success for Biologists). I link to the original paper in my criticisms of their analyses:
I find attempts at this kind of predictive analysis a real concern because they are often 1) poorly done or 2) fail to consider the issues you raised in your search committee post.
Cheers for this, Emilio; I’ll give it a shout-out next Friday. And I think you’ve prompted me to maybe do a broader post on how we tend to oversimplify the causality of scientific influence (which of course is connected to career progress).