Also this week: the mission (impossible) of public research universities, who cares if the evidence for X is “growing”, journal editor pet peeves, timely advice for grad school recruitment visits, and more.
This is a fantastic fake article on tearoom culture and cleanliness, purported to be by Lotta Washinup and Duya Dishez, but actually by Jennifer Upchurch (@jamandcrumpets). She has posted the pdf in case you need to do a little passive aggressive prodding of fellow tea/break/coffee room users to encourage neatness. :)
Psych journal bans p-values. (ht: Tuomas Aivelo) (Jeremy adds: Hey, that was my link! [grabs link from Meg] See below for my comments.)
Eugenie Clark, an expert on sharks, has died at the age of 92. She did her last dive when she was 88, which is mind-blowing to me!
UPDATE: Just found this one and it’s so good I couldn’t wait until next week to share it. Loretta Jackson-Hayes nails it: why liberal arts training is so valuable for scientists.
Scitrigrrl with a nice post on how her third year on the tenure track is kicking her butt–and what she’s doing about it.
Current ecology grad students on questions they wish they’d asked during grad school recruitment visits. Glad they have some advice, because I don’t have any. I visited prospective supervisors individually; their departments didn’t have recruitment weekends as far as I know. I hadn’t realized until recently just how unusual this makes me.
The journal Nature is offering authors the option of double blind peer review. Hard to say how much difference it will make, since it's an optional policy rather than a randomized controlled experiment. And since it's optional, the mere fact that an author has chosen it might provide some information to reviewers as to the author's identity or attributes.
Speaking of peer review, Brian Enquist (an editor at a couple of ecology journals) has some good advice for reviewers in the form of “editors’ pet peeves”.
Scientists often write that a "small but growing body of evidence" suggests X. I've probably done it myself at some point. But as Andrew Gelman points out, it's odd phrasing. Shouldn't we only care about the current state of the evidence, not about whether there's more or less evidence than there used to be? And if we do care about the rate of change in the amount of evidence, how come you never hear anyone refer to the "large but shrinking body of evidence" for X, or the "small and unchanging body of evidence" for X, and so on? I think Andrew's right to be suspicious of the phrase–it often just functions as a way of making the existing evidence for X sound more compelling than it is.
In what sense is undergraduate teaching “central to the mission” of flagship public research universities in the US? And if you wanted to make it more central, what would be the most feasible way to do it? Of course, given that the “flagship public research universities” are a tiny fraction of all institutions of higher education in the US, and that different institutions have different missions, I don’t know that we should want to greatly adjust the mission of flagship public research universities.
Rich Lenski continues to answer my questions about his long-term evolution experiment. Thanks Rich!
As I feared, people are waaaay overinterpreting results from text mining Rate My Professor reviews (warning: it’s clickbait, don’t bother). Mathematician Courtney Gibbons reminds you to stop and think for two frickin’ seconds before you text mine Rate My Professor for anything other than entertainment purposes.
High-achieving, low-income high school students often have misconceptions that prevent them from applying to selective colleges that would give them generous financial aid and offer the curricula and peers they seek. (ht Brad DeLong)
This week in Huh?: a social psychology journal is banning inferential statistics. As in, all inferential statistics–any attempt to make an inference from sample to population. Which leaves it hilariously unclear why they also will be expecting authors to have larger sample sizes than in the past. Click through in the same spirit that you would rubberneck a car crash. On the other hand, I suppose you could argue that it doesn’t do much harm for a single specialized journal to undertake a radical experiment (better them than Science). There’s value in crazy experiments that aren’t likely to work. On the third hand, there is such a thing as too crazy. And it’s not as if the world lacks for debate about appropriate statistical practice, so I don’t think this experiment has much value even just as a way to prompt discussion. Anyway, the first effect of the new policy has been to cause the entire internet to badger Andrew Gelman for his opinion. If this was trolling, it sure worked! (ht Stephen Heard)
A while back I was invited by a colleague to submit a paper to a special feature in Frontiers in Microbiology. At the time, Frontiers was a newish open access publisher I didn’t know anything about. Based on my mixed experience with them, I wasn’t sure whether to publish with them again. Now I’m sure I wouldn’t, and in retrospect I regret doing so. They’re a poor operation at best. For instance, they’re happy to make money off HIV denialists and to have editors double as peer reviewers. Further troubling examples in the post and comments here. Yes, these are only anecdotes and I’m sure there are negative anecdotes about all publishers. But with Frontiers, the nature and frequency of the negative anecdotes makes me uncomfortable. And Frontiers is old enough now that “growing pains” and “they’re experimenting and trying new things” are no longer excuses, if they ever were. I wouldn’t go so far as to call them a predatory publisher, and I recognize that some reputable academics have had positive experiences with them. But given that Plos One, Ecology and Evolution, and other reputable open access journals focused primarily on technical soundness exist, I don’t see any reason to consider publishing in Frontiers. Just my two cents.
#hipsterscience. “I use Excel for graphs, but I do it ironically.” :-)
And finally: how to get people to open boring work-related emails. Too bad this trick only works once (ht Brad DeLong). :-)
Hoisted from the comments:
Jeff Ollerton with some sensible advice to anyone just starting out as a blogger: it’s going to take months (at least) to build an audience, even if you post lots of good stuff.