Scientists are reading less than they used to! Well, unless you take the median, in which case they’re reading the same amount as before, and you have to rewrite your Nature news piece. ht: Liz Neeley
Here’s a post written by a woman at Harvard, Mia You, about her frustration with the Harvard library policy of not allowing children into the library. The good news is that the official policy was revised within days of the post appearing.
In the sad but too true category: how to kill a paper when reviewing (the first comment, containing a rebuttal letter to the editor, is even better) HT @calestous
A case study of the importance of best practice in experimental design: Studies of nestmate recognition in ants often compare intra- vs. inter-colony aggression, with the expectation that intra-colony aggression should be low or absent. And that’s indeed what most studies find–but in large part because of confirmation bias. A new meta-analysis shows that the minority of studies that have been conducted blind find much higher rates of intra-colony aggression, and much smaller differences between intra- and inter-colony aggression. In other words, researchers observing behaviors that are difficult to classify tend to see what they expect to see, even if it’s not actually there. I’d be curious to know the range of reactions to this among the researchers doing this sort of work. How many are nodding their heads and going “Yup, that’s why we always use blinding,” how many are going “Wow, maybe I’d better start blinding”, how many are going “Well, this just shows what happens if you observe animal behavior without sufficient experience and training”, how many are going “Ok, but that just shows behavioral studies are hard, you can’t expect them all to be perfect,” and how many are going “Meh, even non-blind studies still get the direction of the effect right on average, so this is just nitpickers carping about trivialities”? (ht Florian Hartig, who has additional discussion, and suggests using this study as a teaching tool for undergraduate experimental design courses. A suggestion I plan to take up!) (UPDATE #2: I had forgotten this: in an old post I asked if ecologists should do blinded data analyses! Apparently the answer is “yes”.)
A long-running survey (running since 1977) reports that US scientists may have reached “peak reading”. Previously, the trend had been for scientists to report reading more papers than before, but spending less time on each of them. That trend seems to have peaked. In any case, “power browsing”–skimming quickly, just looking to “get the gist” or some useful snippet of information–clearly is the new normal. Which is one big reason why post-publication “review” is mostly a non-starter. As I’ve said before, the only time people read like reviewers is when they’re acting as reviewers. There are exceptions (the arsenic life thing, for instance), but they’re just that–exceptions.
Terry McGlynn responds to my post on natural history vs. ecology. You should click through; he has a lot to say, and as always his thoughts are worth reading and very readable. The very short version is that he thinks that academic training in ecology overemphasizes academic job skills and tasks (analyzing your data, writing your next paper, getting your next grant…) at the expense of skills that make no immediate, obvious contribution towards getting an academic job. And that “knowing the natural history of study systems other than one’s own” is one of those skills without obvious job market value. His broad point is an important one that goes well beyond natural history, since knowing the natural history of study systems other than one’s own is only one example of skills, activities, or knowledge that have no immediate, obvious professional payoff (“blogging” is another, and there are many more…). The comment thread over there is good too–scroll down for a thoughtful and wonderfully-phrased comment from Dan Janzen.
You’re never too young to write a major synthesis paper. Indeed, arguably you can even be younger than the author of that post. In his famous “modest advice” to grad students, Steve Stearns suggests that a good thesis proposal should be publishable as a critical review paper.
A radical proposal (UPDATE: link fixed) to reform granting bodies: give every qualified scientist a block grant for the same amount, with the requirement that they give some fixed fraction of it away to another qualified scientist. Discuss! [grabs popcorn]
Videos of the talks from BAHFest, the festival of Bad Ad-hoc Hypotheses about evolution. The idea is to present interesting-sounding, seemingly well-grounded evolutionary hypotheses that nevertheless are obvious rubbish. A fun and interesting way to try to teach the general public (and evolutionary psychologists?) to be more discriminating. Maybe also a good teaching tool for classes on scientific methods and study design? I haven’t actually watched the videos yet, but I’m looking forward to checking out the one on adaptive infant aerodynamics. 🙂 (ht Ed Yong)
And finally, peanut butter and jelly…fish (ht Ed Yong)