Dan Bolnick just had a really important – and, yes, brave – post on finding an error in a published study of his that has led him to retract that study. (The retraction isn’t official yet.) In his post, he does a great job of explaining how the mistake happened (a coding error in R), how he found it (someone tried to recreate his analysis and was unsuccessful), what it means for the analysis (what he thought was a weak trend is actually a nonexistent trend), and what he learned from it (among other things, that it’s important to own up to one’s failures, and that there are risks in using custom code to analyze data).
This is a topic I’ve thought about a lot, largely because I had to correct a paper. It was the most stressful episode of my academic career. During that period, my anxiety was as high as it has ever been. In the past, a few people have suggested I write a blog post about it, but it still felt too raw – just thinking about it was enough to cause an anxiety surge. So, I was a little surprised when my first reaction to reading Dan’s post was that maybe now is the time to write about my similar experience. When Brian wrote a post last year on corrections and retractions in ecology (noting that mistakes will inevitably happen because science is done by humans and humans make mistakes), I still felt like I couldn’t write about it. But now I think I can. Dan and Brian are correct that it’s important to own up to our failures, even though it’s hard. Even though correcting the record is exactly how science is supposed to work (and I did correct the paper as soon as I discovered the error), it still is something that is very hard for me to talk about.
Also this week: on identity politics within and outside ecology, the running conversation in your head, long-term illnesses vs. graduate students, data science vs. rogue train, why realized niches are “non-interesting”, hedgehogs > foxes, and more. Lots of good stuff this week, including some changes of pace from our usual linkfest fare.
Most recent update: 30 Nov. 2016
Recently I decided to quantify the gender balance of recently-hired ecology faculty in North America. “Recent” being operationally defined as “hired in 2015-16, or in a few cases in 2014”. Data on the gender balance of faculty is widely available only at the level of very broadly-defined fields like “biology”. Current faculty gender balance mostly (not entirely) reflects the long-term legacy of past hiring and tenure practice rather than current hiring practice (Shaw and Stanton 2012; ht Shan Kothari via the comments). And nobody’s anecdotal experience informs them about the outcomes of more than a tiny fraction of all ecology searches in any given year. So this seemed like a topic on which many people would welcome some reasonably comprehensive data. Follow the link for more details on how I compiled the data. In that old post, I also conducted a poll asking readers what they expected me to find.
Here are the answers: what fraction of recently-hired North American ecologists are women, and what do ecologists think that fraction is?
Many of you are going to be pleasantly surprised…
You’ve got mail. Lots of it, especially if you’re a faculty member. And it’s overwhelming. Those were some of the results of the email poll I did recently. I wrote the post because I am often overwhelmed by email. I was curious to know if others were, too. (My guess was yes.) I was also hoping that someone would have magically figured out how to make the email problem go away. Sadly, there doesn’t seem to be a magical solution, but there were some useful tips. In this post, I’ll first give the results of the poll, which I think were interesting. Then I’ll get to some of the suggestions that came in on the blog and via twitter.
In the poll, I asked:
- How many work-related emails are in your inbox now?
- What is your goal for the number of work-related emails you aim to have in your inbox?
- How often do you feel overwhelmed by email?
and then asked for information on the respondent’s current position and age. (I was originally also planning on asking about gender, because I thought it would be interesting to see if there was a difference, but I forgot when I set up the poll. Whoops.)
Before getting to the poll results, a little more on the data, code, and analyses: If you’re interested in the full data set and/or the code I used to analyze it, those are available here. I especially want to focus on the cross-factor analyses, which I think are the most interesting. These rely on the likert R package by Jason Bryer, which I first learned about from Rayna Harris. It makes really cool figures for this sort of data!
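For readers who haven’t used it, here’s a minimal sketch of how the likert package gets used for data like these. The data frame and response categories below are made up for illustration – they are not the actual poll data (which, again, are available at the link above):

```r
library(likert)

# Hypothetical example data: one column per survey question, with responses
# stored as factors whose levels are in the intended order.
lvls <- c("Never", "Sometimes", "Often", "Always")
set.seed(1)
responses <- data.frame(
  "How often do you feel overwhelmed by email?" =
    factor(sample(lvls, 100, replace = TRUE), levels = lvls),
  check.names = FALSE
)

# likert() tabulates the percentage of respondents in each category...
results <- likert(responses)
summary(results)

# ...and plot() draws the centered stacked bar chart the package is known for.
plot(results)
```

The key step is making sure each column is a factor with its levels in the right order; otherwise the categories get plotted alphabetically rather than from “Never” to “Always”.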
Now, the results:
Also this week: Abraham Lincoln vs. confidence intervals, a double-blind review experiment, myths about applying for faculty positions, and more.
Celebrate with this primer on the ecology of wild turkeys.
Also, reupping this (click the image for the source):
Although I’m having turducken.🙂
Following up on my recent post noting that in some social science fields, including economics, faculty hiring places heavy (though far from exclusive) weight on one “job market” paper, here are some other aspects of how faculty hiring works in economics. Tweets from @LauraEllenDee were part of my inspiration, and comments on that previous post were a big help too (have I mentioned lately how much I love our commenters?)
I find it interesting to think about which, if any, of these formal and informal practices could or should be adopted in ecology and other scientific fields (even though I think current practices in ecology are mostly reasonable). Learning about how things work in other fields stops you from taking things for granted* and helps you imagine how things could work in your own field. It also gives you a more realistic sense of what any reforms in your own field might achieve. Learning about how things work in other fields both helps you dream and keeps you grounded.
One challenge in thinking about this is that to some extent these alternative clusters of practices may be “package deals”. You can’t always pick and choose, at least not very easily, because any one practice might well be undesirable or unworkable in isolation from other practices.
So here are some other hiring practices in economics (follow that link for the post from which I’ve gotten much of my information. See also.) This is obviously a broad-brush picture and I’m sure I haven’t gotten all the details right; comments welcome. If all you know about is hiring practices in ecology, get ready to enter the Twilight Zone. A world like ours in many respects, but weirdly different in others…🙂
Also this week: the pluses and minuses of preregistering your research, does more pressure to publish really make publication bias worse, the first “man on the street”, and more.
If you didn’t know, in economics and political science, people are hired for faculty positions based in large part on their “job market paper”. As in, one paper, ordinarily from their Ph.D. work and often not even published yet. Number of publications matters relatively little (though apparently it matters more in political science than in economics). Economics even has a centralized repository of job market papers; that’s how much they matter.
I am curious to hear what you think of this, and whether you think this approach or something like it could be an improvement on current practices in ecology. Personally, I think current faculty hiring practices in ecology are mostly pretty reasonable (see also), and so don’t think this would be a net improvement on current practices in ecology. But I think it’s not so obviously a bad idea as to be uninteresting to think about. I find it useful to think about the practices of other fields and whether they’d transfer to ecology. It helps me look at standard practice in ecology with fresh eyes. A few thoughts to get the ball rolling:
Note from Jeremy: this is a guest post from my friend Greg Crowther. Who among other things has been a biochemist, and an instructor in various biology courses including ecology. He’s an unusually thoughtful and creative teacher, for instance using songs to teach anatomy and physiology. Oh, and he has three papers in Annals of Improbable Research (e.g.), which is like the science humor equivalent of having three Nature papers. Thanks to Greg for writing us a guest post on a handy teaching tip.
Most people who think hard about how to teach well accept that students should engage in “active learning,” which has been defined (by Freeman et al. 2014) as follows: “Active learning engages students in the process of learning through activities and/or discussion in class, as opposed to passively listening to an expert. It emphasizes higher-order thinking and often involves group work.”
Sounds good, right? In general, it is good. I enjoy challenging students with hard problems and helping them find their way toward an answer, and they are usually glad to be moving and talking, especially if the problems resemble ones they’ll encounter on tests.
Active learning is relatively easy to include in teaching about a specific research study. For example, after providing some appropriate context, one can simply work through the figures by asking students how and why the data in each figure were collected and what they mean (Round & Campbell 2013).
When teaching basic conceptual material, though, I slip into straight-up lecture mode more often than I’d like. It can be very time-consuming to add nontrivial interactivity to coverage of this material.
However, I do have one fall-back strategy for quickly turning a traditional lecture slide into a mini-discussion. I call this approach the “Dissection of the Imperfect Analogy.” Here’s how it works.