Last year, I wrote a blog post about a piece that had appeared in Nature Biotechnology related to graduate student mental health. There were two big problems: first, Nature Biotechnology had not checked whether there had been IRB oversight of the study before publishing it, which is a huge ethical problem. Second, the major result (that grad students experience anxiety and depression at more than 6x the rate of the general population) was not valid — they used an apples to oranges comparison to get that statistic. Unfortunately, that inaccurate statistic has dominated the discourse on graduate student mental health since it appeared.
In addition to writing a blog post, I worked with two behavioral scientists, Carly Thanhouser and Holly Derry, to write a formal response to the Evans et al. study. We submitted it on May 17, 2018. On April 5, 2019, we finally heard back about our submission. It had been peer reviewed (unlike the original Evans et al. submission) and accepted. On April 17, I uploaded the final version and the paperwork. Ever since, the manuscript (which, remember, has already been accepted) has remained listed in their manuscript system as “under consideration”. No one at the journal office will explain what is going on, despite multiple emails (including one to the Editor in Chief on May 15th).
Here, I am going to explain why I have devoted so much time and energy to this (frustrating!) process over the past year. I care a lot about graduate student mental health, so it might seem weird that I’ve spent so much time trying to point out that we don’t have evidence that grad students experience depression & anxiety at 6x the rate of the general population. To explain why, I need to briefly introduce the idea of anchoring. And, to do that, I’m going to tell you a story.
When I was in college, I did some fundraising for a club that I was part of. We were given a list of people to call, as well as information on how much they had donated in the past. We were then told to ask for much more — I don’t remember the exact recommendation, but it was something like 3x as much as their highest previous donation. If they said no to that, we could drop to a lower number (say, 2x their previous high). By introducing that much higher number (3x their previous high donation), we set it as the reference point, so a lower-but-still-high number (2x their previous high donation) no longer seemed as high. People were much more likely to agree to give more than they had in the past when we set that higher number. (I always felt extremely icky doing that and basically never followed the script.)
What I was being told to do, without realizing it, was use something called anchoring. Anchoring is well known in behavioral economics & psychology. It’s a “form of priming effect whereby initial exposure to a number serves as a reference point and influences subsequent judgments.” (source of that quote)
Back to the Evans et al. piece: the reason it zoomed across twitter and got very widespread coverage was the 6x greater risk number. But, as we explain in our response, that number is not valid — they made an apples to oranges comparison to get it. Evans and colleagues do briefly tip their hat to this problem in their article, saying “Although this is a convenience sample in which respondents who have had a history of anxiety or depression may have been more apt to respond to the survey…”. But they then plow ahead and, under a heading that says “Mental health crisis in the graduate student population”, write “Our results show that graduate students are more than six times as likely to experience depression and anxiety as compared to the general population.”
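To see how this kind of apples to oranges comparison can inflate a risk ratio, here’s a back-of-the-envelope sketch. Every number in it is hypothetical — none of them come from the Evans et al. data — it just illustrates the mechanism their own caveat names: people dealing with anxiety or depression may be more likely to answer a voluntary survey about mental health.

```python
# Hypothetical numbers only (NOT from Evans et al.), illustrating how a
# convenience sample can inflate an apparent risk ratio.

# Suppose the true prevalence of anxiety/depression is identical in grad
# students and the general population:
true_prevalence = 0.10

# But suppose people currently affected are 4x as likely to respond to a
# voluntary mental-health survey:
p_respond_affected = 0.40
p_respond_unaffected = 0.10

# Prevalence observed among survey respondents:
affected = true_prevalence * p_respond_affected
unaffected = (1 - true_prevalence) * p_respond_unaffected
observed = affected / (affected + unaffected)

# The general-population figure comes from a representative sample, so it
# sits near the true prevalence; the ratio looks alarming anyway:
ratio = observed / true_prevalence
print(f"observed prevalence in survey: {observed:.2f}")  # ~0.31
print(f"apparent risk ratio: {ratio:.1f}x")              # ~3.1x
```

Even with zero real difference between the two groups, differential response alone manufactures a "3x greater risk" here — which is why comparing a convenience sample against a representative general-population benchmark can’t support the 6x claim.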
Now that the 6x number has been out there in circulation for a year (and, again, it has received a *lot* of attention), we’ve been anchored to that number. So, when a new study comes out and finds, say, a 2.5x greater risk, it won’t seem that high.
I care a lot about graduate student mental health. I’ve spent a lot of time while on sabbatical working with folks at Michigan’s Rackham Graduate School to develop a task force that will focus on better supporting grad student mental health. There’s so much we need to do in order to improve graduate student mental health. But it’s essential that we work from information that is as accurate as possible. So, I will persist in trying to get Nature Biotechnology to correct the record.
Why haven’t they published our response and why won’t they explain what is going on? ¯\_(ツ)_/¯ But it sure doesn’t look good that they’ve known for over a year that there’s a fundamental problem with the headline-grabbing result of that study and have not corrected it.
Nice post! I’m guessing Nature Biotechnology likes the clickbait attention it got with the 6x number? I hadn’t heard of the word, “anchoring” but always noticed that foundations do that. It backfires because it usually makes me not want to donate anything since I remember they asked for half or 2/3 the amount a few years back.
It’s great that you’re tackling this problem. best of luck!
Perhaps it is an illusion on my part, but science seems to be a less and less useful guide to understanding the world. Individual contradictory reports are published in sequence with no attention to the overall context and how to interpret the contradiction. Less and less work is funded by sources that don’t want to put their thumbs on the scale. Publishing is devolving into PR rather than a search for truth. I appreciate your efforts in maintaining scientific integrity.