They’re just not that into you: the no-excuses truth to understanding proposal reviews (guest post)

Note from Jeremy: This is a guest post from Peter Adler.


We’ve all been outraged by the nit-picky, irrelevant, or downright wrong-headed criticisms that show up in proposal reviews. How could they have rejected the proposal because they didn’t like the number of replicates? Or the details of the statistical model? Or because you didn’t cite the right references? Or even worse, because they didn’t read closely enough to see that you had cited those references? The answer may be that they didn’t reject your proposal for those stupid reasons. They rejected it because they just weren’t that into it.

While some reviewers are honest and self-aware enough to say that, and perceptive enough to explain why, most reviewers want to back up a negative review with a long list of specific, seemingly objective, criticisms. Sometimes they may not even be fully aware that they are just not that into your proposal. I know that if I’m simply not excited by the core ideas of a proposal I am reviewing, I am more likely to get annoyed by those stupid details. In contrast, if the ideas really grab me, I will be very forgiving of the details.

In order to revise and improve your rejected proposal, it is critical to recognize when the reviewers just weren’t that into your original submission. When that is the case, you need to ignore the details of their review and focus all your energy on making the ideas and the pitch more compelling. Altering details of the experimental design or citing additional references won’t make a difference the next time around. In fact, spending your time chasing after those details means less time and effort addressing the fundamental issue: how to better communicate the novelty, insight, and importance of the work.

The challenge is distinguishing the “just not into it” criticisms from legitimate, well-justified concerns. This is easiest when you know the reviewers are making a valid point. Right now I am conducting some pilot experiments because my last NSF proposal got hammered for not having enough (ok, any) preliminary data. While demands for preliminary data can be a “just not into it” complaint, we were proposing ambitious and expensive work in a new system, and the reviewers’ reaction was understandable. Recognizing the valid criticisms is also easy when many reviewers converge on the same point. Beyond that, it gets trickier. If a reviewer gives strong positive feedback on the conceptual goals and shows real understanding of the problem you are tackling, that may indicate that they “get it” and you should take to heart any accompanying negative feedback.

Could this same advice apply to interpreting manuscript reviews as well? My gut reaction is NO, but I’m not sure how well I can articulate why I feel that way. A practical reason is that revised manuscripts often will be returned to the original reviewers. Obviously, ignoring their comments is a bad strategy. But I think the main reason I see manuscript reviews so differently is that reviewers don’t have to be “so into it” to give a paper a positive review. Papers don’t require reviewers to exercise much imagination: the data has been collected, the analyses are complete, and the story, for better or worse, has an ending. It’s a relatively objective situation for a reviewer. The uncertainty and open-ended nature of proposals makes reviewing them a much different experience. To get funded, you must convince reviewers that your uncollected, unanalyzed data will lead to a story with a better ending than your competitors’ unfinished stories. You are asking the reviewer to imagine a rosy future based on limited information and then commit to your proposal over the others. As the word “proposal” implies, it is a form of courtship, which is why pop culture relationship advice is relevant.

p.s. First, apologies to author Greg Behrendt. Second, if I’m making any implicit claim to authority on proposal writing, it comes much more from my experience as a reviewer and panelist than from my very mixed record of success in getting proposals funded.

8 thoughts on “They’re just not that into you: the no-excuses truth to understanding proposal reviews (guest post)”

  1. Great post as always Peter.

    I agree that it is different from manuscripts. I think the single biggest difference is just reject rates. While the very top journals have accept rates nearly as low as the fund rates at NSF, they’re still a bit higher, and of course most of the journals that we all have most of our experience with have accept rates that are more like 20% or even 50% than the 7% of NSF. This totally changes everything in how reviewers approach things in my opinion. I think NSF reviewers (and no small fraction of reviewers for Science and Nature) approach things as if it is their job to figure out why it should be rejected, not to evaluate.

    Beyond that – your post raises the obvious question of why are they just not that into you? Is it poor writing and framing? Bad questions? Or is it fashion? Or is it some fields are cooler? Or is it luck of the draw of the reviewers? I suspect all of these play some role.

    • I agree, all those factors play a role. The key is to stay focused on those bigger picture questions (and answers) and not get distracted by the details.

    • Yes, a very nice and useful post. I love the application of the “not that into you” construction to peer review.

      However, I somewhat disagree about the prevalence of “not that into you” in manuscript reviews. Few ecological studies are so simple that there are not alternative valid ways to approach the statistics (see recent posts on this blog about statistical machismo and detection probabilities). My guess is that reviewers who don’t like the message of a paper are more likely to use these alternatives to criticize the manuscript while reviewers who like the paper are more likely to accept the analyses as presented, or frame suggested changes more constructively.

      Identifying the source of the reviewer’s concerns is as important for revising a manuscript as it is for a grant proposal. If the reviewer’s real problem was that you didn’t communicate what was interesting about your work, or you framed it in too controversial a way, changing from one valid but imperfect statistical approach to another valid but imperfect statistical approach is not going to improve your chances of getting accepted at the next journal.

      • Thanks for the comment Andrew (can I call you Rass on a public comment thread?). “Just not into you” can certainly apply to manuscripts too, I just think it’s a matter of degree. As Brian said, the low acceptance rate of many funding programs contributes to the problem. And I think it’s more socially acceptable to make obnoxious methodological requests for work that has yet to be conducted (proposal) than work that has already been done (manuscript).

  2. I think this was a really important post to write! It took me a long time to realize that there were two types of proposal critiques: real critiques and the ‘just not that into my idea’ critiques. I still have trouble distinguishing between them, but at least I now know they exist. The light bulb really went off the first time I was a panelist and realized that some proposals are solid but uninteresting – so what do you say? Because I spent hours as a young prof jumping through hoops that wouldn’t end up saving an unsalvageable grant, I try to be honest in the review about why my response is tepid.

    And I think the answer to Brian’s question about why some proposals just aren’t that exciting to the reviewers is ‘yes’. All of those reasons. This is part of the problem with the 5% fund rates. At 30% fund rates, solid proposals that drew a lukewarm panelist because the proposal just didn’t strike a chord still had a shot at being funded. Not so much now.

  3. Pingback: Recommended reads #37 | Small Pond Science

  4. Pingback: How to win over panels and influence program officers: advice for effective written reviews. | DEBrief

