The importance of knowing and recognizing the limits of your knowledge

At some point in every qualifying exam, there will be a question that the student doesn’t know the answer to. Actually, that’s not quite accurate – this doesn’t happen just once; it happens repeatedly, in every qualifying exam. That’s part of the point of the exam: to explore what the student knows and what they don’t know.* What I will argue here is that it’s essential in this case – and, indeed, in science in general – to realize when you don’t know something and to admit that. To put it into Donald Rumsfeld’s framework: scientists need to know their unknowns.

If we go back to the scenario of the qualifying exam: Ideally, when asked about something they don’t know, the student says something like “I’m not sure about that topic, but what I think might happen is that…” That is, the student acknowledges that they are moving beyond the limits of their knowledge, and are beginning to speculate. (Usually, in a qualifying exam, the faculty are interested in that speculation, to see how the student works through a new idea or problem. Speculating is fine; BSing and trying to pass it off as knowledge is not.)

While trying to bluff your way through a qualifying exam isn’t a good strategy, it’s also not going to harm anyone else. In other situations, though, failing to recognize and/or acknowledge what you do and do not know can cause real harm. This is something I’ve discussed with my sister, who is a physician. I think it’s pretty clear that it’s dangerous for a physician to go beyond his or her knowledge. For example, if she isn’t sure of the recommended treatment for a condition, she wouldn’t just guess at what to prescribe; she would do additional research or refer the patient to a specialist. Clearly there’s the potential for real harm to someone’s health if she doesn’t acknowledge when she doesn’t know something, or if she thinks she knows something but doesn’t. In the case of scientists, physical harm usually doesn’t result, but it’s possible – for example, if you’ve never set up an acid bath before, you don’t want to just guess whether to add the acid or the water first.**

Aside from physical harm, why is it dangerous for scientists not to realize or admit when they don’t know something? Perhaps the most important reason is that it can severely compromise the data that are collected; in the worst-case scenario, the data are useless but the scientist doesn’t recognize them as useless. Unfortunately, this happens. I know one lab that lost an entire field season’s worth of data because of a very basic mistake a technician was making the whole year. In that case, at least they realized it (though only by chance, right at the end of the field season). What is really scary is that, if they hadn’t noticed it (which easily could have happened), they would have proceeded to analyze and publish those data with no idea that they were not collected properly. Sometimes something about the data will indicate that there was something “weird” going on – say, really high phosphorus levels might indicate a contamination problem. One of my biggest lab fears is that someone will be collecting data that appear reliable but actually aren’t. Intentionally falsifying data is obviously unacceptable and a major breach of ethics, but, from the standpoint of our ability to understand nature, it is just as bad for someone to be accidentally collecting data that seem reliable but aren’t.***

I think one way to minimize the risk is to establish a lab culture that encourages people to ask questions and to admit when they aren’t sure of something. I also spend time trying to make sure that everyone realizes that, if they make a mistake in collecting data, it affects a lot of people. Students (especially undergrads) really respond to this. In lab meetings focused on ethics, we usually talk about the impacts of falsified data on the people who spent their PhDs trying to follow up on those results. This always makes an impression on students, and it’s easy to talk about how it also applies to data that are inaccurate due to a lack of care or knowledge. If you have a good lab culture, people will care about their labmates and want to help them. At the same time, you don’t want to paralyze people with fear of making a mistake, and you really don’t want to discourage someone from admitting it if they realize they’ve made a mistake or been doing something wrong. Again, though, they’re more likely to admit that they’ve made a mistake when they realize that it will affect others. If something went wrong, admit it. And, if you’re not sure how to do something, ask.

Not knowing (or recognizing) the limits of your knowledge also creates a trust problem, both for the individual and for scientists as a group. As an example at the individual level: I once went to a talk at a major international meeting where the speaker presented a study that was framed as being about Daphnia, but the picture shown when introducing the study system was of Ceriodaphnia. If the person couldn’t figure out the genus of their main study organism, should I really trust that they had figured out how to measure phosphorus correctly? And, at a larger level, if I were to say in public that, say, there was going to be an outbreak of a horrible disease tomorrow, and then no outbreak occurred,**** that would erode the public’s trust in scientists. This is not to say that speculation is never appropriate – just that you need to make it clear when you are speculating.

So, in short: know what you know, know what you don’t know, and, if you’re speculating, acknowledge that.

 

* Students often seem kind of terrified at the idea that they will be asked something that they don’t know. But it is completely to be expected that every student will have multiple questions that they don’t know the answer to. No one in the room could answer all of the questions that are being asked. There are usually 3-5 people with many years of experience asking the questions, and they’ve usually been chosen because they have complementary expertise. I have thought “Huh, that’s interesting, I’ve never thought about that” and “Hmm, I think I knew that once, but apparently have forgotten it” countless times during oral exams.

**“Acid to water, just like you oughta”

***This relates to my post on the merits of system-based research: I think you are less likely to make a mistake due to an “unknown unknown” if you know a system really well.

****I think not making stuff up is so important that I am being intentionally vague here. It feels wrong to me to make up a fake prediction and put it here in writing.

17 thoughts on “The importance of knowing and recognizing the limits of your knowledge”

  1. Great essay, Meghan. I especially liked the points about creating the right atmosphere for people to admit mistakes when they occur and to question things, and about how important mistakes can go entirely undetected in the absence of close scrutiny.

    The basic point also applies to the full acknowledgement of statistical/mathematical uncertainty in the analysis of data, a frequent problem. Because there is such a premium put on making new discoveries in science, there is pressure to gloss over uncertainty in favor of bold conclusions. This is the most serious, systemic problem in science generally, IMO.

  2. This is a really important thing to bring up, and it’s really constructively done.

    Here’s a relevant anecdote: I just had something amounting to a minor catastrophe happen in my lab when a student unilaterally decided that one component of a field project wasn’t important and then didn’t do it (at least, not adequately). We had plenty of discussions about this experiment, and the predictions and variables involved, and I have no doubt that I was really clear that this piece was a critical element. The student brought to me the idea of scaling back or not doing this one piece, and I explained exactly how and why it was important and couldn’t be dropped. And then, a month later, the student did the exact opposite of what I said. So now, almost a whole summer’s worth of fieldwork is (almost) useless because there are no numbers for one of the central response variables.

    The student clearly is at fault. But I am at fault just as much as the student, because (despite working hard to do otherwise) I cultivated an environment in which the student couldn’t (or at least, didn’t) comfortably discuss misunderstandings and priorities. As pointed out in the post, this was caused by, or at least resulted in, a trust problem. This was an overconfident student, and I’m used to dealing with the opposite. The last time I had a student fail to disclose doing something wrong, a long while ago, that also came from a position of overconfidence, and in that case (for entirely different reasons) the student didn’t trust me. So, at least for me, generating an open environment where it’s okay to be not just wrong, but unclear, is a long-term work in progress.

  3. This will probably be controversial, but one area in which I wish scientists would recognize the limitations of their knowledge is science publishing, particularly the financial aspects.

    Buying groceries does not make you an expert in supply chain management. Neither does being an author, reviewer, or editor give enough insight into the economics of publishing to make an informed judgment. It’s a real puzzle, as academics must know that things are never as simple as they seem, yet there are all too many sweeping declarations about the state of peer review or about the ‘goodness’ or ‘badness’ of the various organizations involved. It’s reminiscent of the Dunning-Kruger effect, which we normally find ourselves battling when educating the public about evolution or climate change.

    To give a few examples, most think that non-profits are ‘better’ than for-profit companies, yet very few know how these business models actually differ. Non-profits don’t make a profit, right? Similarly, almost everyone thinks that online publishing is virtually free compared to expensive old print publishing. It’s not – curation of print copies falls on libraries, but with online publication the publisher has to build, maintain and constantly upgrade a hosting system and the related content, which is much more expensive. Lastly, in terms of cost per paper, Open Access publishing is more expensive than subscription publishing, mainly because in one system you’re buying in bulk and in the other you’re making thousands of individual purchases.

    I’m not suggesting that academics do a course in science publishing, but it would be nice if everyone checked over their assumptions before making judgments.

  4. One thing I bang away on with my summer undergrads when I’m training them: don’t guess, ask. Not sure how to do something? Ask. Not sure what you’re seeing under the microscope? Ask. Think you got a weird data point? Ask. Even if you know it’s something I’ve already told you once or whatever. I’m not going to yell at you. Asking shows me that you care about getting it right.

    • I thoroughly agree. I am a speech pathologist, and I always tell the people I supervise that if you ever think you know it all, it’s time to leave the profession. The people who are most successful in any field are those who are willing to admit professional limitations. They ask questions. That is how we continue to learn and grow.

  5. Another context in which this comes up is when you’re teaching, and you’re asked a question about the material to which you don’t know the answer. A good way to respond is, “That’s a good question. I don’t know the answer, but I’ll find out and get back to you next class.”

    • My father-in-law teaches med school; his response in those situations is to assign the student who asked the question to look it up and report back to the class next lecture. On its surface, it sounds like a dick move, but I have run across students who seem to ask questions with the intent of tripping up the lecturer, especially in a graduate setting. This sets the ground rule that asking questions in lecture is fine, but you should actually want to *know* the answer enough to research it yourself if it comes to that.

  6. I think this issue of recognizing when you don’t know something and acting accordingly should also apply to engaging in public criticism of an area of science that you lack expertise in. Major examples that come to mind are the minority of scientists who publicly dispute things like vaccine safety, climate change, and evolution. Contrarian scientists can always be found, but they usually have expertise in fields with little connection to the topics they criticize – e.g., Michael Behe is one of the most outspoken creationist PhDs, but he has a degree in biochemistry rather than evolutionary biology.

    • Can’t say I agree with you.

      Science has fundamentals that cross all disciplines. Very few hypotheses answer all the relevant questions, which leaves the door open for criticism from other quarters. Beyond that, the most general theories are always multidisciplinary. Evolution is an excellent example of a multidisciplinary theory, drawing on elements of biology, geology, chemistry and even physics (isotopic ages).

      People with extensive experience in a given field should always be wary of their own perceived expertise. Often, it’s criticism from outside of a discipline that drives science forward. And of course anyone can have a good idea, regardless of their credentials.

      There will always be people with nutbar ideas living on the lunatic fringe of science. The way to keep them on the lunatic fringe is to effectively refute their ideas, not to attack their credentials.

  7. Admitting ignorance is a key issue – but I think a point that has been lost in many fields of biology, particularly in some areas of cellular and molecular biology, is that it is OK to speculate, so I was very happy to read in your post “Speculating is fine”. I’ve interviewed hundreds of grad school candidates over the years, and many are terrified of playing with a hypothetical scenario. Some are just timid, but in some (I’d say most) cases, it is pretty clear that this has been drummed into them by their university professors – a virulent form of “Whereof one cannot speak, thereof one must be silent.”
