At some point in every qualifying exam, there will be a question that the student doesn’t know the answer to. Actually, that’s not quite accurate – this doesn’t happen just once; it happens repeatedly, in every qualifying exam. That’s part of the point of the exam: to explore what the student knows and what they don’t know.* What I will argue here is that it’s essential in this case – and, indeed, in science in general – to realize when you don’t know something and to admit that. To put it into Donald Rumsfeld’s framework: scientists need to know their unknowns.
If we go back to the scenario of the qualifying exam: Ideally, when asked about something they don’t know, the student says something like “I’m not sure about that topic, but what I think might happen is that…” That is, the student acknowledges that they are moving beyond the limits of their knowledge, and are beginning to speculate. (Usually, in a qualifying exam, the faculty are interested in that speculation, to see how the student works through a new idea or problem. Speculating is fine; BSing and trying to pass it off as knowledge is not.)
While trying to bluff your way through a qualifying exam isn’t a good strategy, it’s also not going to harm anyone else. In other situations, though, recognizing and acknowledging what you do and do not know really matters, because failing to do so has the potential to cause harm. This is something I’ve discussed with my sister, who is a physician. I think it’s pretty clear that it’s dangerous for a physician to go beyond his or her knowledge. For example, if she isn’t sure of the recommended treatment for a condition, she wouldn’t just guess at what to prescribe; she would do additional research or refer the patient to a specialist. Clearly there’s the potential for real harm to someone’s health if she doesn’t acknowledge when she doesn’t know something, or if she thinks she knows something but doesn’t. In the case of scientists, physical harm usually doesn’t result, but it’s possible: for example, if you’ve never set up an acid bath before, you don’t want to just guess whether to add the acid or the water first.**
Aside from physical harm, why is it dangerous for scientists to not realize or admit when they don’t know something? Perhaps the most important reason is that it can severely compromise the data that are collected; in the worst-case scenario, the data are useless but the scientist doesn’t recognize them as useless. Unfortunately, this happens. I know one lab that lost an entire field season’s worth of data because of a very basic mistake a technician was making the whole year. In that case, at least they realized it (though only by chance, right at the end of the field season). What is really scary is that, if they hadn’t noticed it (which easily could have happened), they would have proceeded to analyze and publish those data with no idea that they were not collected properly. Sometimes something about the data will indicate that there was something “weird” going on: really high phosphorus levels, say, might indicate a contamination problem. But often there is no such red flag. One of my biggest lab fears is that someone will be collecting data that appear reliable but actually aren’t. Intentionally falsifying data is obviously unacceptable and a major breach of ethics, but, from the standpoint of our ability to understand nature, it is just as bad for someone to be accidentally collecting data that seem reliable but aren’t.***
I think one way to minimize the risk is to establish a lab culture that encourages people to ask questions and to admit when they aren’t sure of something. I also spend time trying to make sure that everyone realizes that, if they make a mistake in collecting data, it affects a lot of people. Students (especially undergrads) really respond to this. In lab meetings focused on ethics, we usually talk about the impacts of falsified data on the people who spent their PhDs trying to follow up on those results. This always makes an impression on students, and it’s easy to talk about how the same concerns apply to data that are inaccurate due to a lack of care or knowledge. If you have a good lab culture, people will care about their labmates and want to help them. At the same time, you don’t want to paralyze people with fear of making a mistake, and you really don’t want to discourage someone from admitting it if they realize they’ve made a mistake or been doing something wrong. Again, though, they’re more likely to admit that they’ve made a mistake when they realize that it will affect others. If something went wrong, admit it. And, if you’re not sure how to do something, ask.
Not knowing (or not recognizing) the limits of your knowledge also results in a trust problem. This is true both for the individual and for scientists as a group. As an example at the individual level: I once went to a talk at a major international meeting where the person was presenting results of a study that was framed as being about Daphnia, but the picture shown when introducing the study system was of Ceriodaphnia. If the person couldn’t figure out the genus of their main study organism, should I really trust that they had figured out how to measure phosphorus correctly? And, at a larger level, if I were to say in public that, say, there was going to be an outbreak of a horrible disease tomorrow, and then there wasn’t one,**** that would erode the public’s trust in scientists. This is not to say that speculation is never appropriate – just that you need to make it clear when you are speculating.
So, in short: know what you know, know what you don’t know, and, if you’re speculating, acknowledge that.
* Students often seem kind of terrified at the idea that they will be asked something that they don’t know. But it is entirely to be expected that every student will face multiple questions they don’t know the answer to. No one in the room could answer all of the questions that are being asked. There are usually 3-5 people with many years of experience asking the questions, and they’ve usually been chosen because they have complementary expertise. I have thought “Huh, that’s interesting, I’ve never thought about that” and “Hmm, I think I knew that once, but apparently have forgotten it” countless times during oral exams.
**“Acid to water, just like you oughta”
***This relates to my post on the merits of system-based research: I think you are less likely to make a mistake due to an “unknown unknown” if you know a system really well.
****I think not making stuff up is so important that I am being intentionally vague here. It feels wrong to me to make up a fake prediction and put it here in writing.