Recently, I started a little series of posts about scientific fraud, inspired by a book about financial fraud, Dan Davies’ Lying For Money. In the first post in the series, we talked about how the optimal level of financial fraud, or scientific fraud, isn’t zero. That’s because the only way to have literally zero financial or scientific fraud is for no one to ever trust anyone, which leaves everyone much worse off than if they all default to trusting each other and tolerate the resulting non-zero level of fraud as a price worth paying. To paraphrase Steve Randy Waldman, you can have a trust-based economy that admits some level of financial fraud, or you can herd goats. Analogously, you can have a trust-based scientific research system that admits some level of scientific fraud, or you can do alchemy.
Today, let’s think about the causes of fraud. Davies suggests a simple framework for thinking about this: the “fraud triangle”.
The fraud triangle is analogous to a famous dictum from murder investigations: the murderer must have the means, motive, and opportunity to commit murder. Similarly, Davies, drawing on Donald Cressey’s earlier work Other People’s Money, says that financial fraud happens when the following three conditions are simultaneously met:
- Need. Fraud happens when someone feels they need more money than they can come up with via honest means. There are many different sorts of “need”: greed, institutional pressure, fear of admitting that your business is a failure, etc.
- Opportunity. Weaknesses in the systems of fraud prevention are opportunities to commit fraud. A subdivision can be made between “incidental” financial fraudsters who commit frauds against targets of opportunity they stumble across accidentally, and “entrepreneurial” financial fraudsters who seek out specific weaknesses to exploit.
- Rationalization. Financial fraud is committed by people in positions of trust. Most people are averse to breaking trust unless they can somehow rationalize it to themselves. They have to mentally redescribe the crime to themselves in a way that seems to justify it. “It’s only a temporary measure until I’m back on my feet; after it’s over I’ll make good”. “Everyone else is doing it; I’d have to be a sucker not to do it too.” “The system is rigged against people like me; I’m just leveling the playing field for myself.” Etc.
I think this framework applies to scientific as well as financial fraud. Here are a couple of things I like about it:
- It puts systemic and individual factors on an even footing. When a scientific fraud happens, some people react by highlighting the systemic background conditions that they claim led to the fraud, such as pressure to publish and get grants. They emphasize that lots of people have an incentive–a “need”–to commit scientific fraud, and argue that we should reduce fraud by reducing the need to commit it (e.g., by somehow reducing pressure to publish). To which others (including me on occasion!) respond by noting that the vast majority of scientists don’t give in to that “need”. They may feel pressure to publish papers and get grants, yet they behave honestly anyway. See for instance Tal Yarkoni’s argument that, when it comes to scientific fraud (and other, lesser sorts of corner-cutting), it’s not the incentives, it’s you. This response suggests that, to prevent scientific fraud, we need to understand why the rare scientists who commit fraud are able to rationalize it to themselves (which we don’t really understand, unfortunately). And still others argue that scientific frauds highlight the need for, and value of, control measures that help prevent fraud, such as mandatory data sharing (e.g., on Data Dryad) and data editors. The “fraud triangle” framework highlights that everybody is right, at least in principle. In principle, you can reduce fraud to the desired level (which should not be zero!) by reducing need, opportunity, rationalization, or any combination of those. Which factors to focus on is an empirical question of marginal costs and marginal benefits. For instance, international comparative data suggest that some countries could reduce scientific fraud by removing certain strong, direct incentives to commit it, such as direct cash payments for publications. Of course, if everyone is right in that sense, that also means everyone is wrong in a different sense. If you think that the only way to prevent scientific fraud is to reduce the (perceived) need to commit fraud, you’re wrong. If you think the only way to reduce scientific fraud is to reduce the opportunity to commit it, you’re wrong. Etc.
- It makes me feel good about how I talk to my students about scientific fraud. One thing I tell my grad students, and also the students in my undergrad courses, is that at some point you’re going to feel a strong “need” to commit misconduct (scientific fraud in the case of grad students, academic misconduct in the case of students in my undergrad courses). That is, you’re going to be tempted to commit fraud or misconduct at some point, whether because your thesis project seems to be failing, or you forgot about an assignment that’s due in an hour, or whatever. And in that moment, your panicked brain is going to work overtime coming up with a rationalization to justify the misconduct. So in that moment is when you need to remember three things. First, your brain is looking for a rationalization. Don’t let it find one. Instead, decide right now that, if you’re ever tempted to commit misconduct in the future, you’re going to choose not to. Because right now you’re calm and thinking clearly and in a much better position to make good choices. Second, you may feel the need to commit misconduct in a moment of temptation, and you might even find a rationalization for it, but you don’t have an opportunity. You’re going to get caught. Third, the need to commit misconduct isn’t nearly as big as you think it is in that moment of temptation. One common part of the rationalization process is convincing yourself that you have some huge, urgent need to commit misconduct. But in fact, you don’t. The stakes aren’t nearly as high as you think they are. There’s some alternative thesis project you could switch to, so that you don’t have to fake the one you were originally planning. That assignment that you forgot was due in an hour is only worth a tiny fraction of your mark in one course; it’s no big deal in the grand scheme of things. Etc. (UPDATE: this paragraph edited slightly from its original version, to clarify it in response to a comment)