Scientists have moral obligations, some of which apply to everybody, and some of which are specific to scientists. For instance, everyone agrees that scientists shouldn't falsify their data or abuse their authority over their students. Many scientific societies, such as the Ecological Society of America, have an official code of ethics to which all members are expected to adhere.
But precisely which scientific practices should be covered by our professional ethics? It seems to be increasingly common to argue that scientific practices that weren't previously thought to be ethical issues actually are. For instance, here's Mike Taylor arguing that it's immoral to publish your papers in any venue that isn't open access. Other practices that have been argued to be immoral include submitting to, reviewing for, or editing for for-profit journals (or even society journals that the society contracts with a for-profit publisher to publish). It's been argued that reviewing for Faculty of 1000 is immoral. (UPDATE: Via Twitter, Casey Bergman denies that he ever said reviewing for Faculty of 1000 is a moral issue. Apologies for the misreading, although I think my reading was a natural one given that his post starts with what he calls a "morality tale" and says reviewing for Faculty of 1000 violates his "principles".) I've had people tell me that it's immoral of me to decide what to read by scanning the TOCs of leading journals. I've heard it argued that pre-publication peer review is immoral, amounting to a form of illegitimate censorship. I wouldn't be surprised if someone out there has argued that sharing your raw data and code is not merely good for science, but a moral obligation. And I'm sure there are many other examples I haven't thought of, to do with practices totally unrelated to the "openness" of science.
There’s a common thread to many of the above arguments. It’s claimed that scientific practices in which individual scientists engage (sometimes but not always because they have an externally-imposed incentive to do so) are immoral because of purported negative externalities for some larger group or entity (e.g., science as a whole, the general public, or some subset of the public).* But not all of the above arguments have that character. And in any case my goal with this post isn’t to debate whether or not any particular practice is indeed a matter of ethics, and if it is, what ethical practice consists of. Here I’m just curious about which practices people see as having an ethical dimension. I just want a sense of the range of opinion out there on what’s a matter of ethics and what’s not.
So I’ve set up a little poll, asking you to identify scientific practices that you think have an ethical dimension (by which I mean “ethical issues are a relevant consideration”, not necessarily “ethical issues are the only relevant consideration”).
Obviously this poll isn’t going to take a random sample of views from any well-specified population. But that’s ok, I just want to get some sense of the range of views out there. And the list of “possibly-ethical” issues in the poll may well be incomplete. If you think it’s incomplete, say so in the comments. I also recognize that the poll asks a pretty coarse question and it’s not going to capture all the nuances of people’s views. That’s ok, it’s just a conversation starter.
I'm also curious about how far people think it's ok for scientists to go in support of their views on ethical scientific practice. But I wasn't sure how to phrase a poll question on that, so I guess I'll just throw it out there for discussion. Presumably, nobody has any problem with any scientist who decides on ethical grounds to share all their raw data, or only submit to open access journals, or whatever. Because while others may not share that ethical stance, nobody thinks it's unethical to share your data, or only submit to open access journals, or whatever. And I don't see why anyone would have a problem with anyone else explaining the ethical reasons for their own choices, explaining why they personally chose to share their data or whatever. I might not agree that your choice was ethical or wise, but why would I have any problem with you explaining why you chose as you did? But the next step beyond that is trying to convince others that there are ethical issues here that those others aren't considering. And the next step beyond that is trying to convince others to adopt the same ethical stance as you and change their behavior accordingly. I could imagine that taking either of these two steps might turn off some people, much as some people are turned off by missionaries knocking on their door. The next step beyond that (and one that is perhaps hard to avoid taking if you've taken the previous step) is to tell your fellow scientists that they're behaving unethically. And if the ethical issue is serious enough (and presumably you think it's pretty serious if you've come this far), isn't the next step to consider your fellow scientists bad people and treat them accordingly? At least, it might be hard to convince others that you don't consider them bad people, despite appearances to the contrary! See, e.g., this response to Mike Taylor's editorial, and Mike's reply.**

Presumably, people's views on this will come down to how confident they are in the rightness of their ethical stances, and how important they think it is that others adopt the same stance. Again, in raising this I'm just curious to get a sense of the range of views out there.
A request: obviously, people feel very strongly about ethical issues. So I’m more worried than usual about the comment thread getting heated. The comment thread on Mike Taylor’s piece illustrates just how heated discussions of this topic can get. It’s for that reason that I decided not to argue for my own views on any of this, instead just focusing on surveying the range of opinion out there. Indeed, I considered not allowing comments on this post at all. But in the end I decided to trust the great commenting community we have here. Let’s all bend over backwards to make sure the discussion stays productive, professional, and respectful (that applies to me as well). In order to help make sure that happens, I’m going to be taking my role as moderator especially seriously.
*As an aside, I'll note that any negative externalities created by any given individual's actions ordinarily are small. So if you call such actions immoral on the grounds that they create negative externalities for some larger group, you are basically trying to combat a purported "tragedy of the commons" via moral exhortation. Garrett Hardin, who first defined tragedies of the commons, famously questioned the effectiveness of attempts to resolve them via moral exhortation. And even Elinor Ostrom, who won a Nobel Prize for showing that privatizing the commons isn't the only way to resolve tragedies of the commons, agreed that moral exhortation is ineffective unless backed by appropriate norms, rules, and institutions. So even if you think an issue is a moral issue, exhorting others on that basis isn't necessarily the most effective way to change their behavior. For instance, this is why Owen Petchey and I suggested a rule-based system to oblige authors to review in appropriate proportion to how much they submit, rather than simply exhorting authors to do so.
**Mike suggests that publishing behind a paywall is an immoral act that might nevertheless be justified for certain reasons, so that you're not a bad person if you have such a reason. Of course, drawing a distinction between bad acts and the good people who perform them isn't much comfort to the person whose acts you're criticizing if that person doesn't draw the same distinction. Plus, as Jean Renoir said, the truly terrible thing is that everybody has their reasons. I've had to address a similar issue in arguing against zombie ideas, drawing a distinction between serious mistakes and the very good ecologists who've made them. I'm comfortable making that distinction because ecology is hard and nobody is infallible, so all ecologists, even the best ones, will sometimes make mistakes.