Ethical norms change over time. What once was widely regarded as wrong can come to be regarded as acceptable, admirable, or even obligatory. And what was once widely regarded as acceptable, admirable, or even obligatory can come to be regarded as wrong. Norms can change so much that it becomes difficult to imagine how the old norms could ever have been seen as ok.
Hence my question: what currently widespread norms regarding the proper conduct or teaching of science will change dramatically in the next few decades?
That’s an interesting timescale to consider because it’s roughly the timescale for complete turnover of the scientific community. It’s the amount of time needed for every current scientist to be replaced by a new one, and so it’s the timescale on which norms can change even if nobody ever changes their mind as to what’s ethical. Of course, norms can change much faster if people change their minds about them.
Data sharing is an obvious one. Indeed, I’d say that norm already has shifted (within less than a decade!) It used to be ok to keep your data to yourself forever. Now it’s not ok, absent some special circumstance like the need to maintain confidentiality.
Experimentation on captive animals is another obvious one, at least for the animals most closely related to humans. It’s my impression that research on captive chimpanzees is on the way out due to changing norms of how to treat animals.
Spousal hiring is another one, though that’s more of an academic norm than a specifically scientific norm. Spousal hiring used to be considered nepotism and was widely frowned upon (or so I’ve heard; someone more senior than me correct me if I’m wrong on this!) Nowadays colleges and universities routinely consider requests for spousal hires from the faculty they hire, to the point where they’ve started saying as much in job adverts. (UPDATE: to clarify, I’m not saying that all colleges and universities now routinely grant all requests for spousal hires! Because they don’t. All I’m saying is that, these days, it’s much less common than it used to be for colleges and universities to refuse to even consider a request for a spousal hire on the grounds that spousal hires are unethical nepotism.)
I can think of many other scientific norms that have changed somewhat (or for some people), and that some scientists would like to see changed completely. But I don’t see them completely changing in the next few decades (I could be wrong!). I’m thinking for instance of publishing in subscription journals, lecturing as a pedagogical approach, flying to scientific conferences, and single-blind peer review as opposed to double-blind or open review. All of those things are currently widely (not universally) regarded as ethically acceptable, and I don’t think any of them will come to be widely regarded as unethical in the next few decades. Rather, I think for many of them we’ll see a long-term quasi-equilibrium in which the majority regard the conduct concerned as ethically acceptable while a minority do not. Much as with vegetarianism or veganism in the US: a minority regard it as an ethical imperative, but that minority seems not to be either growing or shrinking, at least not very fast.
I’m sure there’s a massive historical and sociological literature on norm change. I wish I knew more about it. Why do some norms change fast, others slowly, others not at all? One hypothesis: changing some norms requires solving collective action problems. The majority of people might well want to change to a new norm, or at least not mind changing. But only if everyone else changes too, because it’s potentially risky or costly for a “pioneer” to change to a new norm without many others doing so. So the norm remains unchanged in the absence of some coordination mechanism everyone trusts. The norm that it’s ok to publish in subscription journals seems like an example. Part of the reason everyone wants to publish in certain subscription journals is that everyone else thinks highly of those journals and pays at least some attention to them. It’s hard to get everyone to coordinate on some new publication venue or literature-filtering method that plays the same role currently played by those journals.
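To make the coordination intuition concrete, here's a minimal toy sketch in the spirit of Granovetter-style threshold models. Everything in it (population size, threshold distribution, the fraction of "pioneers") is an illustrative assumption of mine, not anything drawn from the norm-change literature: each scientist switches to the new norm only once enough others already have, so the very same population can either stay stuck with the old norm or cascade to the new one, depending on whether a critical mass moves together.

```python
import numpy as np

# Toy threshold model of norm change (all parameter values are
# illustrative assumptions): each of N scientists adopts the new norm
# only once the fraction of adopters reaches their personal threshold.
rng = np.random.default_rng(1)
N = 1000
# Most people here would happily switch once a modest fraction has.
thresholds = rng.uniform(0.05, 0.6, size=N)

def spread(initial_fraction):
    """Iterate adoption until nothing changes; return final adopter fraction."""
    adopted = np.zeros(N, dtype=bool)
    adopted[: int(initial_fraction * N)] = True  # the "pioneers"
    while True:
        frac = adopted.mean()
        new = adopted | (thresholds <= frac)
        if new.sum() == adopted.sum():
            return frac
        adopted = new

print(spread(0.02))  # a few lone pioneers: adoption stalls at 2%
print(spread(0.20))  # a coordinated critical mass: adoption cascades to ~100%
```

The point of the toy model is just that the same people, with the same preferences, end up at different norms depending on whether some trusted mechanism gets a critical mass to move at once, which is exactly the role journals, societies, and funders can play.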
Note that this post is descriptive, not evaluative. For purposes of this post I’m only concerned with whether any current scientific norms might change dramatically in future, not with whether current or future norms are good or bad. (UPDATE: I emphasize that this post is not a list of current scientific norms that I think are bad and ought to change. In this post, I’m not expressing any view one way or the other on whether any particular norm is good or bad. I do of course have my own views on which current scientific norms are bad, but I’ve chosen to keep my views to myself for purposes of this post. I of course recognize that many readers would rather comment on which norms should change rather than which norms will change, which is fine.)
Via Twitter, a joke answer to the question posed in the post title:
But seriously though, in field ecology we don’t often consider the historical ownership* of the places where we do our work or the cultural role of the organisms we study. So where I work (in Canada) that would include the rights of First Nations and other First Peoples. I can foresee field ecologists having to submit to some kind of Ethical Review Board that would assess whether a project is respecting these principles.
*I know ‘ownership’ isn’t the right word here, but I can’t think of a better one.
http://onlinelibrary.wiley.com/doi/10.1890/10-0340.1/abstract
I’d say transformations in general will go by the wayside… especially in ecology. Increasingly we are finding, for example, that continuous non-normal data are just fine for many parametric tests.
Another reason I’d predict transformation will essentially vanish from ecology is a very insightful approach presented by Limpert et al. (2001) for log-normal distributions: a characterization using the geometric mean and multiplicative standard deviation, which yields standard deviation intervals analogous to those of the normal distribution, so no transformation is needed when applying parametric tests to log-normal data.
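For the curious, here's what that characterization looks like in practice, as a minimal sketch with simulated data (the distribution parameters are illustrative assumptions of mine, and this is my gloss on the Limpert et al. approach rather than their code). The geometric mean and multiplicative standard deviation are computed on the log scale, and the interval [geometric mean ÷ multiplicative SD, geometric mean × multiplicative SD] plays the role that mean ± SD plays for normal data, covering roughly 68% of observations:

```python
import numpy as np

# Sketch of the Limpert et al. (2001) log-normal characterization:
# geometric mean (mu*) and multiplicative standard deviation (sigma*).
rng = np.random.default_rng(42)
x = rng.lognormal(mean=1.0, sigma=0.5, size=1000)  # simulated log-normal data

log_x = np.log(x)
geo_mean = np.exp(log_x.mean())        # mu*: geometric mean
mult_sd = np.exp(log_x.std(ddof=1))    # sigma*: multiplicative SD

# [mu*/sigma*, mu* x sigma*] is the log-normal analogue of mean +/- SD,
# covering ~68% of observations on the original measurement scale.
lower, upper = geo_mean / mult_sd, geo_mean * mult_sd
coverage = np.mean((x >= lower) & (x <= upper))
print(f"mu* = {geo_mean:.2f}, sigma* = {mult_sd:.2f}")
print(f"68% interval: [{lower:.2f}, {upper:.2f}], empirical coverage = {coverage:.1%}")
```

The appeal is that the summary stays on the original measurement scale, so no transform-then-back-transform gymnastics are needed before running parametric tests.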
Some other predictions:
* P-values will become an endangered species.
* AIC (I know Jeremy disagrees) will be recognized for the powerful statistical tool that it is and shall be used far more frequently.
* While not wanting to “go political,” the medical procedure of late-term abortion (7th month on) will be banned due to the scientific/medical communities backing the opinions of 80% of the American populace.
None of these except the last are really matters of ethics, and the last isn’t a norm about the conduct of science (or a norm on which scientific information is likely to have much impact, I suspect).
Re: AIC, it’s already taking off in ecology in a big way. Can’t find the link now, but there’s a recent text-mining paper in Ecology that shows that leading ecology journals reached “peak ANOVA” back in 2002; its use has been declining ever since. In its place, AIC in particular has been taking off very fast.
Re: my views on AIC, I think it’s like any tool: it can be used well or poorly. I agree with everything Brian said in this post: https://dynamicecology.wordpress.com/2015/05/21/why-aic-appeals-to-ecologists-lowest-instincts/
Interesting post. Something running through my head as I read this post was the role of senior/established members of the field. My intuition is that established scientists would have more clout to influence these norms through their actions, but would (as individuals) stand to benefit the least from those changes. Additionally, given how normalized/ingrained many of these practices are, those ‘influencers’ might also be the same individuals who are less likely to deviate from the status quo.
People who are trying to change norms do sometimes call for senior people to lead the way on the desired changes. But at least in the case of data sharing, it’s my impression that change very much came from the bottom up. It seems like it was mostly junior people arguing for the change, and their arguments carried the day with enough other people that the change happened. Ok, the people who set up Data Dryad and changed journal policies to oblige authors to share their data included senior people at scientific societies and on journal editorial boards. But my sense is that they were very much reacting to what was seen as a critical mass of opinion among mostly-junior people. What carried the day was a combination of (i) good arguments and (ii) a critical mass of junior people who’d already been convinced by those arguments. I think if you lack either one of those things, you’re going to have a hard time getting anywhere.
Yes, data sharing is an excellent example of how these conversations can be driven by bottom-up influence. However, what about those norms that are ‘in process’ of shifting? Consider ideas like double-blind peer review, or Ed’s comment about field safety: what are the compelling arguments against making peer review more fair and keeping individuals safe? Top-down influence (whether it be editorial boards or university policy) can enact these changes relatively quickly. Regardless of the direction by which these ideas are put forward, the importance of established/senior members of these communities in realizing and catalyzing these ‘shifts’ shouldn’t be underestimated.
You could well be right about double-blind review taking off in a big way. I know I said in the post that I don’t think it will, but on reflection I’m less sure. Because as you say, this is a case in which the norm could change quickly if a critical mass of junior people makes a good case to a small number of senior people at leading journals. In EEB, it’s arguably already happening thanks to Am Nat’s experiment with double-blind review, the results of which are starting to trickle out to and influence other leading journals.
The reason I said in the post that I wasn’t sure peer review blinding norms would change all that quickly is because there are also people who want to switch to open review. So you have competing alternatives to the prevailing norm of single-blind review, which seems like a recipe for stasis. But on reflection, it seems like those arguing for open review are both rarer and have weaker arguments than those arguing for double-blind review. The “argument” for open review seems to boil down to “openness is just an inherently Good Thing”, which isn’t a compelling argument to anyone not already convinced.
Absolutely! When Paul Sereno pulled his Nigersaurus manuscript from review in Science and published it in PLOS ONE instead, followed by his preaching of Open Access, many paleontologists started publishing almost exclusively in OA journals. They are now among the most zealous OA proponents of all biology disciplines.
Interesting. I didn’t know that.
Via Twitter:
Look out for a blog post about exactly this, coming out on Monday next week!
Yeah, this is an area in which increasing concerns about legal liability on the part of universities are both consequence and cause of changing norms. The days of the “cowboy” field ecologist who takes risks (or has trainees take risks) with little/no training or planning are numbered, I think.
Amen to that, brother. And if ever you would want a detailed account of students suffering unimaginable harm in the field over many years, look me up.
Oh yeah, I heard of dinosaur paleontology grad students being held at gunpoint in war zones!
I would say the same goes for exchange students without any preparation on cultural differences and laws in other countries.
Via Twitter, an obvious one I somehow forgot:
We have old posts on this:
https://dynamicecology.wordpress.com/2016/07/28/views-on-authorship-and-author-contribution-statements-poll-results-part-1/
https://dynamicecology.wordpress.com/2017/04/05/case-studies-in-coauthorship-what-would-you-do-and-why/
https://dynamicecology.wordpress.com/2015/02/12/is-the-peg-model-paper-an-indicator-of-changing-authorship-criteria/
https://dynamicecology.wordpress.com/2017/03/06/changes-in-number-of-authors-and-position-of-corresponding-author-in-ecology-papers/
Nice historical coverage on this (and I’m not surprised you’ve covered it, and well). And like the examples at the beginning of your blog post, it’s a norm that’s in the process of changing, albeit maybe not fast enough. Author contribution statements are a step in the right direction, although perhaps every 20th one needs to be externally audited just to keep them honest? It’s probably also a topic that needs to be brought up periodically so that new grad students see it.
Re: author contribution statements, see here:
https://dynamicecology.wordpress.com/2016/08/08/views-on-authorship-and-author-contribution-statements-poll-results-part-2/
I think they’re an example of a purely symbolic action rather than an actually changing ethical norm. Author contribution statements have taken off, but they don’t seem to have any effect on anything. No one reads them or cares about them for any serious purpose. So right now they’re just a pointless ritual.
While not disagreeing with the basic point, I find this an odd example in that the trend has gone the other way. 30 years ago most advisers did not put their names on papers; now most do. Is there really a trend away from this again? This is really part of a larger trend toward more authors on papers, and that has shown no sign of reversal.
@ Brian:
Yes, I assume what the tweet meant was that the trend has gone towards advisers putting their names on all papers coming out of their labs.
Or perhaps it’s a very bold prediction that the trend toward expansive author lists is about to reverse? Hard to say with a tweet.
No, tweets don’t allow for much elaboration. My perspective is that gratuitous authorship for major advisors has perhaps gotten worse in the last 30 years, but that we have recognized that and hopefully reached a turning point? In any event, that is my wishful thinking coming through.
I got my Ph.D. in 1990 and my supervisor had a written contract for grad students (rare at that time). If you were supported by NSERC funds (which most of us were), he asked that you involve him as coauthor on one paper from your dissertation so that he might remain competitive for funds, but it didn’t matter which paper he was on (preferentially one where he had the most intellectual involvement). But everything else could be single-authored if you did all the work yourself, and that was encouraged.
That’s a rare model today, but back then the model of analysis (mainframe SAS code, several aspen trees’ worth of output) and writing (14″ yellow legal pads, typewriter or after-hours access to the single departmental word processor) was also inherently different, and much more of a solitary activity. Versus today, where graduate students might receive feedback multiple times on analyses and early-version manuscript drafts, so the work today is also inherently more of a communal effort, and multi-authorship is perhaps more appropriate?
For papers that inspire me, I like to read author contribution statements. Whose idea was it? Who did the statistical analysis (if that was impressive)? Who was the lead writer? And who simply provided lab space, funding, and approved the final version of the manuscript?
Thanks for stopping by to elaborate Todd, and for sharing your experiences. Interesting suggestion that advances in technology have made our ways of working less solitary.
Re: liking to read author contribution statements, according to an old reader poll we did, that’s by far the most common reason people read author contribution statements: just to satisfy their own personal curiosity about who did what. Second (and the only other reason cited by more than a few percent of respondents) was wanting to know who did what so you can email them to ask a question about that bit of the paper. Which I’d have thought was what corresponding authors were for, but whatever. Basically nobody reads them for purposes of apportioning credit or responsibility (e.g., when evaluating applicants for faculty jobs, or student fellowships, or research grants, or etc.)
Very nice topic! Futurology is always risky, but I would dare say that three other practices will change in the coming decades. First, something that is already changing: people will need to explain their data management and statistical analysis better in published papers, and even provide code for all routines used. Second, the practice of charging for academic publishing on both ends will probably end, and readers will have free or at least cheap access to all papers (as happened with music). Third, there will be more control on how people in positions of power treat colleagues lower in the academic hierarchy.
“the practice of charging for academic publishing on both ends will probably end, and readers will have free or at least cheap access to all papers (as happened with music). ”
Bold!
“there will be more control on how people in positions of power treat colleagues lower in the academic hierarchy.”
Can you elaborate?
“Bold!”
I think it’s a global trend in many sectors of society: users should not pay or at least pay small charges for consuming cultural content (music, videos, podcasts, news and, maybe, academic papers). There are other ways to make money (advertisement, big data etc.) and keep the publishing industry profitable. It’s only a matter of disrupting the current business model and creating a new one. SciHub is the Napster of academia.
“Can you elaborate?”
Sure! Academics in tenured positions (i.e., professors) have huge power over their subordinates (i.e., non-tenured professors, lecturers, research associates, postdocs, technicians, and students). And, worryingly, there is little external control on their practices. That’s why cases of sexual harassment, moral harassment, racial prejudice, xenophobia, and bad team management abound in universities. I know things are slowly changing in developed countries, but we are far behind in the rest of the world, and bizarre cases emerge every day.
“That’s why cases of sexual harassment, moral harassment, racial prejudice, xenophobia, and bad team management abound in universities.”
The UK abolished tenure in 1987. It would be interesting to see if that had any effect on these behaviors.
Really? I thought there were still full professors with tenure in the UK.
The only tenured professors in the UK are those that achieved it before 1987. The last tenured professor just retired from my Dept. He was over 70 (until recently all academics were made to retire at 65). We still have full professors, they just don’t have tenure.
In line with your point on animal welfare, it seems that we might soon extend ethical considerations to small critters (invertebrates), and even, some day, plants, as we realize that sensory systems are more widespread. It will be much more complicated to do experimental biology then…
A bold prediction!
I doubt animal welfare regs will ever extend to microbes or plants. Certainly not within the next few decades. Recall that Switzerland (? I think; it was some European country) proposed to do this a few years ago. IIRC, the regulation was broad and vague and wasn’t to do with pain and suffering. It was something about every organism, including plants, being entitled to live in the way that it was by nature meant to live, or something. The regulation was widely mocked and quickly withdrawn.
And gardening and farming are both popular and as far as I know almost nobody has an ethical problem with those activities because they hurt plants. Very hard for me to imagine the future discovery about plant sensory systems that would change that view. And if gardening and farming are ethically ok, then it’s hard to imagine scientific research on plants being considered unethical because it hurts plants.
Same for insects, other inverts, and microbes. It’s currently legal, and considered ethically ok, for ordinary people to purchase poisons to kill them indiscriminately, and you don’t even have to have a reason beyond “I felt like it”. So there’s a looong way to go before animal welfare regs get extended to insects or slugs or whatever.
As someone who works with protists, I sure hope animal welfare regs never get extended to microbes! I sometimes joke that I grow up huge populations of protists and then pour them down the drain just to hear their tiny screams.
Octopus research already requires IACUC approval in a few countries so extension to some other invertebrates is possible. As for vertebrates, research on birds I did in grad school not so long ago (1994-2004) is now impossible, as it requires multiple surgeries on the same animal, OK back then, a big no-no today.
It’s interesting to me that octopuses and other animals show up on menus even in places that regulate their use in research…
Also, go to Google Scholar and search for “ethics creep”. It is all about humans and IRBs going wild, but it is happening with animals and IACUCs as well, more under the radar. And there is nobody to complain to about your IACUC if it is full of morons or PETA activists. Essentially all teaching labs that used live animals in lab classes are now shut down, while field classes are next in their sights.
Via Twitter; I’m putting it here because it fits this subthread:
Using undergrads as menial labor (cleaning, pipetting, data entry etc) without paying them. [Giving them real scientific experience, mentorship, and opportunities to provide author worthy contributions or learn complex/marketable skills might be seen differently].
Hmm, interesting suggestion. You might be right.
Terry McGlynn has a related argument that using high school students as menial volunteer labor in university labs is unethical. On the grounds that the high school students who volunteer are mostly students who were born into lots of advantages already, and who are just looking for an (actually meaningless, but nonetheless good-looking) line on their resumes for when they apply to selective colleges and universities.
Unpaid internships in conservation biology are a related example, though with some differences. Many of those internships aren’t totally menial and do provide real experience. But the arguments, as I understand them, are that (i) it’s exploitative for conservation organizations to take advantage of the fact that lots of people would like careers in conservation and are willing to do unpaid internships if that’s what it takes to get a foot in the door, and (ii) it’s discriminatory for conservation orgs to staff entry-level unpaid internships only with the people from well-off backgrounds who can afford to take such internships. There are counterarguments, though, and I wouldn’t venture to predict which side will win the normative argument over the next few decades.
I appreciate Chris MacQuarrie’s comment re: ethics of territoriality for indigenous peoples in the places where we do research (and anticipate/very much hope that will extend beyond Canada). I also anticipate there will be a swing toward widespread acceptance, perhaps outright expectation, of public engagement by scientists. This would likely entail a shift toward norms such as explicit and dedicated courses and other types of training for most or all students. I still very much appreciate the numerous sides of the “need every scientist do scicomm?” discussion. And yet, the surge of interest in scicomm training, the science of scicomm, etc., across career levels and career types, in the Ecological Society of America (as just one example), suggests to me that this is a rising trend that’s gaining momentum. And, externally, we see increasing pressure to meet public scrutiny in compelling ways that may require training and practice to become accomplished at. For example, see the recent research priorities relating to scicomm released by the National Academy of Sciences.
Interesting suggestion! Very curious what others think of this one.
EDIT (sorry, hit post too soon): I also wonder if this is a case where ethical norms will appear to change, but people’s behaviors won’t actually change all that much. For instance, I can imagine a future in which it’s expected that every scientist do some sort of scicomm, but in practice the majority of scientists just do token scicomm, or do scicomm in ways that look good on paper but don’t actually accomplish anything. A world in which, say, everybody’s expected to write a popsci blog post about every paper they publish is also a world in which the vast majority of those blog posts are read by approximately no one.
Rather than a norm that expects every scientist to do lots of (pointless, ineffective, one-more-drop-in-a-sea-of-content) scicomm, I personally would rather live in a world in which some scientists do it, and are really good at it, and are rewarded and respected appropriately for it. Which arguably isn’t too different from the world we live in now, actually…
Jeremy, yes, I agree, scicomm and engagement done well is the standard I’d like to see, now and in the future. And, some people do that really well. What I do sense is changing, though, are the overarching perspectives re whether or not scicomm is “worth it,” whether it’s a civic responsibility of scientists, etc. And, I hear more about students pushing for training, opportunities, and projects integrating scicomm, even in spite of reluctance from advisors at times. Whether that means more people doing token scicomm or more people doing meaningful scicomm and engagement, may be what we can debate as a prediction. Or, perhaps, what we can look for in the future is an increase in funders holding grant proposals and recipients accountable for more meaningful (and scientifically informed/quantifiably effective) scicomm work. And, this may arise at least in part in response to bottom-up expectations, like some of the other predictions being made in this comment thread and blog post.
Via Twitter, somebody has *very* strong feelings about scientific graphics. 🙂
Here’s a provocative one, via Twitter: a prediction that one day it will be considered unethical to run a big lab:
This one would really require an entire post to respond to, I think…
Here’s one: will it someday be considered discriminatory for even small sets of people to be homogeneous with respect to gender, race, etc.? There certainly seems to be an increasing view that it’s not ethically ok for, say, a symposium to include only male speakers even if it’s only 4 people or something.
As another example, I’m chair of the ASN Jasper Loftus-Hills Young Investigator Awards committee this year. Each year four awards are given out. Last year, I noted that the award had gone to four women for the first time, and had gone to a roughly gender-balanced mix of awardees in recent years (having gone mostly to men before that). I expressed the view that in the long run the award would, and should, continue to go to a gender-balanced mix of awardees even if the gender balance in any given year was skewed. In a change from the past, I think the norm now is that it would not be ok for the award to consistently go mostly to people of one gender. But will things shift further in future, to the point where it would be considered ethically obligatory for the award to go to two men and two women every year?
Meghan did a post a while back on the suggestion that each of us should calculate the gender balance of our co-authors and labmates, and be concerned about our own biases if it’s unbalanced (https://dynamicecology.wordpress.com/2014/01/27/my-gender-gap/). At the time, neither Meghan nor I thought the imbalance in our “personal gender gaps” was at all problematic. But will there come a time in future in which you’ll be considered a bad person if you have a “personal gender gap” in your co-authors, trainees, or supervisors? I don’t think so, but I wonder.
I would predict, if anything, that identity politics is likely to wane not only in science, but in society generally. The primary reason for this is that very often identity politics fails, miserably. Exhibit A: During the 35 years of Affirmative Action, black and Hispanic enrollments have steadily declined at top American universities. We’ve seen signs of a new trend departing from the failures of identity politics. Exhibit B: Michigan’s Supreme Court victory in lessening the constraints of Affirmative Action on law school admissions.
I should also like to point out that it is not always as easy as it might seem to be “inclusive,” even when we try to be. In 2016 I organized and moderated a symposium at a national-level conference. There were initially 6 speakers (two female, no minorities). One woman withdrew due to a scheduling conflict. I contacted twice as many women as men to be potential speakers, and ended up with a 4:1 male:female ratio. In 2017 I organized a symposium with 8 speakers- and had one female and no minorities… and it was not for a lack of trying. I am beginning to suspect that there might be some aversion to women/minorities engaging in these programs when a knuckle-dragging white man is the one doing the organizing…
I believe that in general our culture is moving away from gender-based and minority-based preferences, in large part because people (especially the under-35 crowd) are “over it”. That is to say, we are approaching a point where a majority of our citizenry no longer pigeon-holes people on the basis of gender and race, and that people of all genders and races are simply recognized “for the content of their character”.
Hmm, I don’t have a problem with gender imbalance when it goes in favor of the historically underrepresented group. I’d like to think we’d eventually reach a point where women have too much power/acknowledgement/success in a STEM field, but we have a long way to go.
Paying grad students and postdocs less than is required for reasonable living standards.
http://www.nature.com/nature/journal/v549/n7671/full/nj7671-297a.html
Personally, I would love to see more equity along these lines, but I wonder if this is a red herring? Most often, research/ teaching assistantships also include a tuition waiver. The tuition waiver is no small thing, considering tuition costs today. If we factor in graduate and non-resident status, gee whiz.
I say red herring because the more salient issue might be tuition costs in and of themselves. Feel the Bern…
Via Twitter:
This isn’t exactly about ethics, but I wonder how much longer the bulk of scientific research will be done in academia. The ethics part is that universities are relying more heavily on part-time faculty to teach courses, and many of those faculty are trying to maintain research projects that will help them get full-time jobs, putting an unfair burden on PT faculty. At some point, I wonder if those folks will shift more and more towards non-profits and other institutes that support research.
The shift you mention I have personally made, and I have never been happier, at least career-wise. My sense (I have no data to support it) is that many highly qualified academics are leaving academia for a variety of reasons. In general though I would say it happens due to the imposition of a highly selective phenotype among the faculty, and one that many of us find repulsive, myself included. (Hint: Mr./Ms. Politically Correct…)
Re: unis relying more on part-time faculty, that’s actually not true in the US except at certain types of teaching institutions. The boom in community colleges back in the 70s and 80s, and the subsequent boom in private for-profit colleges (now possibly going into reverse, thankfully) led to a lot of part-time people being hired at those institutions. That’s why part-time faculty have grown in absolute terms, and as a percentage of all faculty at all institutions. But other sorts of institutions actually haven’t seen more than very modest drops in full-time faculty, either per-student or as a percentage of all their faculty. We linked to these data in an old linkfest, will have to see if I can find it again.
Having fieldwork sites on the other side of the planet;
Flying to fieldwork sites;
Crossing the ocean multiple times a year for work while writing about the impacts of climate change;
Single use coffee cups at conferences…
Re: local field sites, Meghan has an old post in praise of them, although the issue of the C footprint of travel to field sites isn’t one that came up as far as I recall: https://dynamicecology.wordpress.com/2013/08/28/in-praise-of-boring-local-field-sites/
Personally I’d be very surprised if the ethical norm that it’s ok to have faraway field sites (or to fly to faraway conferences at least once in a while) changes because of C footprint concerns. But I’ve been wrong before!
Via Twitter, a similar thought to F’s above:
I would say that curtailing one’s travel schedule to conferences, or restricting one’s range concerning field work, is, while well-intended, nothing more than window dressing. Moving into an alternative home, like a tiny house or earthen-walled structure, that has the potential of reducing your fossil fuel consumption by more than 50% would be far more worthwhile. Eliminating the conventional refrigerator, freezer, convection oven, washer, and dryer might have an impact too. But I think what you and others suggest by way of frugality in our careers is largely symbolic and is not likely to curtail pollution or global warming.
Via Twitter:
As I said in the post, I think the “deposit your data” norm has already changed. I doubt that depositing a preprint of your as-yet-unreviewed work will come to be regarded as an ethical obligation (and personally, I don’t think it should be regarded as an ethical obligation). As far as I know, it’s not even widely regarded as an ethical obligation in fields in which use of preprints is widespread and longstanding, like economics. As far as I know, someone who didn’t share preprints of their work in those fields would be regarded as odd and as possibly hurting their own career by making it harder for others to find their work. But they wouldn’t be widely regarded as unethical.
But we’ll see!
A point from my post draft that got left on the cutting room floor: it’s also interesting to think about changes over time in what matters are regarded as matters of ethics at all. Depositing preprints of your work is a possible example. I think for a lot of people (including me), that’s not a matter of ethics at all, because it’s viewed as a personal choice that doesn’t have sufficiently serious consequences for anyone else for it to be an ethical matter. Like how it’s not a matter of ethics if you prefer to wear blue shirts rather than red ones. But of course, some things that previously weren’t widely regarded as ethical matters come to be regarded as ethical matters, for instance because people become newly aware of the consequences of some action. And some things that previously were regarded as ethical matters can cease to be regarded as ethical matters. Think for instance of religious prohibitions on certain behaviors, prohibitions that are no longer widely seen as having any rationale. So many people have stopped regarding the choice as to whether or not to engage in those behaviors as a matter of ethics at all.
I have an old and poor post that tried to articulate this and that polled readers on which scientific matters have an ethical dimension at all, vs. which ones are just not ethical issues (like how your choice of shirt color isn’t an ethical issue).
I suspect that, if a behavior isn’t widely regarded as a matter of ethics at all, it’s especially hard to get people to change that behavior via appeals to ethics. Say the consequences of an action for others are so non-obvious, indirect, or diffuse that people don’t see the choice as to whether to engage in the action as a matter of ethics at all. In that case, it seems like it’ll be hard to convince people not only that the choice to engage in the action *is* a matter of ethics, but that the action is unethical (or ethically obligatory). That’s why I think it’ll be hard to convince scientists that depositing preprints is an ethical obligation.
Via Twitter:
As with many of the tweeted answers, I’m unsure if this is a prediction of a future norm change, or a hope that this will become the new ethical norm.
As we discussed above regarding welfare of animals and other organisms, I’m also unclear on the scope of this norm (there being only 140 characters in a tweet). Will it only become the norm for prey species currently covered by animal welfare regs? Will it become the norm for any living organism that might be consumed by another, including plants and microbes? Or what?
While not wanting to pooh-pooh advocacy for animals entrusted to our care, I can share several lessons from my time using mouse models in my research:
1) By and large, the animals used for research have been modified to such an extent that they could not survive in the wild (i.e., white mice). There are exceptions, such as the chimpanzee, but primates have mostly been phased out of research programs.
2) Animals receive very good care in research settings. They have ample food, water, lodging, medical care, climate control, and so on. So, for example, I had mice that survived almost two years, far beyond their natural lifespan.
3) Decisions regarding animals used in research are objective, not emotional. The chimpanzee is an excellent example of this principle. For a long time chimps were used in research even as animal welfare groups protested the practice. Chimps were not phased out due to protests. Rather, they were phased out because there was no longer an objective scientific justification to continue the practice.
4) The research and development culture is for the most part tone-deaf when it comes to animal welfare groups and their related activities. I always found this dynamic somewhat unique. I cannot think of another example where the group that is protested against essentially tunes out the opposition. But believe me, at least within the research culture itself these groups might as well be on Mars.
5) The use of animal models in research is always the “last choice”. That is, if it is possible to test a hypothesis using tissue culture, yeast, bacteria, or what have you, then those options will be the de facto choices. Animals and their related care cost money and labor, period.
6) I do not believe researchers are immune to the pain and suffering of the animals used in research. Quite the opposite, I’d say. I used mice for a 5-year project on developmental abnormalities in humans, and then for another 2 years involving cancer. Simply, the conditions observed in humans were expressed in mice, and yes, those mice suffered considerably. I was not immune to their suffering. Rather, the empathy I developed for my mice was extended to the millions of human patients I never met.
The use of whole animal models in research is not perfect, but I find the alternatives woefully unacceptable. If you disagree then I would suggest volunteering at a children’s hospital…
A tweeter after Meghan’s own heart:
(Meghan hates when people write things like “widely-accepted” rather than “widely accepted”)
We haven’t talked much about scientific practices currently regarded as unethical that will someday come to be regarded as ethical. Spousal hiring is an example of a practice once regarded as unethical coming to be regarded as ethical. Relaxation of authorship standards now allows as authors people whom it previously would’ve been regarded as unethical to include as authors. And maybe self-promotion is an example that’s in the process of shifting? I think it’s increasingly seen as ok to talk up your own work in ways that decades ago would not have been seen as ok. We have an old post on self-promotion in science: https://dynamicecology.wordpress.com/2014/08/04/poll-what-constitutes-self-promotion-in-science/
But none of those examples are specific to science. Having trouble thinking of a specifically scientific ethical prohibition that will be relaxed or reversed in future.
I would argue we’ve relaxed the amount of understanding one should have about statistical methods before using them in published research. I think the level of understanding that’s typical today would have been considered unethical 30 years ago. Not saying that in a judgmental way: the tools have gotten better and, more importantly, it is impossible for most people to be expert in all the methods reviewers are likely to expect them to use, in contrast to 30 years ago.
Very interesting remark.
Via Twitter (link goes to one tweet in a series):
As I hope the original post makes clear, my intent was not to produce a comprehensive list of current scientific practices that are unethical. My interest for purposes of this post was descriptive, not evaluative–asking what current norms will change, not which should change. I of course recognize that many readers will not share that interest, and would prefer to discuss which norms should change, which is fine.
I would say A.I. is and will continue to be a significant game changer and may just render any concept of human ethics obsolete.
It wasn’t too long ago that using rotenone or dynamite to get a complete fish community sample by killing every last one in the reach was not an unusual practice in my neck of the woods. The fisheries management agencies are still pretty liberal about using chemical eradication methods when they decide the ecology of a lake needs a reset for proper “management.”
Working alone.
Both the rise of collaborative research and the increasing legal oversight of many scientific activities (I’m thinking specifically of field work and lab work) are pushing towards more science being done by groups rather than individuals. There are already bans on doing some activities without another person near enough to help you if things go horribly wrong (almost all field work included, at least on paper), and those bans are typically justified by the fear of lawsuits should a person be injured in a way that could have been prevented if another person had been nearby. This is often backed up with ‘common-sense’ arguments about personal safety and vague descriptions of the terrible consequences of something going wrong when nobody is there to help you.
People will still walk off into the forest alone, but they will be increasingly questioned about this behaviour by their colleagues – “What were you thinking? What if you got hurt?”. Lab procedures will increasingly require pre-arranged mutual assistance discussions – “I’ll stick around tonight until your last sample pops out, and you’ll be there to put labels on vials on the weekend for me”.
When was the last time you took a safety training course? Pretty much every such course I’ve taken, including the one to get everybody up to speed on recent changes to the WHMIS labelling system, has included some statement about when and how it’s acceptable to work alone. Yes, it’s probably a good idea to have a buddy when you’re handling HF, and anything that could be a confined space with all of its associated hazards deserves a permit, but there are many more activities that are (in my mind) unjustifiably required to be conducted by no fewer than two people. Furthermore, I sense an increasing attention being paid to mental-health risks associated with long or frequent periods spent alone. Will that scientist who spends all day at the far end of the field site and all night grinding out data analyses be regarded as a suicide risk?
The ethical argument that nobody should be alone is too easy to make, and too hard to argue against.
So, my prediction is that single-author papers will be increasingly rare. This will be driven most obviously by the need to work with other people of complementary expertise in order to create a publishable paper, but a secondary reason will be the decreasing number of activities that it will be considered acceptable to conduct alone. Even writing, given the notices to avoid staying in the office past a certain time of day, especially if one is a woman.
Interesting speculation. I do think it’s speculative, in the sense that the prediction won’t come true until far in the future, if it ever does. For instance, I don’t see any sign yet that rightly increased concern about mental health in academia is turning into an overreaction that defines any and all solo late-night work as mentally unhealthy. And at least at my own uni, our regulations on working alone haven’t (yet?) crossed over into requiring a buddy for work that by any reasonable standard has no need for a buddy.
I do broadly agree with you that increasing regulation in the name of improving safety is hard to argue against. My uni certainly has its share of “safety theatre” regulations that do nothing to improve actual safety, not even by creating a “climate” in which safety is taken seriously. And the attitude in some corners at our uni seems to be that unless more safety regulations are being brought in every year, safety isn’t being taken seriously.