Friday links: pay an undergrad to critique your teaching, apps for field biologists, meiosis > John Rawls, and more

From Jeremy:

Want to become a better teacher? Here’s an idea: pay a thoughtful, experienced undergrad to sit in on your class and critique your teaching. Political science prof Henry Farrell tried it and says it worked wonders for him.

A woman in Oklahoma has twice had her house destroyed by a tornado. When something like this happens, our first reaction often is to say “Wow–what are the odds of that?” Stats blog Normal Deviate asks, well, what are the odds of that? It’s actually not easy to say, in part because you have to be careful exactly how you pose the question. It occurs to me that this could be a good question to pose in undergrad statistics classes.

Is there such a thing as an optimal level of civility in writing? If so, what is it?

After the people who are developing MOOCs get done disrupting higher education, maybe they’ll move on to…disrupting organ transplants. Very funny satire.

Finally, can’t imagine this will be of interest to anyone but me, but it’s so cool (for “late-night-college-dorm-room-discussion” values of “cool”) that I have to share it. Ace philosopher of biology Samir Okasha, author of an important book on the levels of selection, has a new(ish) paper. In it, he points out an analogy between the notion of a “veil of ignorance” in Rawlsian political philosophy and…fair meiosis! Basically, fair meiosis, an evolutionary device to ensure that all genes work for the “common good”, is a real-world analogue to the “veil of ignorance”, a hypothetical device to ensure that individuals choose the political, social, and economic arrangements that work best for all. Okasha argues that recognition of this analogy sheds light on both philosophical and biological issues.

From Meg:

Emilio Bruna maintains a great list of apps for smartphones that may be of use to field biologists. ht: Karen Lips via Twitter

Hoisted from the comments:

This old post of mine arguing that ecologists should do more research in model systems has a great comment thread, covering all sorts of issues. What makes for a good model system? What are ecology’s best model systems? How many model systems does ecology need? What are the advantages and drawbacks to “question-first” vs. “system-first” approaches to research? If you haven’t read this comment thread, you should.

7 thoughts on “Friday links: pay an undergrad to critique your teaching, apps for field biologists, meiosis > John Rawls, and more”

  1. I would love to hear from more people using the smartphone apps on Emilio Bruna’s list. I’ve been using Fulcrum this field season & I love it, but it’s my very first foray into futuristic data collection, and I often feel slightly awkward (& like a wilderness ethics deviant) conspicuously using an iPhone on hiking trails.

  2. I just had a conversation with my students about why I insist that they record data using pencil and paper in the field and in the lab. I want a written record of every datum.

    Fingers and brains make mistakes when entering data, and the error rate is higher when punching keys than when writing things down. If a funny outlier emerges in an analysis, it’s possible to go back to the written data and see if it was an entry error. If, however, the data were entered digitally in the field and there was an error, we’ll never know. (Was the closest tree really 20 m away, or was it just 2.0?)

    Yes, I’m from the early medieval period.

  3. Terry, I’m not sure collecting data on paper minimizes errors. In fact, there are at least three opportunities for introducing mistakes: (1) person 1 calls out a number and person 2 writes down the wrong one; (2) person 2 writes something that is hard to read; and (3) when transferring to the computer, the wrong number gets typed in. Of these, only #3 can be corrected by looking at the original datasheet, and you would only know to look if it was an outlier. All of these can be reduced with well-constructed datasheets, careful data entry, and rigorous QA/QC, but by collecting data electronically using forms that constrain your input, you can really cut down the error rate. Might be interesting to send teams into the field to collect the same data with the different methods and see which one has the lower error rate…class project maybe?

    The main issue I’ve faced is making sure staff back up electronic files daily in the field and print paper copies of electronic datasets. Of course, if you are somewhere with cell service, you can do this on the fly and breathe a little easier.

  4. #1 happens digitally, too.
    #2 doesn’t happen much if they’re trained and you get a reference writing sample at the back of the databook.
    #3 happens with digital methods in the field probably as often as when transferring from paper to digital, I suspect. But a paper-to-digital transfer error can be detected by checking the datasheet; an error made digitally in the field can’t be.

    I agree that it’s possible there might not be more errors digitally than otherwise. I’m not sure.

    Primarily, I’m anxious about finishing a long field day and then having a battery die, or a wrong finger swipe here or there, and the data no longer exist. (I’ve seen it happen to others twice so far.) And we often do plenty of data collecting when it’s raining.

    I think this falls under “if it floats your boat.”

  5. Agreed on the boat thing. I also think it depends on the type of data being collected and entered. We take height and stem-number measurements on 7,000 plants, and data entry from paper to computer plus error checking takes weeks.

  6. Terry & Emilio, I took the three sources of error very seriously as I embarked on this iPhone data collection adventure.

    I’m lucky enough to do field work in a location that has (patches of) good cell service, short drives between trailheads and towns, and fast internet once I arrive back at park housing at the end of each day. I agree that collecting data like this is not for everyone — it’s not always feasible, and it probably is best to learn the old-fashioned way first. But as a second-year PhD student with lots of field experience and no funding for a field assistant (sequestration ftw!), I found Fulcrum to be an incredible resource, and it has saved me hours and hours of data entry after long field days (allowing me to spend more time fighting with HOBOs!).

    In Fulcrum, I created a form with constrained drop-down menus. First, I input which ridge I am working on (I have three ridge transects); then I input an elevation zone (there are four zones on the north side and four on the south side of each ridge). Each ridge-zone combination has its own species list. My data are synced from the field, or as soon as I get home, and I download each day’s csv file that evening. I’m not completely paperless, though — I have a notebook and camera in the field with me too — and these provide a trail of tricky plant identifications, first flowers, and general field notes on start and finish times, weather, and fellow hikers.

    Anyway, the main thing is, it seems to work for me. Eight weeks into the field season, with over 2,000 individual data points in my Fulcrum account, I’m a happy camper.

    • Sorry for the delay in replying, cnmackenz…I’ve been in the field! I’m glad to hear digital collection has gone well. I think the use of constrained menus was really critical. Even on paper, we should strive to design datasheets so that the options for what we write down are limited, which will minimize mistakes.
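The constrained-menu idea from the thread above can be sketched in a few lines of code. This is a minimal, hypothetical example (the ridge, zone, and species names are invented, as is the 10 m distance cutoff), not the actual Fulcrum form: each ridge-zone combination gets its own species list, and a simple range check catches keying slips like 20 entered for 2.0.

```python
# Hypothetical constrained data-entry check, in the spirit of the
# Fulcrum form described in the comments. All names and the distance
# range are made up for illustration.

# Each (ridge, zone) combination has its own allowed species list.
ALLOWED = {
    ("Ridge A", "North-1"): {"Abies lasiocarpa", "Pinus contorta"},
    ("Ridge A", "South-1"): {"Pinus contorta", "Picea engelmannii"},
}

def validate_record(ridge, zone, species, distance_m):
    """Reject entries outside the constrained menus or a plausible range."""
    key = (ridge, zone)
    if key not in ALLOWED:
        raise ValueError(f"unknown ridge/zone combination: {key}")
    if species not in ALLOWED[key]:
        raise ValueError(f"{species!r} is not on the species list for {key}")
    # A range check catches decimal-point slips (20 m entered for 2.0 m).
    if not 0.0 <= distance_m <= 10.0:
        raise ValueError(f"distance {distance_m} m outside expected range")
    return {"ridge": ridge, "zone": zone,
            "species": species, "distance_m": distance_m}
```

A paper datasheet can enforce the same constraints only passively (pre-printed checkboxes); the electronic form refuses the bad value at the moment of entry.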
