Inference and being wrong in a post-truth era

There’s been a confluence of recent events that has got me thinking about truth and facts. First of all, I’ve been thinking a lot about how we move forward, as scientists, when there’s this steaming mess going on. Secondly, there’s been a series of really honest blog posts in which some amazing, respected scientists talk about the times they’ve found errors in their published work. Thirdly, a student member of an online community of academic women that I’m part of posted about a severe dressing-down she got from her adviser over what appeared to be a few minor errors; she worried she’d undermined her professional credibility, and possibly her whole career. Finally, I’ve been preparing for another offering of my quantitative methods course, and as such, am thinking about how best to talk to students about data, inference, and what it all means.


Sci, Robot.

It’s important, now more than ever, that scientists be willing to stand up against misinformation. We are knee-deep in it, and friends, we’re the ones holding the shovels. But we don’t always, because it’s exhausting and often makes us vulnerable. We live in an era where scientists are not particularly trusted. When we are trusted, we’re seen as inhuman. And we’re certainly not understood: who can forget how our research is openly mocked with reductio ad absurdum arguments by people with political agendas who want to disprove our work[1] or undermine science in general (a situation that, I fear, is only going to get worse over the next few years)? Add into the mix a funding climate leading to cutthroat competition for positions,[2] stir in the scientific enterprise’s (essential and important) self-criticism, and it’s no wonder that scientists are hesitant to engage. In short, it’s soul-crushing, and the stakes are high.

This puts a lot of pressure on scientists to be right.

But I believe this climate is hurting us in a lot of ways. It breeds toxic levels of perfectionism. In the example I mentioned above, we have a graduate student worried about her career because of a typo. I’ve often seen students I’m working with get stuck on data problems because they’re afraid of doing the wrong thing. Heck, I’m struggling to get the words out right now, because I’m afraid that someone will read what I write and think I’m advocating for sloppy science.[3]

But how much value does science gain when its practitioners are paralyzed into inaction? And even more pressing, how do we encourage more diverse contributions in science when people trying to join the community are shunned when they misstep?

This whole business of seeing scientists as robots, and expecting scientists to be robots, even from within, has, I believe, a much more insidious consequence as well. It sets up the expectation that science, all science, every preliminary study, is absolute. And we, as practitioners, know that’s not true. When science is seen as absolute and invariable, it sets itself up as a convenient straw man for people with anti-science agendas. For example, anthropogenic climate change is happening.[4] If you’re reading this blog, you probably agree with this statement. But there’s a controversy, and the so-called Antarctic cooling is often cited by team Give-The-Planet-An-Uninvited-Sweaty-Hug[5] as a point to ‘disprove’ global climate change.[6]

So, no. Obviously no. Thinking that an overwhelming body of evidence can be brought down by localized variation in trends? And that’s not even to mention the roles that interpretation, choice of statistics, and even method of observation play. This is a problem, and I can’t help but feel it is up to me, at least in a small part, to fix it. My gut says this is something we can only address by helping people really understand how science is done, and by inviting more people in.

So back to my course, my corner of influence. Because this course advocates an open science approach, it pushes students into a paradigm where they will be publicly wrong, or at least incomplete and unpolished. As I mentioned, this is something not all scientists take kindly to. But I know I’ve had a lot more luck convincing people of my sometimes controversial conclusions when I open up and show them the steps I took to get there. This is why I applaud the scientists who are open about their missteps and mistakes: these people are teaching the world more about science and process than any ‘perfect’ paper ever could.

In my own work, I’ve adopted something of a radical openness. The projects I’m leading right now are out there for the world to see, from the time I create the first file and start dumping stream-of-consciousness comments in about what I intend to do, through each bump in the road. It is my hope that through this openness and transparency, people will feel invited to build on my work, validate the robustness of what I conclude, and use these ideas to help, in a small way, understand and buffer change in the world.[7]

Seventeen students have chosen to join me on this journey next semester. I hope to give them what they need to save the world, too. No pressure.

  1. The blog title makes it easy for you to do this with my work. Since really transitioning to quantitative ecology full time, you might even say I use a computer to count imaginary bugs.
  2. #operationhiremeplease2017
  3. See footnote #2. I’m not. I’m meticulous.
  4. I do hope that NASA article I linked isn’t turned into an ad for a certain prominent brand of luxury hotel in newly coastal Ames, Iowa anytime soon.
  5. This was the most PG glib nickname for this group I could muster.
  6. Incidentally, I was at a conference in South Africa in October, where Michael Gooseff presented data from the McMurdo Dry Valleys showing this apparent slight cooling trend from 1978 to 1998, and then the rapid increase since. I was most intrigued by the inflection point, as a person interested in breakpoint analysis naturally would be, so stay tuned for some upcoming work in that area. I’ve started a new project. 🙂[8]
  7. And I mean this quite literally. The project I’ve linked to is developing a tool to understand when changes occur in dynamic populations, using legacy population data. It’s really hard to determine, in an unbiased way, when and how the factors regulating a population changed, and without knowing that, it’s pretty hard to mitigate those factors.
  8. I’m being all secretive about this to create intrigue but if you’re motivated you can check out exactly what I’m doing on my github. 🙂 🙂
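For the curious, the flavor of breakpoint analysis I keep alluding to can be sketched very simply. This is an illustrative toy, not the actual method in the project I linked (names and data here are made up): fit two straight lines to a series and pick the split point that minimizes the combined squared error.

```python
import numpy as np

def find_breakpoint(x, y, min_seg=3):
    """Return the index that best splits (x, y) into two linear segments,
    chosen to minimize the combined sum of squared residuals."""
    best_idx, best_sse = None, np.inf
    for i in range(min_seg, len(x) - min_seg):
        sse = 0.0
        for xs, ys in ((x[:i], y[:i]), (x[i:], y[i:])):
            slope, intercept = np.polyfit(xs, ys, 1)  # ordinary least-squares line
            sse += np.sum((ys - (slope * xs + intercept)) ** 2)
        if sse < best_sse:
            best_idx, best_sse = i, sse
    return best_idx

# Toy data: a slight cooling trend that flips to rapid warming at year 20
years = np.arange(40)
temps = np.where(years < 20, -0.05 * years, -2.0 + 0.1 * (years - 20))
print(years[find_breakpoint(years, temps)])  # → 20
```

In real work I’d reach for a proper segmented-regression or changepoint-detection package, and test whether the breakpoint is statistically meaningful rather than taking the best-fit split at face value.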

About cbahlai

Hi! I'm Christie and I'm an applied quantitative ecologist and new professor. I am an #otherpeoplesdata wrangler, stats enthusiast, and, of course, a bug counter. I cohabitate with five other vertebrates: one spouse, one third grader, one preschooler and two cats.

