“Truth is mighty and will prevail. There is nothing wrong with this, except that it ain’t so.” –Mark Twain
“It doesn’t matter how beautiful your theory is, it doesn’t matter how smart you are. If it doesn’t agree with experiment, it’s wrong.” –Richard Feynman
Every day that you set forth in the world is a new opportunity to learn something about it. Every new observation that you make, every new test you perform, every novel encounter or piece of information you pick up is a new chance to be a scientist.
You have a conception of how things work in this world. You’ve pieced it together as a combination of your experiences, your knowledge, and the working hypotheses that you’ve accepted as the best mirror of reality. And every new shred of evidence you pick up about reality interrogates these hypotheses, daring your picture of reality to hold up to this level of scrutiny.
No matter who you are, no matter how smart you are, no matter how brilliantly you’ve drawn the conclusions you’ve drawn from the evidence you’ve gathered, there will come a moment when the evidence you encounter is irreconcilable with the picture of reality you presently hold. And when that moment happens, your response will mean absolutely everything.
Because there is the possibility that your view of reality — the way you make sense of things — is flawed in some way. You have to open yourself up to at least the possibility that you are wrong. It is a humbling admission, that you may be wrong, but it’s also the most freeing thing in the world. Because if you can be wrong about something, then you can learn.
The discovery that planets move about the Sun in ellipses required exactly that; were it not for Kepler and his ability to accept that his earlier models were flawed, and then abandon them and create new and improved ones, physics and astronomy would likely have been set back an entire generation. And if you, yourself, can do this in your own life, you can find a better explanation for the phenomena you encounter in this world. You can bring your understanding of the world more closely in line with what reality actually is. In other words, you can do what all good scientists do, and in the end, learn something amazing.
But if you can’t admit that you might be wrong, if your picture of reality is unchangeable despite any evidence to the contrary, if you refuse to assimilate new information and new knowledge and re-evaluate your prior stance on an issue, then you will never learn.
Perhaps as an adult you’re entitled to that right; you are, after all, free to believe whatever you want. But if you’re a student in school? Your job is to learn. If you don’t do your job, particularly if you don’t even try to do your job, it’s your teacher’s duty — and I would say responsibility — to fail you.
At least, it should be. Recently, some incredibly appalling things have been happening in education that completely undermine this, including the banning of the words ‘dinosaur’ and ‘evolution’ from standardized tests and the passage of Tennessee’s “academic freedom” bill, which allows teachers to teach scientifically counterfactual information to their students about biological evolution and climate science, among other topics.
And this is unfathomable to me. See that creature above? That’s a black wolf. Know what’s interesting to me about it? The black wolf doesn’t occur in nature! The mutation for black fur did not arise until after the domestic dog had been in existence for thousands of years. If you ever see a black wolf, that tells you that at some point in its lineal history, a wolf bred with a domestic dog that carried that (dominant) black-fur mutation.
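The key word above is “dominant”: a single copy of the black-fur allele is enough to produce a black coat, which is why the trait shows up so readily once a wolf-dog hybrid breeds back into a wolf population. A minimal sketch of that logic — with hypothetical allele labels (`B` for the dominant black allele, `g` for wild-type gray; the real locus and its names are not given in this post):

```python
import random

def offspring_coat(parent1, parent2):
    """Each parent passes one randomly chosen allele to the pup.
    'B' (black) is dominant over 'g' (gray), so one copy suffices."""
    genotype = random.choice(parent1) + random.choice(parent2)
    return "black" if "B" in genotype else "gray"

random.seed(42)
wolf = "gg"    # wild-type gray wolf: no black allele at all
hybrid = "Bg"  # wolf-dog hybrid carrying one dominant black allele

# Back-cross the hybrid into the wolf population: roughly half the pups
# come out black, because inheriting the single dominant copy is enough.
pups = [offspring_coat(hybrid, wolf) for _ in range(10000)]
frac_black = pups.count("black") / len(pups)
print(f"fraction of black pups: {frac_black:.2f}")
```

This is only a toy single-locus model, but it shows why a dominant mutation can persist visibly in wolf lineages long after a single hybridization event.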
Biology, of course, doesn’t stop with evolution. What I just explained to you is an explanation that requires genetics to understand, which is encoded in an organism’s DNA. But before you get to DNA, before you even get to genetics, at a more basic level you must have an understanding of evolution. If you want to understand disease: evolution. If you want to understand whales and dolphins: evolution. (I mean come on, they’ve got freakin’ leg bones!)
Same deal with global warming; there are plenty of people asserting that the Earth isn’t warming anymore (yes, there really are), despite all studies showing that it totally is, if you look at the data without cheating. For example, last year (2011) was “only” the 11th-warmest year since records began in 1880. But last year was also a La Niña year, which is characterized by cooler global temperatures. It was, in fact, the hottest La Niña year on record.
The question I always ask people who dig in even deeper when their view on an issue is challenged by new data is the following:
What evidence would it take to change your mind on this issue?
For the “Is the Earth continuing to warm” question, you may very well get your wish in 2012 or 2013; one of the next two years could easily become the new warmest-year-on-record.
Believe it or not, it’s actually harder for many of us to admit that we could be wrong about something the less we know about it! Why’s that? A neat little psychological effect known as the Dunning-Kruger effect. In a nutshell, it says that people who are incompetent at something (e.g., biology, climate science, etc.) lack the very skills necessary to evaluate the fact that they are incompetent!
This results in people who know almost nothing about a particular topic who are willing to opine at length, argue with experts, and declare — incorrectly — that they are right and you are an idiot. Here’s the original graph from the original Dunning-Kruger paper, illustrating exactly that.
But if we recognize that our present understanding may not be the final answer, and we can absorb that ego-bruise from possibly not being in the right when we thought we were, we can step forward. There are plenty of people working to help make it easier for us all to do exactly that. I’m not exempt from this either, even in areas where my knowledge actually is far above average. Last week, I wrote about when ultramassive stars die, and a number of people challenged some of the contentions I made. Yes, some of them may have been jerks about it, but they also had information that I didn’t. Despite being a theoretical astrophysicist, I don’t know all there is to know about all aspects of astrophysics, and I never will.
So I went out and learned what it was that I didn’t know, and now my picture of how ultramassive stars die is — while possibly still imperfect — improved over what it was. And the next time I go to explain it, there will be at least two things that I can do a better, more accurate job of explaining, and there will be at least one misstep I won’t make again.
It doesn’t make me any less of a person or any less of a scientist that I didn’t get everything right the first time I put it all together; on the contrary, it makes me human. I’ve been refining what I know and how things make sense to me my entire life, and I’ll continue to do that tomorrow. There is no part of that picture of reality that I hold so dear that overwhelming evidence to the contrary couldn’t change my mind. I would be surprised at a great number of things, but I wouldn’t be stuck.
I know exactly what types of evidence would change my mind about the theories, hypotheses and ideas that make up my world view. Remember the words of Carl Sagan:
When you make the finding yourself — even if you’re the last person on Earth to see the light — you’ll never forget it.
I hope that I never reach the point where I think I’m always right; I hope I can always gather new information and knowledge, have that crisis when my preconceptions conflict with new data, and admit when I was wrong. Because I don’t want to ever stop learning; no matter how much I know, there’s always going to be a whole Universe out there to explore.