It is four days before the 2024 US presidential election. It could be up to a week before we know the result¹. I don't know about you, but the election has taken up a hefty chunk of my recent online scrollings and offline musings. It's been a bit stressful.
I am a scientist, and these stressful times remind me of the absolute best thing I have learned in my career in science. It's a hard thing to learn, and it really doesn't want to stay learned, and the one thing I can honestly love about the US election is that it burns that one great thing deeper and deeper into my little brain.
I'm talking about uncertainty.
My fellow stacker Ian Leslie has just written a piece advising us to ignore the polls, and indeed to ignore all election news, because it contains zero information. The race is 50-50. No-one knows what will happen. Nothing you can glean from the polls, or the news cycles, or anyone's genius analysis, will tell you what will happen. We just don't know. He manages to work an entire article out of this basic fact, and for one simple reason: it's a basic fact that the human brain utterly rejects. The brain rejects it so instinctively and relentlessly, in fact, that I doubt any single article, no matter how well written and brilliantly argued, is going to have the slightest impact.
That is why this is without doubt the top thing I have learned from being a scientist. It's not just that I am reminded of the meaning and importance of uncertainty on a regular basis in my day job. It's that through repeated exposure I have slowly, over years and decades, come to appreciate just what it means.
That's a wonderful thing about all kinds of learning: there are a lot of simple ideas that are easy to understand. But understanding them is not enough. It takes years for them to sink in. You can hear an idea, you can understand the idea, you can explain the idea to others, but only by running into it and using it again and again and again do you finally say, “Oh, now I get it.”
It's also a frustrating thing. Once you think you've finally got the idea, you think that now you can really explain it properly. You write down your explanation. Then you realise that it's exactly the same as the original explanation. Understanding the idea is different to absorbing the idea and living the idea, and there’s no short cut.
Such is uncertainty.
As an undergraduate in first-year physics labs, I thought uncertainty was the dullest thing imaginable. It was such a dopey thing. You do a measurement. It's not perfect, so you have to report some uncertainty. Just to be honest, I guess. To be on the safe side. Then if you're combining several measurements into a calculation, there is a set of rules to work out what the uncertainty is in the final calculation. It was such a drag.
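For anyone who never sat through those labs: the standard rules (not anything special to my course) say that for independent measurements, absolute uncertainties add in quadrature for sums and differences, and relative uncertainties add in quadrature for products. A minimal sketch in Python, with invented numbers:

```python
import math

def combine_sum(*terms):
    """Sum of independent measurements, each given as (value, sigma).
    Absolute uncertainties add in quadrature."""
    value = sum(v for v, _ in terms)
    sigma = math.sqrt(sum(s ** 2 for _, s in terms))
    return value, sigma

def combine_product(a, b):
    """Product of two independent measurements (value, sigma).
    Relative uncertainties add in quadrature."""
    (va, sa), (vb, sb) = a, b
    value = va * vb
    sigma = abs(value) * math.sqrt((sa / va) ** 2 + (sb / vb) ** 2)
    return value, sigma

# Two made-up length measurements in cm: note the combined sigma
# is less than 0.2 + 0.3, because independent errors partly cancel.
print(combine_sum((12.3, 0.2), (7.9, 0.3)))
```

That last point is the part my undergraduate self missed: the rules aren't bookkeeping, they encode how independent errors actually behave.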
I mean, isn't all this just admitting that you were too lazy to do the experiment properly? Or covering your back because maybe you screwed it up? And why did you have to be so precise about manipulating quantities which were, according to their very name, uncertain? If it's all so uncertain, just make up some number. If you're feeling confident, it's a small number. If you're doubtful, it's big. Who cares?
That was my stupid and naive take on uncertainty. That was before I ran into situations where it matters. In science people do experiments that produce data, and then do calculations on those data, to try to say something that couldn't be said before. When they do that, the uncertainty is often, paradoxically, a measure of certainty. In the 19th century people found that the orbit of Mercury around the Sun drifted in a way that couldn't be explained by Newton's theory of gravity. Well, there are lots of effects to take into account when doing that calculation, so the uncertainty must be pretty big, and in reality maybe the orbit could be explained by Newtonian gravity? No: when you take into account the uncertainty in all of the measurements and all of the approximations in the calculation and all reasonable possible effects, it still doesn't fit with theory. Sorry.
The first triumph of Einstein's general theory of relativity was that its prediction for the orbit of Mercury did agree with measurements. (That's another amazing thing about how we slowly absorb ideas: when I first learned this fact, it was a cute piece of science history. Now when I teach it, and no matter how many times I teach it, tears come to my eyes and it's a struggle to keep speaking.)
That's one side of uncertainty. That's the easy one to grasp. It's all about how we're actually quite certain, and the brain likes that stuff.
The other side is, of course, how we're often uncertain. Sometimes we just don't know. That's hard to accept. It's especially hard to accept if the answer is important.
The answers from a scientific measurement or calculation might not seem especially important compared to, just to make up a random example, the future of US democracy, but the answers are important to the scientist. We really care. And it's really really hard to accept that we don't know. It's hard to write a paper that says, in effect, “I really want this to be true, and I've turned around the numbers every way I can think of to prove that it's true, but it just can't be done. We just don't know.”
That's hard to do, but this happens to us all the time. It's not just in the main results of papers — we thought we had a fantastic massive result, and in the end it was just a small interesting result — but it happens again and again during the whole project, in hundreds of tiny irritating ways and occasionally in huge devastating ways. Every time it happens, I'm reminded: sometimes we just don't know. And every time my ordinary human brain tries to forget.
For me, then, the chaotic events of the last decade have been extremely useful. Trump, Brexit, Covid — every shock was a painfully edifying lesson in uncertainty. Covid was the best, because it just went on and on. It was one prediction proved wrong after another. I knew scientists who waded in confidently with their fanciest data analysis techniques and statistical wizardry, and they all got slapped in the face with the same lesson. Sometimes you just can't know.
I have learned a lot of things in science that I find amazing and incredible and even inspiring. It's hard to find many that have been useful in daily life. The dull, inexorable reality of uncertainty is the big one. How to spot uncertainty, how to wield it, and most importantly how to accept it and even embrace it — those have been among the most useful lessons of my life.
This time next week we should know the outcome of the US election. Commentators will already have decided on the reasons for the outcome. They will have decided that those reasons were obvious, and for anyone who watched the campaign with a dispassionate eye, the outcome should also have been obvious. Within a year they will “know” why the election went the way it did, and why it was inevitable.
They will not have learned the lesson.
I have not entirely learned the lesson, either. I still listen to commentators and look at polls and hope to get the answer. (Sorry, Ian.) But part of me gets that the outcome is unknown. Part of me loves that it's unknown. Part of me revels in the intellectual thrill of trying to finally beat the fact of uncertainty into my brain.
After all, we have to console ourselves as best we can.
¹ In 2020 the results of the Tuesday election were announced the following Saturday. For many reasons we might expect it to be much quicker this time, but, well, take a look at the main article.