Using Omicron to preach Bayes
For New Year’s Eve, I traveled with some friends to the beach. On the 30th, I came down with a sore throat and a slight fever. I ended up spending midnight in bed on the 31st with a 39°C fever.
I took a covid-19 test and it came back positive, to no one's surprise. What was weird was what happened when my friends started taking their tests. Three of them tested positive, but two didn’t. They all had mild symptoms very similar to mine, so could it be that those two had escaped?
Although we all knew this kind of stuff happened with covid, we were skeptical. Omicron is highly contagious, we had all shared the same house, and the friends who tested negative had taken the less reliable antigen tests.
So could we just ignore the tests, go with our guts, and assume they had covid? It sounded reasonable, but at the same time, hadn’t we spent the last two years telling people to trust science, not their intuitions? How could we justify the feeling that the overwhelming probability of them having covid outweighed a negative test?
Enter Thomas Bayes
Luckily for us, an 18th-century English clergyman had already solved that one. What Thomas Bayes proposed is that our credence in something (that is, how much we should believe it) should depend not only on the evidence for it (a covid test, for example) but also on how much we already believed it to be true before seeing the evidence.
Bayes even came up with a mathematical formula to quantify this. I won’t go into the math, but the general takeaway is this: the more improbable something is, the stronger the evidence needs to be for you to believe in it. This makes sense, right? If a friend says they are late because there was traffic, you will weigh how much traffic you’d expect that day and things like that, but usually you can take their word for it. If they say, however, that they were late because there was an ostrich in the middle of the street, then you won’t believe them unless they have pictures or something of the sort. The crazier the claim, the stronger the evidence you should require. Or, as the scientist Carl Sagan put it, “extraordinary claims require extraordinary evidence.”
Back to medicine
Let’s look at a famous example in medicine to see how this works. Imagine there is a rare disease that affects about 1 in 100,000 people. There’s a test for the disease that is 99.9% reliable: whether you have the disease or not, the test will be wrong only 0.1% of the time. (For rigorous readers: I’m assuming both the sensitivity and the specificity are 99.9%.) You take the test and it comes back positive. What are the chances you actually have the disease?
If you plug these numbers into Bayes’ formula, you’ll find it’s less than 1%. If you’ve been paying attention, you shouldn’t be too surprised: the disease is so rare that even the tiny 0.1% error rate means more people land in the false-positive zone than actually have the disease. In fact, 100 times more people.
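For the curious, the calculation is short enough to sketch in a few lines of Python (the function name is mine; the numbers are the ones from the example above):

```python
def posterior_given_positive(prior, sensitivity, specificity):
    """P(disease | positive test), via Bayes' theorem."""
    true_positive = sensitivity * prior               # sick and flagged by the test
    false_positive = (1 - specificity) * (1 - prior)  # healthy but flagged anyway
    return true_positive / (true_positive + false_positive)

p = posterior_given_positive(prior=1 / 100_000, sensitivity=0.999, specificity=0.999)
print(f"{p:.2%}")  # prints "0.99%" -- less than 1%, despite the positive test
```

The denominator is what does the work: with 100,000 people, roughly 100 healthy ones get a false positive for every single person who truly has the disease.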
This is the exact opposite of my friends’ story. In their case, the prior probability that they had covid was so high that we should believe they had it despite the negative tests. What a test can do is update our beliefs: if they take another one and it comes back negative again, then we have more reason to believe they didn’t catch covid after all.
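The same kind of update works for my friends’ case, just in reverse. The numbers here are my own assumptions, purely to illustrate: say we were 95% sure they had covid before testing, and say the antigen test catches roughly 70% of true infections while correctly clearing about 99% of uninfected people. A sketch:

```python
def posterior_given_negative(prior, sensitivity, specificity):
    """P(covid | negative test): a negative result can be a false negative."""
    false_negative = (1 - sensitivity) * prior  # infected, but the test missed it
    true_negative = specificity * (1 - prior)   # not infected, correctly cleared
    return false_negative / (false_negative + true_negative)

prior = 0.95  # assumed: shared house, same symptoms, highly contagious variant
after_one = posterior_given_negative(prior, sensitivity=0.70, specificity=0.99)
after_two = posterior_given_negative(after_one, sensitivity=0.70, specificity=0.99)
print(f"{after_one:.0%}, {after_two:.0%}")  # prints "85%, 64%"
```

Under these assumed numbers, one negative antigen test barely dents the belief, and each additional negative chips away at it a bit more: evidence updates the prior rather than overturning it.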
Who cares?
You may be thinking, “Ok, funny man with your tricky math riddles, but why do I care? My doctor probably knows this… right?” Well, not really. Multiple studies have shown that doctors fail at simple Bayesian questions, which means they may recommend invasive treatments for diseases you don’t even have, based on misread diagnostics. That alone should be reason enough to pay attention.
Failing to think in Bayesian terms is also why you think you have cancer whenever you Google a symptom. Yes, you moron, it’s true that most esophageal cancer patients have trouble swallowing, but esophageal cancer is also super rare while trouble swallowing is quite common.
Bayesian thinking is also a good way of becoming more skeptical. What is more probable: that a race of lizard people is running the US government, or that someone just made that shit up in their basement? Although most readers will find this example obvious (I hope), it’s a good reminder that when the evidence is much more likely to occur (or to be fabricated, in this case) than the event itself, the evidence probably shouldn’t serve as proof of the event.
Finally, that person you’re interested in is not seeing your Instagram stories, so you think they’re not interested back. But while it’s true that people don’t usually see stories of people they are not interested in, it’s also true that not seeing stories is quite common, so why don’t you ask them out so you can have real, Bayesian-verified evidence that you’re a loser?
Back to science (quickly)
Ok, maybe doctors don’t know about this Bayes guy, but the scientists who for a great part define what we believe in do… right? Right???
The sad answer is… kinda. The truth is that, in practice, it is pretty hard to apply Bayes’ theorem to scientific inquiry, so scientists tend to cast it aside and pretend it doesn’t exist. The result is that all sorts of claims that should be rejected outright, like homeopathy or telepathy, pop up in the news once in a while. Some scientists have been pushing a Bayesian agenda in statistics, trying to estimate prior probabilities for all sorts of phenomena, but it’s still far from mainstream.
I’m not sure what to do about this. If you are a scientist, maybe try to be more mindful of statistics and the common pitfalls of doing science. If you are not, maybe try to be more skeptical of absurd scientific results or something? Then again, I’m not sure that’s the right takeaway either, because many true, well-established facts of modern science do seem absurd to us. So I think I can settle for this: be more skeptical of scientific results that claim to have overthrown everything we (scientists included) have known so far.
The definitive takeaway
If you take one thing away from this article, I hope it is this: evidence is not something that proves or disproves a claim, but rather something that updates our belief in it. And the crazier the claim, the crazier the evidence for it should be.
Thanks to Felipe Germanos, Fernão Ferreira-Reimão, François Boris, Miguel Sallum, Pedro Mello, and Sambhav Bhandari for reading drafts of this.