
Do yourself a favor and learn some Bayes!

  • Writer: Giovanni Cortes A
  • Mar 16
  • 5 min read

I think some of my friends might have a stroke when they see me writing a blog about Bayes!


For context, I used to hate statistics and probability back in university. But after some reflection, I realized why: statistics and probability can feel incredibly counter-intuitive (or at least, that’s what I used to think). Imagine you want to know whether you have a deadly disease. You take a test, and the instructions say: “The test is 99% sensitive and 99% specific.”


In plain English, this means:


  • If you do have the disease, the test will correctly tell you so 99% of the time.


  • If you don’t have the disease, the test will correctly tell you that 99% of the time.


Sounds pretty accurate, right? But here’s where things get interesting. You take the test, and (boom!) it comes back positive. Panic sets in; according to the test’s specification, it looks like you’re 99% likely to have the disease! You call your mom to say your goodbyes. You start making arrangements (okay, maybe negotiating your debts is the last thing you’d do). You even consider throwing a farewell party!


But then, a mathematically enlightened mind steps in and tells you, “Actually, the probability that you have the disease is only about 23%.” Wait, what?! Relief washes over you, but also confusion: how the hell is that possible?! Let me tell you, my friend, the answer lies in Bayes’ Theorem. So grab a coffee, sit tight, and let’s unpack this mystery!



Bayes' Theorem: The “New Religion” That Shook the 18th Century


Picture this: It’s the 18th century, and the world is full of big ideas. People are debating, discovering, and challenging everything from science to philosophy. But one idea stands out—a seemingly simple concept that stirs up a whole lot of controversy. And the guy behind it? A humble priest named Thomas Bayes.


The funny thing is, Bayes never even got to see his own big idea take off. His friend, Richard Price, found his unfinished manuscript, cleaned it up, and published it in 1763. That’s when everything changed. What was so special about it? Well, let’s just say it broke all the old-school rules about certainty and knowledge.


It all comes down to one small but mighty formula:


P(A | B) = P(B | A) × P(A) / P(B)


Let’s break it down:


  • P(A): Your prior belief—what you think is true before any new evidence shows up.

  • P(B | A): The chance of seeing the new evidence if your belief is correct.

  • P(B): The chance of seeing the new evidence.

  • P(A | B): Your updated belief after you see the new evidence.


Hold up! Before you panic, just know this: It’s not as scary as it looks.
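In fact, the whole formula fits in a few lines of Python. This is a minimal sketch; the function name and the use of the law of total probability to expand P(B) are my choices, not anything official:

```python
def bayes_posterior(prior, p_b_given_a, p_b_given_not_a):
    """P(A | B) = P(B | A) * P(A) / P(B), where P(B) is expanded
    via the law of total probability over A and not-A."""
    p_b = p_b_given_a * prior + p_b_given_not_a * (1 - prior)
    return p_b_given_a * prior / p_b

# Sanity check: if the evidence is equally likely either way,
# it tells you nothing, so your belief should not move.
print(bayes_posterior(0.3, 0.5, 0.5))  # 0.3
```

Notice how the evidence only shifts your belief when it is more likely under one hypothesis than the other.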


Let’s Make It Fun: The Flaky Friend


Alright, time to make this real. Imagine you have that one friend—the flaky one who always promises to show up, but half the time bails. You’ve invited them for coffee 10 times, and they only show up 5 times. So, your prior belief that they’ll show up today is 50%. You’ve learned the hard way not to get your hopes up too much.


Now, here’s the twist: You invite them for coffee again, and this time, you get a text that says, “I’m on my way!” Hold up. In the past, whenever they sent that message, they showed up 80% of the time.


Bayes’ magic? You update your belief. Now, instead of just thinking there’s a 50% chance they’ll show, you bump it up. The text changes everything. You're thinking like a Bayesian now! Or, maybe you don’t even have to try—your brain is naturally doing this already! Does this mean our brains are Bayesian by nature?
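The flaky-friend update can be sketched in Python. One caveat: the 80% in the story is stated as a track record, so the likelihoods below are my assumed numbers (they text "I'm on my way" before 80% of the visits they actually make, and before only 20% of their bails), not figures from the story:

```python
# Prior: they showed up for 5 out of 10 invitations.
prior_show = 0.5

# Assumed likelihoods (illustrative numbers, not from the story):
p_text_given_show = 0.8   # they text "on my way" when they do show
p_text_given_bail = 0.2   # they sometimes text it and still bail

# Bayes: P(show | text)
evidence = p_text_given_show * prior_show + p_text_given_bail * (1 - prior_show)
posterior_show = p_text_given_show * prior_show / evidence

print(f"P(show | 'I'm on my way') = {posterior_show:.2f}")  # 0.80
```

With these assumed numbers, the text bumps your belief from 50% up to 80%, which is exactly the kind of jump the story describes.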


The Controversy Around Bayes


Let’s explain it using Laplace’s famous example: How do you know the sun will rise tomorrow? Easy: you think it is almost certain because you have seen it happen every day of your life. However, we can never be 100% sure.


Now, imagine we move to another planet. On your first day, you are unsure if the sun will rise. You see the sky starting to brighten—what is the probability that the sun will rise given this evidence? Based on prior experience (your knowledge of the universe), it's likely. As the sky continues to brighten, your belief that the sun will rise increases. This updated belief is your posterior probability—your new probability after incorporating evidence.


But wait, what is your prior probability on this new planet? You could use your knowledge about the universe, or your experience from Earth, or just assume 50% since it’s a new place. It’s all subjective! And this is the controversial part about Bayes—your prior can be different from mine and therefore produce a different posterior probability!
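We can watch this play out in code. Below, two observers land on the new planet with different subjective priors and repeatedly update on the same evidence (the sky brightening). The likelihoods are made-up numbers for illustration:

```python
# Assumed likelihoods (illustrative): a brightening sky is much more
# likely if the sun is about to rise than if it is not.
P_BRIGHT_GIVEN_SUNRISE = 0.9
P_BRIGHT_GIVEN_NO_SUNRISE = 0.2

def update(prior):
    """One Bayesian update after seeing the sky brighten a bit more."""
    evidence = (P_BRIGHT_GIVEN_SUNRISE * prior
                + P_BRIGHT_GIVEN_NO_SUNRISE * (1 - prior))
    return P_BRIGHT_GIVEN_SUNRISE * prior / evidence

optimist, skeptic = 0.9, 0.5   # different subjective priors
for _ in range(3):             # the sky keeps brightening
    optimist, skeptic = update(optimist), update(skeptic)

print(f"optimist: {optimist:.4f}, skeptic: {skeptic:.4f}")
```

Different priors give different posteriors at every step, but as the evidence piles up, both observers converge toward near-certainty that the sun will rise.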


Back in the 18th century, things weren’t all roses. Racism and the idea of a “superior race” were starting to take root, and eugenics—basically the belief that we could “improve” humanity by selectively breeding people—was actually being taught in universities. Yeah, it was as bad as it sounds. These ideas were used to justify the oppression of other races and cultures. Many well-known mathematicians of the time actually supported these views, and accepting Bayes' ideas would have meant accepting some level of subjectivity. They weren’t ready to do that—especially when it came to such a controversial topic like eugenics.


When I called Bayesianism a “new religion”, I wasn’t being totally serious, but kind of. For Bayesians, it’s more than just a math tool—it’s a whole way of thinking. And honestly, when you start looking at how our brains process information, it’s easy to see why they think it makes sense. It’s almost like our brains are running Bayesian calculations all the time, even if we’re doing it unconsciously. Maybe we’re like little Bayesian machines, and we don’t even know it!


Back to Our Deadly Disease Problem


Let’s finally get back to our original problem (I almost forgot about it!).


We need to calculate:


P(Disease | Positive) = P(Positive | Disease) × P(Disease) / P(Positive)


The key idea here is you cannot just use the accuracy of the test as your only source of information! You also need the prior probability—the base rate of the disease in the population.


Let’s say:


  • 0.3% of the population has the disease (P(Disease) = 0.003).

  • The test is 99% sensitive (true positive rate = P(Positive | Disease) = 0.99).

  • The test is 99% specific, so the false positive rate is P(Positive | No Disease) = 1 − 0.99 = 0.01.


Applying Bayes' Theorem, the updated probability of actually having the disease given that you tested positive is only about 22.95% (I’ll leave the math to you so you can have some fun!).
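If you’d rather let the computer have the fun, here is the whole calculation in a few lines of Python, with the numbers taken straight from the bullets above:

```python
prior = 0.003            # P(Disease): 0.3% base rate in the population
sensitivity = 0.99       # P(Positive | Disease)
false_positive = 0.01    # P(Positive | No Disease) = 1 - specificity

# P(Positive) via the law of total probability
p_positive = sensitivity * prior + false_positive * (1 - prior)

# Bayes' Theorem: P(Disease | Positive)
p_disease_given_positive = sensitivity * prior / p_positive

print(f"P(Disease | Positive) = {p_disease_given_positive:.2%}")  # 22.95%
```

The culprit is the tiny base rate: among 100,000 people, roughly 297 true positives are swamped by about 997 false positives, so most positive results come from healthy people.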


So instead of freaking out with a near-certain death sentence, Bayes tells you: “Hey, relax. It’s still concerning, but way less certain than you thought.”


All this talk about prior probability really sounds like what we humans call "common sense." You see the world through the lens of your own experiences, so maybe "common sense" isn’t all that common after all. After all, everyone has their own unique set of prior experiences that shape how they view things. And that's when empathy becomes so important—recognizing that we can be wrong and that others may see the world differently.






 
 
 
