Bayes' theorem
Bayes' theorem (alternatively Bayes' law or Bayes' rule, after Thomas Bayes) gives a mathematical rule for inverting conditional probabilities, allowing the probability of a cause to be found given its effect. For example, with Bayes' theorem, the probability that a patient has a disease given that they tested positive for that disease can be found using the probability that the test yields a positive result when the disease is present. The theorem was developed in the 18th century by Bayes and independently by Pierre-Simon Laplace. One of Bayes' theorem's many applications is Bayesian inference, an approach to statistical inference, where it is used to invert the probability of observations given a model configuration (i.e., the likelihood function) to obtain the probability of the model configuration given the observations (i.e., the posterior probability).
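In symbols, for events A and B with P(B) ≠ 0, the theorem reads:

```latex
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
```

Here P(A) is the prior probability of A, P(B | A) is the likelihood of observing B when A holds, and P(A | B) is the posterior probability of A after observing B.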
Tables
| Actual \ Test | Positive | Negative | Total |
|---|---|---|---|
| User | 45 | 5 | 50 |
| Non-user | 190 | 760 | 950 |
| Total | 235 | 765 | 1000 |

90% sensitive, 80% specific, PPV = 45/235 ≈ 19%

| Actual \ Test | Positive | Negative | Total |
|---|---|---|---|
| User | 50 | 0 | 50 |
| Non-user | 190 | 760 | 950 |
| Total | 240 | 760 | 1000 |

100% sensitive, 80% specific, PPV = 50/240 ≈ 21%

| Actual \ Test | Positive | Negative | Total |
|---|---|---|---|
| User | 45 | 5 | 50 |
| Non-user | 47 | 903 | 950 |
| Total | 92 | 908 | 1000 |

90% sensitive, 95% specific, PPV = 45/92 ≈ 49%
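The positive predictive values in the three tables follow directly from Bayes' theorem: PPV depends on prevalence as well as on sensitivity and specificity. A minimal sketch (the function name `ppv` is illustrative; the 5% prevalence is the 50 users per 1000 from the tables):

```python
def ppv(sensitivity, specificity, prevalence):
    """P(user | positive test) by Bayes' theorem."""
    true_pos = sensitivity * prevalence            # P(positive and user)
    false_pos = (1 - specificity) * (1 - prevalence)  # P(positive and non-user)
    return true_pos / (true_pos + false_pos)

# The three scenarios from the tables above, at 5% prevalence:
print(f"{ppv(0.90, 0.80, 0.05):.1%}")  # 19.1%
print(f"{ppv(1.00, 0.80, 0.05):.1%}")  # 20.8%
print(f"{ppv(0.90, 0.95, 0.05):.1%}")  # 48.6% (the table rounds 47.5
                                       # false positives down to 47, giving 49%)
```

Note that even the perfectly sensitive test barely improves the PPV: with a rare condition, the false positives from the large non-user group dominate.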
| Cancer \ Symptom | Yes | No | Total |
|---|---|---|---|
| Yes | 1 | 0 | 1 |
| No | 10 | 99989 | 99999 |
| Total | 11 | 99989 | 100000 |
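For this table the posterior can be read off directly (1 of the 11 symptomatic people has cancer) or computed through Bayes' rule; both routes agree, as a short check (all numbers taken from the table, variable names illustrative):

```python
# Direct count: 1 of the 11 people with the symptom has cancer.
direct = 1 / 11

# Bayes' rule with the same numbers.
p_cancer = 1 / 100_000
p_symptom_given_cancer = 1.0               # the single cancer case has the symptom
p_symptom_given_no_cancer = 10 / 99_999
p_symptom = (p_symptom_given_cancer * p_cancer
             + p_symptom_given_no_cancer * (1 - p_cancer))
via_bayes = p_symptom_given_cancer * p_cancer / p_symptom

print(round(direct, 4), round(via_bayes, 4))  # both ≈ 0.0909, i.e. about 9%
```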
| Machine \ Condition | Defective | Flawless | Total |
|---|---|---|---|
| A | 10 | 190 | 200 |
| B | 9 | 291 | 300 |
| C | 5 | 495 | 500 |
| Total | 24 | 976 | 1000 |
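The machine table is a law-of-total-probability calculation: the chance a defective item came from machine A is its joint cell over the defective column total. A sketch using the table's rates (dictionary layout is illustrative):

```python
# Output shares and per-machine defect rates, read from the table
# (200/300/500 items of 1000; 10/200, 9/300, 5/500 defective).
output_share = {"A": 0.2, "B": 0.3, "C": 0.5}
defect_rate = {"A": 0.05, "B": 0.03, "C": 0.01}

# Law of total probability: P(defective) = sum over machines.
p_defective = sum(output_share[m] * defect_rate[m] for m in output_share)

# Bayes' theorem: P(A | defective).
posterior_a = output_share["A"] * defect_rate["A"] / p_defective
print(round(posterior_a, 3))  # 10/24 ≈ 0.417
```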
| Proposition \ Background | B | ¬B | Total |
|---|---|---|---|
| A | P(B\|A) · P(A) = P(A\|B) · P(B) | P(¬B\|A) · P(A) = P(A\|¬B) · P(¬B) | P(A) |
| ¬A | P(B\|¬A) · P(¬A) = P(¬A\|B) · P(B) | P(¬B\|¬A) · P(¬A) = P(¬A\|¬B) · P(¬B) | P(¬A) = 1 − P(A) |
| Total | P(B) | P(¬B) = 1 − P(B) | 1 |
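Each cell identity in this table is the two factorisations of one joint probability, and dividing through by P(B) gives the theorem itself:

```latex
P(A \cap B) = P(B \mid A)\,P(A) = P(A \mid B)\,P(B)
\quad\Longrightarrow\quad
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}, \qquad P(B) > 0
```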
| Hypothesis | Hypothesis 1: Patient is a carrier | Hypothesis 2: Patient is not a carrier |
|---|---|---|
| Prior Probability | 1/2 | 1/2 |
| Conditional Probability that all four offspring will be unaffected | (1/2) · (1/2) · (1/2) · (1/2) = 1/16 | About 1 |
| Joint Probability | (1/2) · (1/16) = 1/32 | (1/2) · 1 = 1/2 |
| Posterior Probability | (1/32) / (1/32 + 1/2) = 1/17 | (1/2) / (1/32 + 1/2) = 16/17 |
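The rows of the table above correspond to the steps of a Bayesian update, which can be traced numerically (variable names are illustrative):

```python
# Priors: carrier and non-carrier equally likely before observing offspring.
prior_carrier, prior_noncarrier = 1/2, 1/2

# Likelihoods of four unaffected offspring under each hypothesis.
lik_carrier = (1/2) ** 4   # 1/16
lik_noncarrier = 1.0       # about 1

# Joint probabilities, then normalise to get the posterior.
joint_carrier = prior_carrier * lik_carrier          # 1/32
joint_noncarrier = prior_noncarrier * lik_noncarrier # 1/2
posterior_carrier = joint_carrier / (joint_carrier + joint_noncarrier)

print(round(posterior_carrier, 4))  # 1/17 ≈ 0.0588
```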
| Mother \ Father | W: Homozygous for the wild-type allele (a non-carrier) | M: Heterozygous (a CF carrier) |
|---|---|---|
| W: Homozygous for the wild-type allele (a non-carrier) | WW | MW |
| M: Heterozygous (a CF carrier) | MW | MM (affected by cystic fibrosis) |
| Hypothesis | Hypothesis 1: Patient is a carrier | Hypothesis 2: Patient is not a carrier |
|---|---|---|
| Prior Probability | 2/3 | 1/3 |
| Conditional Probability of a negative test | 1/10 | 1 |
| Joint Probability | (2/3) · (1/10) = 1/15 | (1/3) · 1 = 1/3 |
| Posterior Probability | (1/15) / (1/15 + 1/3) = 1/6 | (1/3) / (1/15 + 1/3) = 5/6 |
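This table applies the same update recipe to a new prior (2/3 carrier) and a negative test that misses 10% of carriers. The recipe generalises to any finite set of hypotheses; a small sketch (the helper `bayes_update` and its dictionary layout are illustrative, not from the article):

```python
def bayes_update(prior, likelihood):
    """Normalise prior x likelihood over a finite set of hypotheses."""
    joint = {h: prior[h] * likelihood[h] for h in prior}
    total = sum(joint.values())
    return {h: joint[h] / total for h in joint}

posterior = bayes_update(
    prior={"carrier": 2/3, "non-carrier": 1/3},
    likelihood={"carrier": 1/10, "non-carrier": 1.0},  # P(negative test | hypothesis)
)
print(posterior)  # carrier 1/6 ≈ 0.167, non-carrier 5/6 ≈ 0.833
```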