
Bayes' theorem

Updated: 12/11/2025, 9:09:00 AM Wikipedia source

Bayes' theorem (alternatively Bayes' law or Bayes' rule, after Thomas Bayes) gives a mathematical rule for inverting conditional probabilities, allowing the probability of a cause to be found given its effect. For example, with Bayes' theorem, the probability that a patient has a disease given that they tested positive for that disease can be found using the probability that the test yields a positive result when the disease is present. The theorem was developed in the 18th century by Bayes and independently by Pierre-Simon Laplace. One of Bayes' theorem's many applications is Bayesian inference, an approach to statistical inference, where it is used to invert the probability of observations given a model configuration (i.e., the likelihood function) to obtain the probability of the model configuration given the observations (i.e., the posterior probability).
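The disease-testing example can be worked through numerically. The sketch below is illustrative: the 5% prevalence, 90% sensitivity, and 80% specificity match the numbers used in the drug-testing tables later in this article, and the variable names are ours.

```python
# Bayes' theorem: invert P(positive | disease) into P(disease | positive).
# Illustrative numbers (assumed): 5% prevalence, 90% sensitivity, 80% specificity.
p_disease = 0.05
p_pos_given_disease = 0.90        # sensitivity
p_pos_given_no_disease = 0.20     # 1 - specificity (false-positive rate)

# Law of total probability: overall chance of a positive test
p_pos = (p_pos_given_disease * p_disease
         + p_pos_given_no_disease * (1 - p_disease))

# Bayes' theorem
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
print(round(p_disease_given_pos, 4))  # 0.1915
```

Despite the 90% sensitivity, fewer than one in five positives actually has the disease, because true positives are swamped by false positives from the much larger disease-free group.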

Tables

· Examples › Drug testing › Sensitivity or specificity

Actual \ Test   Positive   Negative   Total
User                  45          5      50
Non-user             190        760     950
Total                235        765    1000

90% sensitive, 80% specific, PPV = 45/235 ≈ 19%
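The PPV quoted under the table follows directly from the "Positive" column of counts; a minimal check in Python:

```python
# PPV directly from the table's "Positive" column: 45 true positives
# out of 235 total positives.
true_pos, false_pos = 45, 190
ppv = true_pos / (true_pos + false_pos)
print(f"PPV = {ppv:.1%}")  # PPV = 19.1%
```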
· Examples › Drug testing › Sensitivity or specificity

Actual \ Test   Positive   Negative   Total
User                  50          0      50
Non-user             190        760     950
Total                240        760    1000

100% sensitive, 80% specific, PPV = 50/240 ≈ 21%
· Examples › Drug testing › Sensitivity or specificity

Actual \ Test   Positive   Negative   Total
User                  45          5      50
Non-user              47        903     950
Total                 92        908    1000

90% sensitive, 95% specific, PPV = 45/92 ≈ 49%
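All three tables share one calculation: Bayes' theorem applied to a 5% prevalence (50 users per 1000) with varying sensitivity and specificity. A small Python sketch reproduces the three PPVs (the third table rounds its 47.5 expected false positives down to 47, so its quoted 49% is slightly above the exact 48.6%):

```python
def ppv(sensitivity, specificity, prevalence):
    """Positive predictive value P(user | positive) via Bayes' theorem."""
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

for sens, spec in [(0.90, 0.80), (1.00, 0.80), (0.90, 0.95)]:
    print(f"sens={sens:.0%} spec={spec:.0%} -> PPV={ppv(sens, spec, 0.05):.1%}")
# PPVs: 19.1%, 20.8%, 48.6%
```

Note that raising specificity from 80% to 95% more than doubles the PPV, while raising sensitivity from 90% to 100% barely moves it: with a rare condition, the false-positive rate dominates.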
· Examples › Cancer rate

Cancer \ Symptom    Yes       No    Total
Yes                   1        0        1
No                   10    99989    99999
Total                11    99989   100000
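From the table, P(cancer | symptom) is just the ratio of the first row to the "Yes" column total: of the 11 people with the symptom, 1 actually has cancer.

```python
# P(cancer | symptom) from the table's counts.
with_symptom_and_cancer = 1
with_symptom_total = 11
p_cancer_given_symptom = with_symptom_and_cancer / with_symptom_total
print(round(p_cancer_given_symptom, 3))  # 0.091
```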
· Examples › Defective item rate

Machine \ Condition   Defective   Flawless   Total
A                            10        190     200
B                             9        291     300
C                             5        495     500
Total                        24        976    1000
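The probability that a defective item came from a given machine follows from the defective counts alone; with the counts in hand, Bayes' theorem reduces to each machine's share of the defective total.

```python
# P(machine | defective) as each machine's share of defective items.
defective = {"A": 10, "B": 9, "C": 5}
total_defective = sum(defective.values())  # 24
posterior = {m: d / total_defective for m, d in defective.items()}
print(posterior)  # A: 10/24 ≈ 0.417, B: 9/24 = 0.375, C: 5/24 ≈ 0.208
```

Machine A produces only 20% of the output but accounts for over 40% of the defectives, so a defective item most likely came from A.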
Contingency table · Forms › Events › Alternative form

Background \ Proposition   B                               ¬B (not B)                        Total
A                          P(B|A)·P(A) = P(A|B)·P(B)       P(¬B|A)·P(A) = P(A|¬B)·P(¬B)      P(A)
¬A (not A)                 P(B|¬A)·P(¬A) = P(¬A|B)·P(B)    P(¬B|¬A)·P(¬A) = P(¬A|¬B)·P(¬B)   P(¬A) = 1 − P(A)
Total                      P(B)                            P(¬B) = 1 − P(B)                  1
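The alternative form expands P(B) in the denominator via the law of total probability, exactly as in the "Total" row of the table. A numeric check, reusing the drug-test numbers from the examples above with A = "user" and B = "tests positive":

```python
# Alternative form of Bayes' theorem: expand P(B) via total probability,
# as in the "Total" row of the contingency table.
p_a = 0.05
p_b_given_a = 0.90       # sensitivity
p_b_given_not_a = 0.20   # 1 - specificity

p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)
p_a_given_b = p_b_given_a * p_a / p_b

# The table's cell identity P(B|A)·P(A) = P(A|B)·P(B) holds by construction:
assert abs(p_b_given_a * p_a - p_a_given_b * p_b) < 1e-12
print(round(p_a_given_b, 4))  # 0.1915
```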
· Applications › Genetics

                                               Hypothesis 1:                        Hypothesis 2:
                                               Patient is a carrier                 Patient is not a carrier
Prior probability                              1/2                                  1/2
Conditional probability that all four
offspring will be unaffected                   (1/2)·(1/2)·(1/2)·(1/2) = 1/16       ≈ 1
Joint probability                              (1/2)·(1/16) = 1/32                  (1/2)·1 = 1/2
Posterior probability                          (1/32)/(1/32 + 1/2) = 1/17           (1/2)/(1/32 + 1/2) = 16/17
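The four rows of the table are the standard prior → likelihood → joint → posterior pipeline, which exact rational arithmetic reproduces:

```python
from fractions import Fraction

# Priors and likelihoods from the table (exact rational arithmetic).
prior = {"carrier": Fraction(1, 2), "non-carrier": Fraction(1, 2)}
# P(all four offspring unaffected | hypothesis); the table treats the
# non-carrier case as probability ~1, taken here as exactly 1.
likelihood = {"carrier": Fraction(1, 2) ** 4, "non-carrier": Fraction(1)}

joint = {h: prior[h] * likelihood[h] for h in prior}   # 1/32 and 1/2
total = sum(joint.values())                            # 17/32
posterior = {h: joint[h] / total for h in joint}       # 1/17 and 16/17
print(posterior)
```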
· Applications › Genetics

Mother \ Father                            W: homozygous wild-type (a non-carrier)   M: heterozygous (a CF carrier)
W: homozygous wild-type (a non-carrier)    WW                                        MW
M: heterozygous (a CF carrier)             MW                                        MM (affected by cystic fibrosis)
· Applications › Genetics

                                             Hypothesis 1:            Hypothesis 2:
                                             Patient is a carrier     Patient is not a carrier
Prior probability                            2/3                      1/3
Conditional probability of a negative test   1/10                     1
Joint probability                            1/15                     1/3
Posterior probability                        1/6                      5/6
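The same pipeline applied with the 2/3 prior from the table, together with a 1/10 chance that a carrier nevertheless tests negative, gives the 1/6 posterior:

```python
from fractions import Fraction

prior_carrier = Fraction(2, 3)        # prior from the table
p_neg_if_carrier = Fraction(1, 10)    # test misses 1 in 10 carriers
p_neg_if_non_carrier = Fraction(1)    # non-carriers always test negative

joint_carrier = prior_carrier * p_neg_if_carrier                 # 1/15
joint_non_carrier = (1 - prior_carrier) * p_neg_if_non_carrier   # 1/3
posterior_carrier = joint_carrier / (joint_carrier + joint_non_carrier)
print(posterior_carrier)  # 1/6
```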
