Cromwell's rule

Subjective probability is a measure of the expectation that an event will occur, or that a statement is true. Probabilities are given a value between 0 (the event will definitely not occur) and 1 (the event is absolutely certain to occur). The nearer the probability of an event tends towards 1, the more certain it is that the event will occur. The nearer the probability tends towards 0, the more certain it is that the event will not occur.

Cromwell's rule, named by statistician Dennis Lindley,[1] states that the use of prior probabilities of 0 or 1 should be avoided, except when applied to statements that are logically true or false. For instance, Lindley would allow one to say that Pr(2+2 = 4) = 1, where Pr represents the probability. In other words, arithmetically, the number 2 added to the number 2 will certainly equal 4.

The reference is to Oliver Cromwell. Cromwell wrote to the General Assembly of the Church of Scotland on 3 August 1650, including a phrase that has become well known and frequently quoted:

I beseech you, in the bowels of Christ, think it possible that you may be mistaken.[2]

As Lindley puts it, assigning a probability should "leave a little probability for the moon being made of green cheese; it can be as small as 1 in a million, but have it there since otherwise an army of astronauts returning with samples of the said cheese will leave you unmoved."[3] Similarly, in assessing the likelihood that tossing a coin will result in either a head or a tail facing upwards, there is a possibility, albeit remote, that the coin will land on its edge and remain in that position.

If the prior probability assigned to a hypothesis is 0 or 1, then, by Bayes' theorem, the posterior probability (probability of the hypothesis, given the evidence) is forced to be 0 or 1 as well; no evidence, no matter how strong, could have any influence.
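This consequence of Bayes' theorem can be checked directly. The sketch below (an illustrative helper, not drawn from Lindley) applies Bayes' theorem to a binary hypothesis and shows that a small but nonzero prior responds to strong evidence, while a prior of exactly 0 or 1 is frozen in place:

```python
from fractions import Fraction

def posterior(prior, lik_if_true, lik_if_false):
    # Bayes' theorem for a binary hypothesis H and evidence E:
    # P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|~H)(1 - P(H))]
    num = lik_if_true * prior
    return num / (num + lik_if_false * (1 - prior))

strong_evidence = (Fraction(99, 100), Fraction(1, 100))  # P(E|H), P(E|~H)

# A tiny but nonzero prior is moved substantially by strong evidence:
print(posterior(Fraction(1, 1000), *strong_evidence))   # 11/122, about 9%

# A prior of exactly 0 (or 1) stays there no matter what is observed:
print(posterior(Fraction(0), *strong_evidence))         # 0
print(posterior(Fraction(1), *strong_evidence))         # 1
```

Using exact fractions makes the point unambiguous: the degenerate posteriors are exactly 0 and 1, not merely close to them.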

A strengthened version of Cromwell's rule, applying also to statements of arithmetic and logic, alters the first rule of probability, or the convexity rule, 0 ≤ P(A) ≤ 1, to 0 < P(A) < 1.

Cromwell's rule: Bayesian divergence (pessimistic)

An example of Bayesian divergence of opinion appears in Appendix A of Sharon Bertsch McGrayne's 2011 book The Theory That Would Not Die: How Bayes' Rule Cracked The Enigma Code, Hunted Down Russian Submarines, & Emerged Triumphant from Two Centuries of Controversy.[4] In McGrayne's example (suggested by Albert Madansky), Tim and Susan disagree about whether a stranger, who has two fair coins and one unfair coin (with heads on both sides), has tossed one of the fair coins or the unfair one; the stranger has tossed one of his coins three times, and it has come up heads each time. Tim judges that the stranger picked the coin at random, i.e., assumes a prior probability distribution in which each coin had a 1/3 chance of being the one picked. Applying Bayesian inference, Tim then calculates an 80% probability that the three consecutive heads were produced by the unfair coin. Susan, by contrast, assumes the stranger either certainly chose the unfair coin (prior probability 1) or certainly chose a fair one (prior probability 0). Consequently, Susan's calculated probability that the heads were tossed with the unfair coin must be 1 or 0; however many further heads are thrown, she gains no more certainty than she had after the first head, and Tim's and Susan's probabilities do not converge.[5]
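Tim's 80% figure, and the rigidity of Susan's degenerate priors, can be reproduced with a few lines of arithmetic. This is a sketch of the calculation using exact fractions; the variable names are ours, not McGrayne's:

```python
from fractions import Fraction

# Tim's prior: each of the three coins is equally likely to have been picked.
prior_unfair = Fraction(1, 3)        # the two-headed coin
prior_fair = Fraction(2, 3)          # either of the two fair coins
lik_unfair = Fraction(1)             # P(3 heads | two-headed coin)
lik_fair = Fraction(1, 2) ** 3       # P(3 heads | fair coin)

tim = (lik_unfair * prior_unfair) / (
    lik_unfair * prior_unfair + lik_fair * prior_fair
)
print(tim)  # 4/5, i.e. Tim's 80%

# Susan's priors are exactly 0 or 1, so no run of heads can move her posterior:
def susan(prior, n_heads):
    num = 1 * prior                  # the unfair coin always shows heads
    return num / (num + Fraction(1, 2) ** n_heads * (1 - prior))

print(susan(Fraction(0), 100), susan(Fraction(1), 100))  # 0 1
```

Even after 100 consecutive heads, Susan's posterior is exactly whichever of 0 or 1 she started with, while Tim's posterior approaches 1.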

Cromwell's rule: Bayesian convergence (optimistic)

An example of Bayesian convergence of opinion is in Nate Silver's 2012 book The Signal and the Noise: Why so many predictions fail — but some don't.[6] After stating, "Absolutely nothing useful is realized when one person who holds that there is a 0 (zero) percent probability of something argues against another person who holds that the probability is 100 percent", Silver describes a simulation where three investors start out with initial guesses of 10%, 50% and 90% that the stock market is in a bull market; by the end of the simulation (shown in a graph), "all of the investors conclude they are in a bull market with almost (although not exactly of course) 100 percent certainty."

References

  1. Jackman, Simon (2009). Bayesian Analysis for the Social Sciences. Wiley. ISBN 978-0-470-01154-6 (ebook ISBN 978-0-470-68663-8).
  2. Carlyle, Thomas, ed. (1855). Oliver Cromwell's Letters and Speeches, vol. 1. New York: Harper. p. 448.
  3. Lindley, Dennis (1991). Making Decisions (2nd ed.). Wiley. p. 104. ISBN 0-471-90808-8.
  4. McGrayne, Sharon Bertsch (2011). The Theory That Would Not Die: How Bayes' Rule Cracked The Enigma Code, Hunted Down Russian Submarines, & Emerged Triumphant from Two Centuries of Controversy. New Haven: Yale University Press. pp. 263–265. ISBN 978-0-300-16969-0. OCLC 670481486.
  5. McGrayne, Sharon Bertsch. "Bayes Examples (2nd example)". Same as Appendix A. Retrieved 4/10/2013.
  6. Silver, Nate (2012). The Signal and the Noise: Why So Many Predictions Fail – but Some Don't. New York: Penguin. pp. 258–261. ISBN 978-1-59420-411-1.
This article is issued from Wikipedia (version of April 18, 2016). The text is available under the Creative Commons Attribution/ShareAlike license; additional terms may apply for media files.