Why Bayesians Needn’t Be Afraid of Observing Many Non-black Non-ravens

An Erratum to this article was published on 01 December 2012

Abstract

According to Hempel’s raven paradox, the observation of one non-black non-raven confirms the hypothesis that all ravens are black. Bayesians such as Howson and Urbach (Scientific reasoning: the Bayesian approach, 2nd edn. Open Court, Chicago, 1993) claim that the raven paradox can be solved by spelling out the concept of confirmation in the sense of the relevance criterion. Siebel (J Gen Philos Sci 35:313–329, 2004) disputes the adequacy of this Bayesian solution. He claims that spelling out the concept of confirmation in the relevance sense lets the raven paradox reappear as soon as numerous non-black non-ravens are observed. It is shown in this paper that Siebel’s objection to the Bayesian solution is flawed. Nevertheless, the objection made by Siebel may give us an idea of how Bayesians can successfully handle situations in which we observe more than one non-black non-raven.

Notes

  1. Cf. Fitelson and Hawthorne (2010) for an overview. For a bibliography of mostly Bayesian approaches cf. Vranas (2004).

  2. This translates to: Our background knowledge comprises the information that there are far more non-ravens than ravens among the things that are non-black. The relevant probabilities P(¬Ri|¬Si) [that is P(¬Rai|¬Bai) in our terms] are thus close to 1.

  3. Jardine (1965) considers cases in which n > 1 objects are sampled at random with replacement.

  4. Given that we assume that the number of black non-ravens remains the same, of course.

  5. All values in Table 1 were obtained using the open-source computing environment R (R Project for Statistical Computing), version 2.13.1.

  6. Note that the degree of confirmation that the evidence confers on the hypothesis H would have been even smaller if a stronger touchstone for the plausibility of (S1) had been chosen.

  7. Let us note that, in comparison to the overall number of German citizens, only very few people may have seen the commercial in question and that only very few people may have considered buying the product that was advertised. If this is the case, then the situation described here would be isomorphic to a situation in which there are far more non-black non-ravens than ravens or black things.

References

  • Fitelson, B., & Hawthorne, J. (2010). How Bayesian confirmation theory handles the paradox of the ravens. In E. Ells & J. H. Fetzer (Eds.), The place of probability in science (pp. 247–275). Dordrecht: Springer.

  • Goodman, N. (1983). Fact, fiction, and forecast (4th ed.). Cambridge: Harvard University Press.

  • Grinstead, C. M., & Snell, J. L. (1998). Introduction to probability (2nd ed.). Providence: American Mathematical Society.

  • Hempel, C. G. (1945). Studies in the logic of confirmation. Mind, 54, 1–26, 97–121.

  • Howson, C., & Urbach, P. (1993). Scientific reasoning: The Bayesian approach (2nd ed.). Chicago: Open Court.

  • Jardine, R. (1965). The resolution of the confirmation paradox. Australasian Journal of Philosophy, 43, 359–368.

  • Sainsbury, R. M. (1995). Paradoxes (2nd ed.). Cambridge: Cambridge University Press.

  • Siebel, M. (2004). Der Rabe und der Bayesianist. Journal for General Philosophy of Science, 35, 313–329.

  • Vranas, P. B. M. (2004). Hempel’s raven paradox: A lacuna in the standard Bayesian solution. British Journal for the Philosophy of Science, 55, 545–560.

Acknowledgments

I would like to thank two anonymous reviewers for their helpful comments. Furthermore I am grateful to Mark Siebel for encouraging me to submit this paper. Last but not least I would like to thank Nils Springhorn, Marvin Schiller, Owino Eloka, Hannes Bajohr and Michael Schippers for interesting discussions about the raven paradox and the problems related to it.

Author information

Corresponding author

Correspondence to Florian F. Schiller.

Appendices

Appendix 1

The probability P(¬Rai & ¬Bai) can be determined in the following way: There are x non-black non-ravens and δ objects that are not non-black non-ravens, so that we have all in all x + δ objects to sample from. Imagine that you draw i objects at random without replacement from these x + δ objects. The probability of drawing k non-black non-ravens in the first i − 1 draws while the ith object is a non-black non-raven as well (so that all in all k + 1 non-black non-ravens are drawn) can then be determined by dividing the number of favourable events by the number of possible events. For the first i − 1 draws there are $\binom{x}{k}$ possibilities to choose the k non-black non-ravens, while there are $\binom{\delta}{i-1-k}$ possibilities to choose the remaining i − 1 − k objects that are not non-black non-ravens. Thus $\binom{x}{k}$ and $\binom{\delta}{i-1-k}$ have to be multiplied to get the overall number of possibilities of obtaining k non-black non-ravens in the first i − 1 draws. In the ith draw there still remain x − k possibilities to draw a non-black non-raven. Hence, $\binom{x}{k}\binom{\delta}{i-1-k}$ needs to be multiplied by x − k to obtain the number of favourable events. The number of possible events is determined in a similar manner. There are $\binom{x+\delta}{i-1}$ possibilities to sample i − 1 objects, while there remain x + δ − (i − 1) possibilities to draw the ith object. Thus we have $\binom{x+\delta}{i-1}\,(x+\delta-(i-1))$ for the number of possible events. Dividing favourable events by possible events yields

$$ \frac{\binom{x}{k}\binom{\delta}{i-1-k}\,(x-k)}{\binom{x+\delta}{i-1}\,\bigl(x+\delta-(i-1)\bigr)}. $$

The number k of non-black non-ravens drawn in the first i − 1 draws may vary between 0 and i − 1. The event of drawing m non-black non-ravens (0 ≤ m ≤ i − 1) in the first i − 1 draws as well as a non-black non-raven in the ith draw excludes the event of drawing n non-black non-ravens (0 ≤ n ≤ i − 1) in the first i − 1 draws as well as a non-black non-raven in the ith draw if m ≠ n. In order to obtain the probability of sampling a non-black non-raven on the ith draw we can thus simply sum up the probabilities for every k (0 ≤ k ≤ i − 1):

$$ P(\neg Ra_i \,\&\, \neg Ba_i) = \sum_{k=0}^{i-1} \frac{\binom{x}{k}\binom{\delta}{i-1-k}\,(x-k)}{\binom{x+\delta}{i-1}\,\bigl(x+\delta-(i-1)\bigr)}. $$
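
As a sanity check on this summation, the following R sketch evaluates it for illustrative values of x, δ and i (assumed values, not the parameters behind Table 1) and compares the result with a Monte Carlo simulation of sampling without replacement; the agreement also anticipates the result of "Appendix 2", namely that the sum collapses to x/(x + δ).

```r
# Sketch: evaluate the Appendix 1 summation for P(notR_i & notB_i) and
# compare it with a simulation of drawing without replacement.
# x, delta and i are illustrative values, not the paper's Table 1 setup.
x     <- 20    # non-black non-ravens
delta <- 5     # objects that are not non-black non-ravens
i     <- 4     # index of the draw considered

k <- 0:(i - 1)
p_formula <- sum(choose(x, k) * choose(delta, i - 1 - k) * (x - k)) /
  (choose(x + delta, i - 1) * (x + delta - (i - 1)))

# Simulation: pool of x "successes" and delta "failures", drawn without
# replacement; record whether the i-th draw is a non-black non-raven.
set.seed(1)
pool  <- c(rep(1, x), rep(0, delta))
p_sim <- mean(replicate(1e5, sample(pool, i)[i] == 1))

p_formula   # 0.8, i.e. x / (x + delta)
p_sim       # approximately 0.8
```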

Let us now determine the probability P(¬Rai & ¬Bai | ¬Ra1 & ¬Ba1 & … & ¬Rai−1 & ¬Bai−1). There are (x − 0)·(x − 1)·…·(x − (i − 1)) possibilities to draw only non-black non-ravens in i draws, while there are (x + δ − 0)·(x + δ − 1)·…·(x + δ − (i − 1)) possibilities to draw any i objects from the overall x + δ objects. Thus we have

$$ P(\neg Ra_i \,\&\, \neg Ba_i \mid \neg Ra_1 \,\&\, \neg Ba_1 \,\&\, \ldots \,\&\, \neg Ra_{i-1} \,\&\, \neg Ba_{i-1}) = \prod_{k=0}^{i-1} \frac{x-k}{x+\delta-k}. $$

Appendix 2

$$ \begin{aligned}
\sum_{k=0}^{i-1} \frac{\binom{x}{k}\binom{\delta}{i-1-k}\,(x-k)}{\binom{x+\delta}{i-1}\,\bigl(x+\delta-(i-1)\bigr)}
&= \frac{1}{\binom{x+\delta}{i-1}\,\bigl(x+\delta-(i-1)\bigr)} \sum_{k=0}^{i-1} \binom{x}{k}\binom{\delta}{i-1-k}\,(x-k) \\
&= \frac{1}{\binom{x+\delta}{i-1}\,\bigl(x+\delta-(i-1)\bigr)} \left( x \sum_{k=0}^{i-1} \binom{x}{k}\binom{\delta}{i-1-k} - \sum_{k=0}^{i-1} k\binom{x}{k}\binom{\delta}{i-1-k} \right) \\
&= \frac{1}{\binom{x+\delta}{i-1}\,\bigl(x+\delta-(i-1)\bigr)} \left( x \overbrace{\binom{x+\delta}{i-1}}^{\text{by Vandermonde's identity}} - x \sum_{k=0}^{i-1} \binom{x-1}{k-1}\binom{\delta}{i-1-k} \right) \\
&= \frac{x}{\binom{x+\delta}{i-1}\,\bigl(x+\delta-(i-1)\bigr)} \left( \binom{x+\delta}{i-1} - \sum_{k=0}^{i-1} \binom{x-1}{k-1}\binom{(x+\delta-1)-(x-1)}{(i-2)-(k-1)} \right) \\
&= \frac{x}{\binom{x+\delta}{i-1}\,\bigl(x+\delta-(i-1)\bigr)} \left( \binom{x+\delta}{i-1} - \overbrace{\frac{i-1}{x+\delta}\binom{x+\delta}{i-1}}^{\text{by Vandermonde's identity}} \right) \\
&= \frac{x}{x+\delta-(i-1)} \left( 1 - \frac{i-1}{x+\delta} \right) = \frac{x\,\bigl(x+\delta-(i-1)\bigr)}{(x+\delta)\,\bigl(x+\delta-(i-1)\bigr)} = \frac{x}{x+\delta}
\end{aligned} $$
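
The simplification can also be checked numerically. The following R sketch (with an illustrative parameter grid assumed here, not taken from the paper) confirms that the Appendix 1 summation equals x/(x + δ) for every admissible i.

```r
# Sketch: verify numerically that the Appendix 1 sum equals x / (x + delta).
# The parameter grid is illustrative.
lhs <- function(x, delta, i) {
  k <- 0:(i - 1)
  sum(choose(x, k) * choose(delta, i - 1 - k) * (x - k)) /
    (choose(x + delta, i - 1) * (x + delta - (i - 1)))
}

for (x in c(5, 20, 100)) {
  for (delta in c(2, 10)) {
    for (i in 1:(x + delta)) {
      stopifnot(isTRUE(all.equal(lhs(x, delta, i), x / (x + delta))))
    }
  }
}
```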

Appendix 3

With “Appendix 1” in the background it is easy to see that

$$ P(Ra_i \,\&\, \neg Ba_i) = \sum_{k=0}^{i-1} \frac{\binom{\beta}{k}\binom{x+\delta-\beta}{i-1-k}\,(\beta-k)}{\binom{x+\delta}{i-1}\,\bigl(x+\delta-(i-1)\bigr)}. $$

By systematically replacing the relevant occurrences of “x” with “β” in “Appendix 2” we obtain a proof that P(Rai & ¬Bai) = β/(x + δ) for all i. Thus we have:

$$ P(\neg Ra_i \mid \neg Ba_i) = \frac{P(\neg Ra_i \,\&\, \neg Ba_i)}{P(Ra_i \,\&\, \neg Ba_i) + P(\neg Ra_i \,\&\, \neg Ba_i)} = \frac{\frac{x}{x+\delta}}{\frac{\beta}{x+\delta} + \frac{x}{x+\delta}} = \frac{x}{x+\beta}. $$
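
Again, a numerical check may be helpful. The R sketch below, taking β to be the number of non-black ravens as in the formula above and using assumed illustrative values for x, β, δ and i, estimates P(¬Rai|¬Bai) by simulation and compares it with x/(x + β).

```r
# Sketch: check that P(notR_i | notB_i) = x / (x + beta) by simulation.
# Population: x non-black non-ravens, beta non-black ravens and
# (delta - beta) remaining objects (all of which are black);
# all values are illustrative.
x <- 30; beta <- 3; delta <- 10; i <- 5

set.seed(1)
pool  <- c(rep("nbnr", x), rep("nbr", beta), rep("other", delta - beta))
draws <- replicate(2e5, sample(pool, i)[i])

non_black <- draws %in% c("nbnr", "nbr")   # i-th object is non-black
mean(draws[non_black] == "nbnr")           # approximately x / (x + beta)
x / (x + beta)                             # 30 / 33
```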

Cite this article

Schiller, F.F. Why Bayesians Needn’t Be Afraid of Observing Many Non-black Non-ravens. J Gen Philos Sci 43, 77–88 (2012). https://doi.org/10.1007/s10838-012-9179-z
