
Towards an Informational Pragmatic Realism

Ariel Caticha
Minds and Machines 24, 37–70 (2014)

Abstract

I discuss the design of the method of entropic inference as a general framework for reasoning under conditions of uncertainty. The main contribution of this discussion is to emphasize the pragmatic elements in the derivation. More specifically: (1) Probability theory is designed as the uniquely natural tool for representing states of incomplete information. (2) An epistemic notion of information is defined in terms of its relation to the Bayesian beliefs of ideally rational agents. (3) The method of updating from a prior to a posterior probability distribution is designed through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting framework includes as special cases both MaxEnt and Bayes’ rule. It therefore unifies entropic and Bayesian methods into a single general inference scheme. I find that similar pragmatic elements are an integral part of Putnam’s internal realism, of Floridi’s informational structural realism, and also of van Fraassen’s empiricist structuralism. I conclude with the conjecture that their valuable insights can be incorporated into a single coherent doctrine—an informational pragmatic realism.
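
As a schematic summary of the scheme just described (in the notation of notes 3 and 11 below, with prior q, candidate posteriors p, and selected posterior P): the preferred posterior is the candidate that maximizes the logarithmic relative entropy

\[ S[p,q] = -\int dx\, p(x)\,\log\frac{p(x)}{q(x)} \]

subject to the constraints that codify the new information. If the prior is uniform and the constraints fix expected values, the maximization reproduces Jaynes’ MaxEnt; if the information is observed data \(x'\), imposed on the joint space of data \(x\) and parameters \(\theta\) as the constraint \(p(x) = \delta(x - x')\), the selected marginal is \(P(\theta) = q(\theta|x') \propto q(\theta)\,q(x'|\theta)\), which is Bayes’ rule (Caticha and Giffin 2006).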

Notes

  1. The Peircean slant of Floridi’s pragmatism is perhaps more explicit in (Floridi 1994, 2010).

  2. For our purposes we do not need to be particularly precise about the meaning of the term ‘knowledge’. Note however that under a pragmatic conception of truth there is no real difference between a ‘justified belief’ and the more explicit but redundant ‘justified true belief’.

  3. Strictly speaking, the tool for updating is the relative entropy. However, as we shall see later, all entropies are relative to some prior, and therefore the qualifier ‘relative’ is redundant and can be dropped. This is somewhat analogous to the situation with energy: it is implicitly understood that all energies are relative to some reference frame, but there is no need to constantly speak of a ‘relative energy’.

  4. I make no attempt to provide a review of the literature on entropic inference. The following incomplete list reflects only some contributions that are directly related to the particular approach described in this paper: (Caticha 2012; Jaynes 2003; Shore and Johnson 1980; 1981; Williams 1980; Skilling 1988; Rodríguez 1991; Caticha and Giffin 2006).

  5. Ellis (1985) defends a similar pragmatic position, one that emphasizes explanatory power.

  6. The argument below follows (Caticha 2009). It is an elaboration of the pioneering work of Cox (1946) (see also Jaynes 2003).

  7. In contrast, Cox (1946) sought a representation of AND, [ab|c] = f([a|c], [b|ac]), and of negation, [\(\tilde{a}\)|c] = g([a|c]).
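     For orientation, the structure of the argument is roughly as follows (Cox 1946; Jaynes 2003): the associativity of AND forces the representation f to satisfy the functional equation \(f(f(x,y),z) = f(x,f(y,z))\), whose solutions can always be regraduated so that \(f(x,y) = xy\), i.e., the familiar product rule \(p(ab|c) = p(a|c)\,p(b|ac)\); a similar consistency argument for negation leads to the sum rule \(p(a|c) + p(\tilde{a}|c) = 1\).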

  8. We refer to ideally rational agents who have fully processed all information acquired in the past. Humans do not normally behave this way; they often change their minds by processes that are not fully conscious.

  9. The independence requirement is rather subtle and one must be careful about its precise implementation. The robustness of the design is shown by exhibiting an alternative version that takes the form of a consistency constraint: whenever systems are known to be independent it should not matter whether the analysis treats them jointly or separately (Caticha 2012; Caticha and Giffin 2006).
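     Schematically: if two systems with priors \(q_1(x_1)\) and \(q_2(x_2)\) are known to be independent, and the new information constrains each system separately, then the joint analysis, which starts from the prior \(q_1(x_1)\,q_2(x_2)\), must select a posterior that factorizes, \(P(x_1,x_2) = P_1(x_1)\,P_2(x_2)\), where \(P_1\) and \(P_2\) are the posteriors that the two separate analyses would have selected on their own.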

  10. (Skilling 1988) deals with the more general problem of ranking positive additive distributions which also include, e.g., intensity distributions.

  11. We denote priors by q, candidate posteriors by lower case p, and the selected posterior by upper case P.

  12. The density \(\exp S(\theta)\) is a scalar function; it is the probability per unit invariant volume \(dV = g^{1/2}(\theta)\,d^n\theta\).
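     If, as in the information geometry of Amari (1985), \(g_{ij}(\theta)\) is the Fisher–Rao metric, \(g_{ij}(\theta) = \int dx\, p(x|\theta)\,\partial_i \log p(x|\theta)\,\partial_j \log p(x|\theta)\) with \(\partial_i = \partial/\partial\theta^i\), then \(g(\theta)\) is its determinant and \(\exp S(\theta)\,dV\) is invariant under changes of the coordinates \(\theta\).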

References

  • Adriaans, P.W., & van Benthem, J.F.A.K. (Eds.) (2008). Handbook of philosophy of information. Amsterdam: Elsevier.

  • Amari, S. (1985). Differential-geometrical methods in statistics. Berlin: Springer

  • Bohr, N. (1937). Atomic Theory and the Description of Nature. Cambridge: Cambridge University Press. (Reprinted by Ox Bow Press, 1987).

  • Bohr, N. (1958). Essays 1932-1957 on Atomic Physics and Human Knowledge. (Wiley, 1958; reprinted by Ox Bow Press, 1987).

  • Caticha, A. (1998). Consistency and linearity in quantum theory. Physics Letters, A244, 13.

  • Caticha, A. (1998). Consistency, amplitudes and probabilities in quantum theory. Physical Review, A57, 1572.

  • Caticha, A. (2000). Insufficient reason and entropy in quantum theory. Foundations of Physics 30, 227.

  • Caticha, A. (2011). Entropic dynamics, time, and quantum theory. Journal of Physics A: Mathematical and Theoretical, 44, 225303.

  • Caticha, A. (2001). Maximum entropy, fluctuations and priors. In A. Mohammad-Djafari (Ed.). Bayesian methods and maximum entropy in science and engineering, AIP Conf. Proc. 568, 94 (2001) (arXiv.org/abs/math-ph/0008017).

  • Caticha, A. (2009). Quantifying Rational Belief. In P. Goggans, et al. (Ed.) Bayesian Inference and maximum entropy methods in science and engineering, AIP Conf. Proc. 1193, 60 (arXiv.org/abs/0908.3212).

  • Caticha, A. (2012). Entropic inference and the foundations of physics (monograph commissioned by the 11th Brazilian Meeting on Bayesian Statistics, EBEB-2012). São Paulo: USP Press. Online at http://www.albany.edu/physics/ACaticha-EIFP-book.pdf.

  • Caticha, A., & Giffin, A. (2006). Updating probabilities. In A. Mohammad-Djafari (Ed.), Bayesian inference and maximum entropy methods in science and engineering, AIP Conf. Proc. 872, 31 (arxiv.org/abs/physics/0608185).

  • Cover, T., & Thomas, J. (1991). Elements of information theory. New York: Wiley.

  • Cox, R.T. (1946). Probability, frequency and reasonable expectation. American Journal of Physics 14, 1.

  • Ellis, B. (1985). What science aims to do. In P. Churchland & C. Hooker (Eds.), Images of science. Chicago: University of Chicago Press. (Reprinted in Papineau 1996.)

  • Fine, A. (1996). The Shaky game—Einstein realism and the quantum theory. Chicago: University of Chicago Press.

  • Floridi, L. (1994). Scepticism and the Search for Knowledge: a Peirceish Answer to a Kantian doubt. Transactions of the Charles S. Peirce Society, 30, 543.

  • Floridi, L. (2008). A defence of informational structural realism. Synthese 161, 219.

  • Floridi, L. (2010). Information, possible worlds, and the cooptation of scepticism. Synthese 175.1, 63.

  • Floridi, L. (2011). The philosophy of information. Oxford: Oxford University Press.

  • Godfrey-Smith, P. (2003). Theory and reality. Chicago: University of Chicago Press.

  • Golan, A. (2008). Information and entropy in econometrics—A review and synthesis. Foundations and Trends in Econometrics 2, 1–145.

  • James, W. (1956). The will to believe. New York: Dover.

  • Jaynes, E.T. (1957). Information Theory and Statistical Mechanics. Physical Review, 106, 620.

  • Jaynes, E.T. (1957). Information Theory and Statistical Mechanics II. Physical Review, 108, 171.

  • Jaynes, E.T. (1983). Papers on probability, statistics and statistical physics (R.D. Rosenkrantz, Ed.). Dordrecht: Reidel.

  • Jaynes, E.T. (2003). Probability theory: The logic of science. Cambridge: Cambridge University Press.

  • Papineau, D. (Ed.) (1996). The philosophy of science. Oxford: Oxford University Press.

  • Putnam, H. (1975). Mathematics, matter, and method, Vol. 1. Cambridge: Cambridge University Press.

  • Putnam, H. (1981). Reason, truth and history. Cambridge: Cambridge University Press.

  • Putnam, H. (1979). How to be an internal realist and a transcendental idealist (at the same time). In: Language logic and philosophy, Proceedings of the 4th international Wittgenstein symposium (Kirchberg/Wechsel, Austria 1979).

  • Rodríguez, C.C. (1991). Entropic priors. In W.T. Grandy Jr.,& L.H. Schick (Eds.), Maximum Entropy and Bayesian Methods. Dordrecht: Kluwer.

  • Shore, J.E., & Johnson, R.W. (1980). Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Transactions on Information Theory, IT-26, 26.

  • Shore, J.E., Johnson, R.W. (1981). IEEE Transactions on Information Theory IT-27, 26.

  • Skilling, J. (1988). The axioms of maximum entropy. In G.J. Erickson, & C.R. Smith (Eds.), Maximum-entropy and Bayesian methods in science and engineering. Dordrecht: Kluwer.

  • Stapp, H.P. (1972). The Copenhagen interpretation. American Journal of Physics, 40, 1098.

  • van Fraassen, B.C. (1980). The Scientific Image. Oxford: Clarendon.

  • van Fraassen, B.C. (1997). Structure and perspective: philosophical perplexity and paradox. In M. L. Dalla Chiara, et al. (Eds.), Logic and scientific methods (p. 511). Dordrecht: Kluwer.

  • van Fraassen, B.C. (2006). Structure: Its shadow and substance. The British Journal for the Philosophy of Science, 57, 275.

  • van Fraassen, B.C. (2006). Representation: The problem for structuralism. Philosophy of Science, 73, 536.

  • Williams, P.M. (1980). Bayesian Conditionalization and the Principle of Minimum Relative Information. British Journal for the Philosophy of Science, 31, 131.

Acknowledgments

I am grateful to C. Cafaro, N. Caticha, A. Giffin, A. Golan, P. Goyal, K. Knuth, C. Rodríguez, M. Reginatto, and J. Skilling for many useful discussions on entropic inference; and also to A. Beavers and L. Floridi for the invitation to participate in this symposium.

Author information

Corresponding author

Correspondence to Ariel Caticha.

About this article

Cite this article

Caticha, A. Towards an Informational Pragmatic Realism. Minds & Machines 24, 37–70 (2014). https://doi.org/10.1007/s11023-013-9322-6
