
Limited role of entropy in information economics


Abstract

‘Information transmitted’ is defined as the amount by which added evidence (or ‘message received’) diminishes ‘uncertainty’. The latter is characterized by some properties intuitively suggested by this word and possessed by conditional entropy, a parameter of the posterior probability distribution. However, conditional entropy shares these properties with some other concave symmetric functions on the probability space.
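To make this definition concrete, here is a minimal numerical sketch (the two-state prior and the channel matrix are invented for illustration; they are not from the paper). It computes 'information transmitted' as prior uncertainty minus expected posterior uncertainty, once with Shannon entropy and once with another concave symmetric function, the quadratic measure 1 - Σ pᵢ², showing that both satisfy the reduction property the abstract describes.

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i log2 p_i, with the convention 0 log 0 = 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def quadratic_uncertainty(p):
    """Another concave symmetric 'uncertainty' measure: 1 - sum_i p_i^2."""
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

# Hypothetical two-state example: uniform prior; row s of `channel`
# gives P(message | state s).
prior = np.array([0.5, 0.5])
channel = np.array([[0.9, 0.1],
                    [0.2, 0.8]])

p_msg = prior @ channel                          # P(message)
posteriors = (prior[:, None] * channel) / p_msg  # column m = P(state | message m)

for U in (shannon_entropy, quadratic_uncertainty):
    # 'Information transmitted': prior uncertainty minus expected posterior
    # uncertainty. Nonnegative for any concave U, by Jensen's inequality.
    transmitted = U(prior) - sum(p_msg[m] * U(posteriors[:, m])
                                 for m in range(len(p_msg)))
    print(f"{U.__name__}: information transmitted = {transmitted:.4f}")
```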

Moreover, a given transmission channel (or, in the context of statistical inference, a given experiment) yields a higher maximum expected benefit than another to any user if and only if every concave function of the posterior probability vector has a lower expected value under the former channel (or experiment), i.e., if and only if the former diminishes every such 'uncertainty' measure at least as much. Hence one information system (channel, experiment) may be preferable to another for a given user although its transmission rate, in entropy terms, is lower.
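The sketch below constructs one such reversal numerically (the channels, prior, and payoffs are hypothetical, chosen only to exhibit the effect; none of them appear in the paper). Channel A transmits more bits than channel B, yet a user who profits only from near-certainty about state 2 strictly prefers B.

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

def bayes(prior, channel):
    """Message probabilities and the posterior each message induces."""
    p_msg = prior @ channel
    posteriors = (prior[:, None] * channel) / p_msg
    return p_msg, posteriors

def transmission_rate(prior, channel):
    """Rate in entropy terms: H(prior) - E[H(posterior)]."""
    p_msg, post = bayes(prior, channel)
    return entropy(prior) - sum(p_msg[m] * entropy(post[:, m])
                                for m in range(len(p_msg)))

def decision_value(prior, channel, payoff):
    """Maximum expected benefit: pick the best action after each message."""
    p_msg, post = bayes(prior, channel)
    return sum(p_msg[m] * np.max(payoff @ post[:, m])
               for m in range(len(p_msg)))

prior = np.array([0.5, 0.5])
# Channel A: symmetric, moderately noisy in both states.
A = np.array([[0.9, 0.1],
              [0.1, 0.9]])
# Channel B: message 2 proves state 2 with certainty, but arrives rarely.
B = np.array([[1.0, 0.0],
              [0.4, 0.6]])

# A user who gains 1 by 'investing' in state 2 but loses 100 in state 1;
# 'passing' pays 0. Rows = actions, columns = states.
payoff = np.array([[-100.0, 1.0],   # invest
                   [0.0, 0.0]])     # pass

for name, ch in (("A", A), ("B", B)):
    print(name,
          "rate:", round(transmission_rate(prior, ch), 3),
          "value:", round(decision_value(prior, ch, payoff), 3))
# A transmits more (about 0.531 vs 0.396 bits) yet is worthless to this
# user (value 0.0 vs 0.3): only B's certainty about state 2 ever makes
# investing worthwhile, so the lower-entropy system is the preferable one.
```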

But only entropy has the economically relevant property of measuring, in the limit, the expected length of efficiently coded messages sent in long sequences. Thus, while irrelevant to the value (maximum expected benefit) of an information system and to the costs of observing, estimating, and deciding, entropy formulas are indeed relevant to the cost of communicating, i.e., of storing, coding and transmitting messages.
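This is the noiseless source-coding property. A short sketch (assuming a hypothetical memoryless binary source; the probabilities are invented for illustration) shows it by Huffman-coding blocks of n source symbols: the expected code length per symbol falls toward the source entropy as n grows.

```python
import heapq, itertools, math

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary prefix (Huffman) code."""
    counter = itertools.count(len(probs))       # unique tiebreaker for the heap
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, m1 = heapq.heappop(heap)
        p2, _, m2 = heapq.heappop(heap)
        for i in m1 + m2:
            lengths[i] += 1                     # each merge prepends one bit
        heapq.heappush(heap, (p1 + p2, next(counter), m1 + m2))
    return lengths

source = [0.7, 0.3]                             # hypothetical memoryless source
H = -sum(p * math.log2(p) for p in source)      # about 0.8813 bits/symbol

for n in (1, 2, 4, 8):
    # Probability of each block of n source symbols, coded as one unit.
    blocks = [math.prod(c) for c in itertools.product(source, repeat=n)]
    L = sum(p * l for p, l in zip(blocks, huffman_lengths(blocks)))
    print(f"n={n}: {L / n:.4f} bits/symbol  (entropy = {H:.4f})")
```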

Marschak, J. Limited role of entropy in information economics. Theor Decis 5, 1–7 (1974). https://doi.org/10.1007/BF00140297
