Cognitive Science 42 (5):1410-1456 (2018)

Abstract
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people’s goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism.
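The Sharma-Mittal framework named in the abstract can be illustrated numerically. The sketch below assumes the standard two-parameter form from the entropy literature (order r, degree t), not code from the paper itself; it checks that Shannon and Rényi entropies emerge as limiting cases.

```python
import math

def sharma_mittal(p, r, t):
    """Sharma-Mittal entropy of order r and degree t (r, t > 0, r != 1, t != 1).
    Shannon, Renyi, and Tsallis entropies arise as limiting cases."""
    s = sum(pi ** r for pi in p if pi > 0)  # sum of p_i^r over the distribution
    return (s ** ((1.0 - t) / (1.0 - r)) - 1.0) / (1.0 - t)

p = [0.5, 0.25, 0.25]

# As r, t -> 1, Sharma-Mittal approaches Shannon entropy (in nats).
shannon = -sum(pi * math.log(pi) for pi in p)
near_shannon = sharma_mittal(p, r=1.0001, t=1.0001)

# As t -> 1 with fixed r, it approaches the Renyi entropy of order r.
renyi2 = math.log(sum(pi ** 2 for pi in p)) / (1.0 - 2.0)
near_renyi2 = sharma_mittal(p, r=2.0, t=1.0001)
```

Taking r = t instead recovers the Tsallis family, which is one way the framework places the measures used in different disciplines on a common two-dimensional map.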
Keywords: Entropy · Information search · Probabilistic models · Uncertainty · Value of information
DOI 10.1111/cogs.12613


