Minds and Machines 9 (4):565-593 (1999)
|Abstract||Intractability and optimality are two sides of one coin: Optimal models are often intractable, that is, they tend to be excessively complex, or NP-hard. We explain the meaning of NP-hardness in detail and discuss how modern computer science circumvents intractability by introducing heuristics and shortcuts to optimality, often replacing optimality with sufficient sub-optimality. Since the principles of decision theory dictate balancing the cost of computation against the gain in accuracy, statistical inference is currently being reshaped by a vigorous new trend: the science of simplicity. Simple models, as we show for specific cases, are not just tractable; they also tend to be robust. Robustness is the ability of a model to extract relevant information from data while disregarding noise. Recently, Gigerenzer, Todd and the ABC Research Group (1999) have put forward a collection of fast and frugal heuristics as simple, boundedly rational inference strategies used by the unaided mind in real-world inference problems. This collection of heuristics has suggestively been called the adaptive toolbox. In this paper we focus on a comparison task in order to illustrate the simplicity and robustness of some of the heuristics in the adaptive toolbox, in contrast to the intractability and fragility of optimal solutions. We concentrate on three important classes of models for comparison-based inference and, in each of these classes, search for models to be used as benchmarks for evaluating the performance of fast and frugal heuristics: lexicographic trees, linear models and Bayesian networks. Lexicographic trees are interesting because they are particularly simple models that humans have used throughout the centuries. Linear models have traditionally been used by cognitive psychologists as models of human inference, while Bayesian networks have only recently been introduced in statistics and computer science. Yet it is the Bayesian networks that are the best possible benchmarks for evaluating the fast and frugal heuristics, as we will show in this paper.|
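The lexicographic comparison strategy the abstract alludes to can be sketched in a few lines. This is a minimal illustration in the spirit of the Take The Best heuristic from the adaptive toolbox, not the authors' exact formulation; the cue names, cue order, and city profiles below are hypothetical.

```python
def take_the_best(obj_a, obj_b, cues):
    """Lexicographic paired comparison on binary cues.

    cues: cue names ordered by (assumed) validity, highest first.
    obj_a, obj_b: dicts mapping cue name -> 0/1.
    Returns 'A', 'B', or 'guess' if no cue discriminates.
    """
    for cue in cues:                # search cues in order of validity
        a, b = obj_a[cue], obj_b[cue]
        if a != b:                  # stopping rule: first discriminating
            return 'A' if a > b else 'B'   # cue alone decides
    return 'guess'                  # no cue discriminates

# Hypothetical task: which of two cities is larger?
cues = ['capital', 'has_university', 'on_major_river']
city_a = {'capital': 1, 'has_university': 1, 'on_major_river': 0}
city_b = {'capital': 0, 'has_university': 1, 'on_major_river': 1}
print(take_the_best(city_a, city_b, cues))  # -> 'A' (decided by 'capital')
```

The "frugality" the abstract emphasizes is visible here: the search stops at the first discriminating cue, so later cues are never looked up, in contrast to a linear model that would weight and sum all cues.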
|Keywords||algorithms, Bayesian networks, generalization, heuristics, linear models, NP-hardness|
Similar books and articles
X. T. Wang (2000). From Simon's Scissors for Rationality to ABC's Adaptive Toolbox. Behavioral and Brain Sciences 23 (5):765-766.
Laura Martignon & Ulrich Hoffrage (2002). Fast, Frugal, and Fit: Simple Heuristics for Paired Comparison. Theory and Decision 52 (1):29-71.
Clare Harries & Mandeep K. Dhami (2000). On the Descriptive Validity and Prescriptive Utility of Fast and Frugal Models. Behavioral and Brain Sciences 23 (5):753-754.
Peter M. Todd & Gerd Gigerenzer (2000). Précis of Simple Heuristics That Make Us Smart. Behavioral and Brain Sciences 23 (5):727-741.
José Luis Bermúdez (2000). Rationality, Logic, and Fast and Frugal Heuristics. Behavioral and Brain Sciences 23 (5):744-745.
Richard Cooper (2000). Simple Heuristics Could Make Us Smart; but Which Heuristics Do We Apply When? Behavioral and Brain Sciences 23 (5):746-746.
James Shanteau & Rickey P. Thomas (2000). Fast and Frugal Heuristics: What About Unfriendly Environments? Behavioral and Brain Sciences 23 (5):762-763.
Gerd Gigerenzer (1999). Simple Heuristics That Make Us Smart. Oxford University Press.
R. Duncan Luce (2000). Fast, Frugal, and Surprisingly Accurate Heuristics. Behavioral and Brain Sciences 23 (5):757-758.
Added to index: 2009-01-28