45 found
Disambiguations: Noah D. Goodman [44], Noah Goodman [2]
  1. A counterfactual simulation model of causal judgments for physical events. Tobias Gerstenberg, Noah D. Goodman, David A. Lagnado & Joshua B. Tenenbaum - 2021 - Psychological Review 128 (5):936-975.
  2. Rational Use of Cognitive Resources: Levels of Analysis Between the Computational and the Algorithmic. Thomas L. Griffiths, Falk Lieder & Noah D. Goodman - 2015 - Topics in Cognitive Science 7 (2):217-229.
    Marr's levels of analysis—computational, algorithmic, and implementation—have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the (...)
  3. Knowledge and Implicature: Modeling Language Understanding as Social Cognition. Noah D. Goodman & Andreas Stuhlmüller - 2013 - Topics in Cognitive Science 5 (1):173-184.
    Is language understanding a special case of social cognition? To help evaluate this view, we can formalize it as the rational speech-act theory: Listeners assume that speakers choose their utterances approximately optimally, and listeners interpret an utterance by using Bayesian inference to “invert” this model of the speaker. We apply this framework to model scalar implicature (“some” implies “not all,” and “N” implies “not more than N”). This model predicts an interaction between the speaker's knowledge state and the listener's interpretation. (...)
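    The rational speech-act recursion described in entry 3 is compact enough to sketch directly. Below is a minimal illustrative Python version, not the authors' own implementation; the 0-3 state space, uniform priors, and the rationality parameter ALPHA are assumptions made purely for exposition.

```python
# A minimal rational speech-act (RSA) sketch of scalar implicature: a pragmatic
# listener who hears "some" infers "probably not all". Illustrative toy only;
# the state space, uniform priors, and ALPHA are assumed, not the paper's.

import math

STATES = [0, 1, 2, 3]                  # how many of 3 objects have the property
UTTERANCES = ["none", "some", "all"]
ALPHA = 4.0                            # assumed speaker rationality

def meaning(utterance, state):
    """Literal truth conditions."""
    return {"none": state == 0,
            "some": state >= 1,
            "all":  state == 3}[utterance]

def normalize(weights):
    total = sum(weights.values())
    return {k: v / total for k, v in weights.items()}

def literal_listener(utterance):
    # P_L0(state | utterance): truth-conditional, with a uniform state prior.
    return normalize({s: float(meaning(utterance, s)) for s in STATES})

def speaker(state):
    # P_S1(utterance | state) is proportional to exp(ALPHA * log P_L0(state | utterance)).
    scores = {}
    for u in UTTERANCES:
        p = literal_listener(u)[state]
        scores[u] = math.exp(ALPHA * math.log(p)) if p > 0 else 0.0
    return normalize(scores)

def pragmatic_listener(utterance):
    # P_L1(state | utterance) is proportional to P_S1(utterance | state), uniform prior again.
    return normalize({s: speaker(s)[utterance] for s in STATES})

if __name__ == "__main__":
    print(pragmatic_listener("some"))   # probability mass shifts away from state 3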
  4. One and Done? Optimal Decisions From Very Few Samples. Edward Vul, Noah Goodman, Thomas L. Griffiths & Joshua B. Tenenbaum - 2014 - Cognitive Science 38 (4):599-637.
    In many learning or inference tasks human behavior approximates that of a Bayesian ideal observer, suggesting that, at some level, cognition can be described as Bayesian inference. However, a number of findings have highlighted an intriguing mismatch between human behavior and standard assumptions about optimality: People often appear to make decisions based on just one or a few samples from the appropriate posterior probability distribution, rather than using the full distribution. Although sampling-based approximations are a common way to implement Bayesian (...)
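    The core claim of entry 4, that acting on very few posterior samples sacrifices surprisingly little, can be checked with a toy simulation. The sketch below assumes a two-alternative choice with posterior probabilities drawn uniformly at random; this setup is an illustration, not the paper's analysis.

```python
# A toy check of the "one and done" idea: how much accuracy is lost by deciding
# from k posterior samples instead of the full posterior? The two-alternative
# setup and the uniform distribution over posteriors are assumptions.

import random

def simulate(k, trials=20_000, rng=random.Random(0)):
    """Mean accuracy of an agent that draws k samples from its posterior over
    'option A is correct' and chooses by majority vote (random tie-break)."""
    correct = 0
    for _ in range(trials):
        p = rng.random()                     # posterior P(A correct), assumed Uniform(0, 1)
        truth_a = rng.random() < p           # the world, sampled from that posterior
        votes_a = sum(rng.random() < p for _ in range(k))
        choose_a = votes_a * 2 > k or (votes_a * 2 == k and rng.random() < 0.5)
        correct += (choose_a == truth_a)
    return correct / trials

if __name__ == "__main__":
    for k in (1, 2, 5, 100):
        print(f"k={k:3d} samples: accuracy ~ {simulate(k):.3f}")
    # Using the full posterior (choose A iff p > 0.5) gives 0.75 in this setup;
    # a single sample already yields about 2/3, i.e. most of the attainable benefit.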
  5. The language of generalization. Michael Henry Tessler & Noah D. Goodman - 2019 - Psychological Review 126 (3):395-436.
  6. The logical primitives of thought: Empirical foundations for compositional cognitive models. Steven T. Piantadosi, Joshua B. Tenenbaum & Noah D. Goodman - 2016 - Psychological Review 123 (4):392-424.
  7. The double-edged sword of pedagogy: Instruction limits spontaneous exploration and discovery. Elizabeth Bonawitz, Patrick Shafto, Hyowon Gweon, Noah D. Goodman, Elizabeth Spelke & Laura Schulz - 2011 - Cognition 120 (3):322-330.
  8. A Rational Analysis of Rule‐Based Concept Learning. Noah D. Goodman, Joshua B. Tenenbaum, Jacob Feldman & Thomas L. Griffiths - 2008 - Cognitive Science 32 (1):108-154.
    This article proposes a new model of human concept learning that provides a rational analysis of learning feature‐based concepts. This model is built upon Bayesian inference for a grammatically structured hypothesis space—a concept language of logical rules. This article compares the model predictions to human generalization judgments in several well‐known category learning experiments, and finds good agreement for both average and individual participant generalizations. This article further investigates judgments for a broad set of 7‐feature concepts—a more natural setting in several (...)
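    Entry 8's model combines a prior over grammatically generated logical rules with a noisy-label likelihood. A much-reduced sketch of that recipe follows; the tiny hypothesis space (literals plus two-literal conjunctions and disjunctions), the size-based prior, and the noise level EPSILON are illustrative assumptions, not the paper's grammar or parameters.

```python
# A much-reduced "rational rules" style concept learner: Bayesian inference over
# logical formulas built from feature predicates, with a size-based prior and a
# noisy-label likelihood. Hypothesis space, prior, and EPSILON are assumptions.

from itertools import combinations

N_FEATURES = 3
EPSILON = 0.1          # assumed probability that a label disagrees with the rule

def literals():
    for i in range(N_FEATURES):
        yield (f"f{i}", 1, lambda x, i=i: bool(x[i]))
        yield (f"not f{i}", 1, lambda x, i=i: not x[i])

def hypotheses():
    """Single literals plus conjunctions/disjunctions of two literals."""
    lits = list(literals())
    yield from lits
    for (n1, s1, f1), (n2, s2, f2) in combinations(lits, 2):
        yield (f"({n1} and {n2})", s1 + s2 + 1, lambda x, f1=f1, f2=f2: f1(x) and f2(x))
        yield (f"({n1} or {n2})",  s1 + s2 + 1, lambda x, f1=f1, f2=f2: f1(x) or f2(x))

def posterior(data):
    """data: list of (feature tuple, bool label). Returns ({rule: prob}, {rule: fn})."""
    weights, fns = {}, {}
    for name, size, fn in hypotheses():
        prior = 2.0 ** (-size)                               # prefer shorter rules
        lik = 1.0
        for x, y in data:
            lik *= (1 - EPSILON) if fn(x) == y else EPSILON
        weights[name], fns[name] = prior * lik, fn
    z = sum(weights.values())
    return {n: w / z for n, w in weights.items()}, fns

def generalize(data, x_new):
    """Posterior-predictive probability that x_new is a positive example."""
    post, fns = posterior(data)
    return sum(p * ((1 - EPSILON) if fns[n](x_new) else EPSILON) for n, p in post.items())

if __name__ == "__main__":
    data = [((1, 0, 1), True), ((1, 1, 0), True), ((0, 1, 1), False)]
    print(round(generalize(data, (1, 0, 0)), 3))   # new items with f0 look positive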
  9. When redundancy is useful: A Bayesian approach to “overinformative” referring expressions. Judith Degen, Robert D. Hawkins, Caroline Graf, Elisa Kreiss & Noah D. Goodman - 2020 - Psychological Review 127 (4):591-621.
  10. Adjectival vagueness in a Bayesian model of interpretation. Daniel Lassiter & Noah D. Goodman - 2017 - Synthese 194 (10):3801-3836.
    We derive a probabilistic account of the vagueness and context-sensitivity of scalar adjectives from a Bayesian approach to communication and interpretation. We describe an iterated-reasoning architecture for pragmatic interpretation and illustrate it with a simple scalar implicature example. We then show how to enrich the apparatus to handle pragmatic reasoning about the values of free variables, explore its predictions about the interpretation of scalar adjectives, and show how this model implements Edgington’s (Vagueness: a reader, 1997) account of the sorites paradox, (...)
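    The model in entry 10 treats the adjective's threshold as a free variable that the listener infers jointly with the quantity being described. The grid-based sketch below illustrates that joint inference for "tall"; the height grid, Gaussian prior, speaker optimality ALPHA, and utterance cost are assumed values for illustration only, not the paper's.

```python
# A grid-based sketch of joint inference over a degree and a free threshold
# variable for a vague adjective ("tall"). All numerical settings are assumed.

import math

HEIGHTS = [150 + 2 * i for i in range(31)]       # 150..210 cm grid
THRESHOLDS = HEIGHTS[:-1]                        # keep at least one height above each threshold
ALPHA, COST = 4.0, 1.0                           # assumed speaker optimality and cost of speaking

def height_prior(h, mean=175.0, sd=8.0):
    return math.exp(-0.5 * ((h - mean) / sd) ** 2)

def normalize(d):
    z = sum(d.values())
    return {k: v / z for k, v in d.items()}

def literal_listener(utt, theta):
    # "tall" is true of heights above theta; "" (saying nothing) is true everywhere.
    return normalize({h: height_prior(h) * (float(h > theta) if utt == "tall" else 1.0)
                      for h in HEIGHTS})

def speaker(h, theta):
    scores = {}
    for utt, cost in (("tall", COST), ("", 0.0)):
        p = literal_listener(utt, theta)[h]
        scores[utt] = math.exp(ALPHA * (math.log(p) - cost)) if p > 0 else 0.0
    return normalize(scores)

def pragmatic_listener(utt):
    # Joint posterior over (height, threshold) given the utterance.
    joint = {(h, t): height_prior(h) * speaker(h, t)[utt]
             for h in HEIGHTS for t in THRESHOLDS}
    return normalize(joint)

if __name__ == "__main__":
    post = pragmatic_listener("tall")
    mean_h = sum(h * p for (h, _), p in post.items())
    print(f"posterior mean height after hearing 'tall': {mean_h:.1f} cm")  # above the 175 cm prior mean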
  11. The Division of Labor in Communication: Speakers Help Listeners Account for Asymmetries in Visual Perspective. Robert D. Hawkins, Hyowon Gweon & Noah D. Goodman - 2021 - Cognitive Science 45 (3):e12926.
    Recent debates over adults' theory of mind use have been fueled by surprising failures of perspective-taking in communication, suggesting that perspective-taking may be relatively effortful. Yet adults routinely engage in effortful processes when needed. How, then, should speakers and listeners allocate their resources to achieve successful communication? We begin with the observation that the shared goal of communication induces a natural division of labor: The resources one agent chooses to allocate toward perspective-taking should depend on their expectations about the other's (...)
  12. Bootstrapping in a language of thought: A formal model of numerical concept learning. Steven T. Piantadosi, Joshua B. Tenenbaum & Noah D. Goodman - 2012 - Cognition 123 (2):199-217.
  13. (1 other version) The Structure and Dynamics of Scientific Theories: A Hierarchical Bayesian Perspective. Leah Henderson, Noah D. Goodman, Joshua B. Tenenbaum & James F. Woodward - 2010 - Philosophy of Science 77 (2):172-200.
    Hierarchical Bayesian models (HBMs) provide an account of Bayesian inference in a hierarchically structured hypothesis space. Scientific theories are plausibly regarded as organized into hierarchies in many cases, with higher levels sometimes called ‘paradigms’ and lower levels encoding more specific or concrete hypotheses. Therefore, HBMs provide a useful model for scientific theory change, showing how higher‐level theory change may be driven by the impact of evidence on lower levels. HBMs capture features described in the Kuhnian tradition, particularly the idea that (...)
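    Entry 13's central move, letting evidence about specific hypotheses shift belief at the "paradigm" level above them, can be illustrated with a two-level discrete model. The paradigms, their priors over hypotheses, and the data below are invented solely for illustration.

```python
# A toy hierarchical Bayesian model in the spirit of the HBM account of theory
# change: a higher "paradigm" level sets the prior over specific hypotheses,
# and evidence at the bottom propagates upward. All contents are assumed.

# Specific hypotheses: the probability that an experiment succeeds.
HYPOTHESES = [i / 10 for i in range(11)]

# Each paradigm is an (unnormalized) prior over the specific hypotheses.
PARADIGMS = {
    "weak-effect paradigm":   {h: 1.0 - h for h in HYPOTHESES},
    "strong-effect paradigm": {h: h for h in HYPOTHESES},
}
PARADIGM_PRIOR = {"weak-effect paradigm": 0.5, "strong-effect paradigm": 0.5}

def normalize(d):
    z = sum(d.values())
    return {k: v / z for k, v in d.items()}

def joint_posterior(successes, failures):
    """P(paradigm, hypothesis | data) on the discrete grid."""
    joint = {}
    for para, weights in PARADIGMS.items():
        prior_h = normalize(weights)
        for h, ph in prior_h.items():
            lik = (h ** successes) * ((1 - h) ** failures)
            joint[(para, h)] = PARADIGM_PRIOR[para] * ph * lik
    return normalize(joint)

def paradigm_posterior(successes, failures):
    post = joint_posterior(successes, failures)
    out = {}
    for (para, _), p in post.items():
        out[para] = out.get(para, 0.0) + p
    return out

if __name__ == "__main__":
    # A run of successful experiments drives belief toward the strong-effect paradigm.
    print(paradigm_posterior(successes=8, failures=2))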
  14. Learning a theory of causality. Noah D. Goodman, Tomer D. Ullman & Joshua B. Tenenbaum - 2011 - Psychological Review 118 (1):110-119.
  15. Where science starts: Spontaneous experiments in preschoolers’ exploratory play. Claire Cook, Noah D. Goodman & Laura E. Schulz - 2011 - Cognition 120 (3):341-349.
  16. The effects of information utility and teachers’ knowledge on evaluations of under-informative pedagogy across development. Ilona Bass, Elizabeth Bonawitz, Daniel Hawthorne-Madell, Wai Keen Vong, Noah D. Goodman & Hyowon Gweon - 2022 - Cognition 222 (C):104999.
  17. Going beyond the evidence: Abstract laws and preschoolers’ responses to anomalous data. Laura E. Schulz, Noah D. Goodman, Joshua B. Tenenbaum & Adrianna C. Jenkins - 2008 - Cognition 109 (2):211-223.
  18. Computational Models of Emotion Inference in Theory of Mind: A Review and Roadmap. Desmond C. Ong, Jamil Zaki & Noah D. Goodman - 2019 - Topics in Cognitive Science 11 (2):338-357.
    An important, but relatively neglected, aspect of human theory of mind is emotion inference: understanding how and why a person feels a certain way is central to reasoning about their beliefs, desires, and plans. The authors review recent work that has begun to unveil the structure and determinants of emotion inference, organizing them within a unified probabilistic framework.
  19. From partners to populations: A hierarchical Bayesian account of coordination and convention. Robert D. Hawkins, Michael Franke, Michael C. Frank, Adele E. Goldberg, Kenny Smith, Thomas L. Griffiths & Noah D. Goodman - 2023 - Psychological Review 130 (4):977-1016.
  20. The imaginary fundamentalists: The unshocking truth about Bayesian cognitive science. Nick Chater, Noah Goodman, Thomas L. Griffiths, Charles Kemp, Mike Oaksford & Joshua B. Tenenbaum - 2011 - Behavioral and Brain Sciences 34 (4):194-196.
    If Bayesian Fundamentalism existed, Jones & Love's (J&L's) arguments would provide a necessary corrective. But it does not. Bayesian cognitive science is deeply concerned with characterizing algorithms and representations, and, ultimately, implementations in neural circuits; it pays close attention to environmental structure and the constraints of behavioral data, when available; and it rigorously compares multiple models, both within and across papers. J&L's recommendation of Bayesian Enlightenment corresponds to past, present, and, we hope, future practice in Bayesian cognitive science.
  21. Affective cognition: Exploring lay theories of emotion. Desmond C. Ong, Jamil Zaki & Noah D. Goodman - 2015 - Cognition 143 (C):141-162.
  22. Learning to Learn Causal Models. Charles Kemp, Noah D. Goodman & Joshua B. Tenenbaum - 2010 - Cognitive Science 34 (7):1185-1243.
    Learning to understand a single causal system can be an achievement, but humans must learn about multiple causal systems over the course of a lifetime. We present a hierarchical Bayesian framework that helps to explain how learning about several causal systems can accelerate learning about systems that are subsequently encountered. Given experience with a set of objects, our framework learns a causal model for each object and a causal schema that captures commonalities among these causal models. The schema organizes the (...)
  23. Remembrance of inferences past: Amortization in human hypothesis generation. Ishita Dasgupta, Eric Schulz, Noah D. Goodman & Samuel J. Gershman - 2018 - Cognition 178 (C):67-81.
  24. How many kinds of reasoning? Inference, probability, and natural language semantics. Daniel Lassiter & Noah D. Goodman - 2015 - Cognition 136 (C):123-134.
  25. The Strategic Use of Noise in Pragmatic Reasoning. Leon Bergen & Noah D. Goodman - 2015 - Topics in Cognitive Science 7 (2):336-350.
    We combine two recent probabilistic approaches to natural language understanding, exploring the formal pragmatics of communication on a noisy channel. We first extend a model of rational communication between a speaker and listener, to allow for the possibility that messages are corrupted by noise. In this model, common knowledge of a noisy channel leads to the use and correct understanding of sentence fragments. A further extension of the model, which allows the speaker to intentionally reduce the noise rate on a (...)
  26. Characterizing the Dynamics of Learning in Repeated Reference Games. Robert D. Hawkins, Michael C. Frank & Noah D. Goodman - 2020 - Cognitive Science 44 (6):e12845.
    The language we use over the course of conversation changes as we establish common ground and learn what our partner finds meaningful. Here we draw upon recent advances in natural language processing to provide a finer‐grained characterization of the dynamics of this learning process. We release an open corpus (>15,000 utterances) of extended dyadic interactions in a classic repeated reference game task where pairs of participants had to coordinate on how to refer to initially difficult‐to‐describe tangram stimuli. We find that (...)
  27. The Interactions of Rational, Pragmatic Agents Lead to Efficient Language Structure and Use. Benjamin N. Peloquin, Noah D. Goodman & Michael C. Frank - 2020 - Topics in Cognitive Science 12 (1):433-445.
    Despite their diversity, human languages share consistent properties and regularities. Wherefrom does this consistency arise? And does it tell us something about the problem that all languages need to solve? The authors provide an intriguing analysis which focuses on the “communicative function of ambiguity,” whose resolution entailed an equally intriguing “speaker–listener cross‐entropy objective for measuring the efficiency of linguistic systems from first principles of efficient language use.”
  28. Learning causal schemata. Charles Kemp, Noah D. Goodman & Joshua B. Tenenbaum - 2007 - In D. S. McNamara & J. G. Trafton (eds.), Proceedings of the 29th Annual Cognitive Science Society. Cognitive Science Society. pp. 389-394.
  29. A Computational Model of Linguistic Humor in Puns. Justine T. Kao, Roger Levy & Noah D. Goodman - 2016 - Cognitive Science 40 (5):1270-1285.
    Humor plays an essential role in human interactions. Precisely what makes something funny, however, remains elusive. While research on natural language understanding has made significant advancements in recent years, there has been little direct integration of humor research with computational models of language understanding. In this paper, we propose two information-theoretic measures—ambiguity and distinctiveness—derived from a simple model of sentence processing. We test these measures on a set of puns and regular sentences and show that they correlate significantly with human (...)
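    Entry 29's two measures are information-theoretic: ambiguity reflects uncertainty over a sentence's meanings, distinctiveness reflects how differently the sentence's words support each meaning. The sketch below computes simplified versions of both; the per-word relatedness scores are invented placeholders (the paper used human ratings), and the formulas condense the paper's generative model considerably.

```python
# Simplified information-theoretic measures inspired by the pun model above:
# "ambiguity" as the entropy of the posterior over two candidate meanings, and
# "distinctiveness" as a symmetrized KL divergence between the per-word support
# for each meaning. Relatedness numbers below are invented placeholders.

import math

def normalize(xs):
    z = sum(xs)
    return [x / z for x in xs]

def entropy(ps):
    return -sum(p * math.log2(p) for p in ps if p > 0)

def sym_kl(ps, qs):
    def kl(a, b):
        return sum(x * math.log2(x / y) for x, y in zip(a, b) if x > 0)
    return kl(ps, qs) + kl(qs, ps)

def ambiguity_and_distinctiveness(relatedness_m1, relatedness_m2, prior=(0.5, 0.5)):
    """Arguments: per-word relatedness of the sentence's words to meaning 1 / meaning 2."""
    # Posterior over meanings, treating relatedness as unnormalized word likelihoods.
    lik1, lik2 = math.prod(relatedness_m1), math.prod(relatedness_m2)
    meaning_posterior = normalize([prior[0] * lik1, prior[1] * lik2])
    # Which words carry the support for each meaning?
    support1, support2 = normalize(relatedness_m1), normalize(relatedness_m2)
    return entropy(meaning_posterior), sym_kl(support1, support2)

if __name__ == "__main__":
    # A pun-like sentence: both readings stay plausible (high ambiguity) while
    # different words support each reading (high distinctiveness).
    pun = ambiguity_and_distinctiveness([0.9, 0.2, 0.8, 0.1], [0.1, 0.8, 0.2, 0.9])
    plain = ambiguity_and_distinctiveness([0.9, 0.8, 0.9, 0.8], [0.1, 0.2, 0.1, 0.2])
    print(f"pun-like: ambiguity={pun[0]:.2f} bits, distinctiveness={pun[1]:.2f} bits")
    print(f"plain:    ambiguity={plain[0]:.2f} bits, distinctiveness={plain[1]:.2f} bits")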
  30. Extremely costly intensifiers are stronger than quite costly ones. Erin D. Bennett & Noah D. Goodman - 2018 - Cognition 178:147-161.
  31. Cause and intent: Social reasoning in causal learning. Noah D. Goodman, Chris L. Baker & Joshua B. Tenenbaum - 2009 - In N. A. Taatgen & H. van Rijn (eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society. pp. 2759-2764.
  32. How tall is tall? Compositionality, statistics, and gradable adjectives. Lauren A. Schmidt, Noah D. Goodman, David Barner & Joshua B. Tenenbaum - 2009 - In N. A. Taatgen & H. van Rijn (eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society.
  33. Probabilistic semantics and pragmatics: uncertainty in language and thought. Noah D. Goodman & Daniel Lassiter - 1996 - In Shalom Lappin (ed.), The handbook of contemporary semantic theory. Cambridge, Mass., USA: Blackwell Reference.
  34. Informative communication in word production and word learning. Michael C. Frank, Noah D. Goodman, Peter Lai & Joshua B. Tenenbaum - 2009 - In N. A. Taatgen & H. van Rijn (eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society.
  35. Resolving uncertainty in plural predication. Gregory Scontras & Noah D. Goodman - 2017 - Cognition 168 (C):294-311.
  36. Probabilistic programming versus meta-learning as models of cognition. Desmond C. Ong, Tan Zhi-Xuan, Joshua B. Tenenbaum & Noah D. Goodman - 2024 - Behavioral and Brain Sciences 47:e158.
    We summarize the recent progress made by probabilistic programming as a unifying formalism for the probabilistic, symbolic, and data-driven aspects of human cognition. We highlight differences with meta-learning in flexibility, statistical assumptions, and inferences about cognition. We suggest that the meta-learning approach could be further strengthened by considering Connectionist and Bayesian approaches, rather than exclusively one or the other.
  37. Analyzing Machine‐Learned Representations: A Natural Language Case Study. Ishita Dasgupta, Demi Guo, Samuel J. Gershman & Noah D. Goodman - 2020 - Cognitive Science 44 (12):e12925.
    As modern deep networks become more complex, and get closer to human‐like capabilities in certain domains, the question arises as to how the representations and decision rules they learn compare to the ones in humans. In this work, we study representations of sentences in one such artificial system for natural language processing. We first present a diagnostic test dataset to examine the degree of abstract composable structure represented. Analyzing performance on these diagnostic tests indicates a lack of systematicity in representations (...)
  38. Beyond Boolean logic: exploring representation languages for learning complex concepts. Steven T. Piantadosi, Joshua B. Tenenbaum & Noah D. Goodman - 2010 - In S. Ohlsson & R. Catrambone (eds.), Proceedings of the 32nd Annual Conference of the Cognitive Science Society. Cognitive Science Society. pp. 859-864.
  39. Relational and role-governed categories: Views from psychology, computational modeling, and linguistics. Micah B. Goldwater, Noah D. Goodman, Stephen Wechsler & Gregory L. Murphy - 2009 - In N. A. Taatgen & H. van Rijn (eds.), Proceedings of the 31st Annual Conference of the Cognitive Science Society.
  40. Compositionality in rational analysis: Grammar-based induction for concept learning. Noah D. Goodman, Joshua B. Tenenbaum, Thomas L. Griffiths & Jacob Feldman - 2008 - In Nick Chater & Mike Oaksford (eds.), The Probabilistic Mind: Prospects for Bayesian Cognitive Science. Oxford University Press.
  41. Logic, Probability, and Pragmatics in Syllogistic Reasoning. Michael Henry Tessler, Joshua B. Tenenbaum & Noah D. Goodman - 2022 - Topics in Cognitive Science 14 (3):574-601.
  42. Comparing pluralities. Gregory Scontras, Peter Graff & Noah D. Goodman - 2012 - Cognition 123 (1):190-197.
  43. Warm (for Winter): Inferring Comparison Classes in Communication. Michael Henry Tessler & Noah D. Goodman - 2022 - Cognitive Science 46 (3):e13095.
  44. Compositionality in rational analysis: grammar-based induction for concept learning. Noah D. Goodman, Joshua B. Tenenbaum, Thomas L. Griffiths & Jacob Feldman - 2008 - In Nick Chater & Mike Oaksford (eds.), The Probabilistic Mind: Prospects for Bayesian Cognitive Science. Oxford University Press.
  45. Avoiding frostbite: It helps to learn from others. Michael Henry Tessler, Noah D. Goodman & Michael C. Frank - 2017 - Behavioral and Brain Sciences 40.
    Machines that learn and think like people must be able to learn from others. Social learning speeds up the learning process and – in combination with language – is a gateway to abstract and unobservable information. Social learning also facilitates the accumulation of knowledge across generations, helping people and artificial intelligences learn things that no individual could learn in a lifetime.