Search results for 'Syntactic Model'

1000+ found
  1. Julia Ponzio (2009). The Rhythm of Laughter: Derrida's Contribution to a Syntactic Model of Interpretation. Derrida Today 2 (2):234-244. (score: 180.0)
    The focus of this paper is Derrida's idea of rhythm. I will analyse how the idea of rhythm can work in a contemporary semiotic, and in particular in a semiotic of interpretation, in order to eliminate the confusion between interpretation and semantics and to constitute a syntactic model of interpretation. In ‘The Double Session’ Derrida uses the Greek word rytmos in order to indicate the ‘law of spacing’. Rytmos is a form that is always about to change or (...)
  2. Marcelo Da Silva Corrêa & Edward Hermann Haeusler (1997). A Concrete Categorical Model for the Lambek Syntactic Calculus. Mathematical Logic Quarterly 43 (1):49-59. (score: 168.0)
  3. David Reitter, Frank Keller & Johanna D. Moore (2011). A Computational Cognitive Model of Syntactic Priming. Cognitive Science 35 (4):587-637. (score: 156.0)
    The psycholinguistic literature has identified two syntactic adaptation effects in language production: rapidly decaying short-term priming and long-lasting adaptation. To explain both effects, we present an ACT-R model of syntactic priming based on a wide-coverage, lexicalized syntactic theory that explains priming as facilitation of lexical access. In this model, two well-established ACT-R mechanisms, base-level learning and spreading activation, account for long-term adaptation and short-term priming, respectively. Our model simulates incremental language production and in a (...)
  4. Stella Frank, Sharon Goldwater & Frank Keller (2013). Adding Sentence Types to a Model of Syntactic Category Acquisition. Topics in Cognitive Science 5 (3):495-521. (score: 156.0)
    The acquisition of syntactic categories is a crucial step in the process of acquiring syntax. At this stage, before a full grammar is available, only surface cues are available to the learner. Previous computational models have demonstrated that local contexts are informative for syntactic categorization. However, local contexts are affected by sentence-level structure. In this paper, we add sentence type as an observed feature to a model of syntactic category acquisition, based on experimental evidence showing that (...)
  5. Dorit Ben Shalom (2000). Trace Deletion and Friederici's (1995) Model of Syntactic Processing. Behavioral and Brain Sciences 23 (1):22-23. (score: 144.0)
    This commentary discusses the relation between Grodzinsky's target article and Friederici's (1995) model of syntactic processing. The two models can be made more compatible if it is assumed that people with Broca's aphasia have a problem in trace construction rather than trace deletion, and that the process of trace construction takes place during the second early syntactic substage of Friederici's model.
  6. Andrew Carstairs-McCarthy (1998). The Frame/Content Model and Syntactic Evolution. Behavioral and Brain Sciences 21 (4):515-516. (score: 120.0)
    The frame/content theory suggests that chewing was tinkered into speaking. A simple extrapolation of this approach suggests that syllable structure may have been tinkered into syntax. That would explain the widely noted parallels between sentence structure and syllable structure, and also the otherwise mysterious pervasiveness of the grammatical distinction between sentences and noun phrases.
  7. Amit Dubey, Frank Keller & Patrick Sturt (2008). A Probabilistic Corpus-Based Model of Syntactic Parallelism. Cognition 109 (3):326-344. (score: 120.0)
  8. T. A. Cartwright & M. R. Brent (1997). Early Acquisition of Syntactic Categories: A Formal Model. Cognition 63:121-170. (score: 120.0)
  9. Daniel Jurafsky (1996). A Probabilistic Model of Lexical and Syntactic Access and Disambiguation. Cognitive Science 20 (2):137-194. (score: 120.0)
  10. Theo Vosse & Gerard Kempen (2000). Syntactic Structure Assembly in Human Parsing: A Computational Model Based on Competitive Inhibition and a Lexicalist Grammar. Cognition 75 (2):105-143. (score: 120.0)
  11. Sebastian Lutz (2014). What's Right with a Syntactic Approach to Theories and Models? Erkenntnis:1-18. (score: 80.0)
    Syntactic approaches in the philosophy of science, which are based on formalizations in predicate logic, are often considered in principle inferior to semantic approaches, which are based on formalizations with the help of structures. To compare the two kinds of approach, I identify some ambiguities in common semantic accounts and explicate the concept of a structure in a way that avoids hidden references to a specific vocabulary. From there, I argue that contrary to common opinion (i) unintended models do (...)
  12. Cory D. Wright (2000). Eliminativist Undercurrents in the New Wave Model of Psychoneural Reduction. Journal of Mind and Behavior 21 (4):413-436. (score: 66.0)
    "New wave" reductionism aims at advancing a kind of reduction that is stronger than unilateral dependency of the mental on the physical. It revolves around the idea that reduction between theoretical levels is a matter of degree, and can be laid out on a continuum between a "smooth" pole (theoretical identity) and a "bumpy" pole (extremely revisionary). It also entails that both higher and lower levels of the reductive relationship sustain some degree of explanatory autonomy. The new wave predicts that (...)
  13. Mark D. Roberts (2004). P-Model Alternative to the T-Model. Web Journal of Formal, Computational and Logical Linguistics 5:1-18. (score: 66.0)
    Standard linguistic analysis of syntax uses the T-model. This model requires the ordering: D-structure > S-structure > LF, where D-structure is the sentence's deep structure, S-structure is its surface structure, and LF is its logical form. Between each of these representations there is movement which alters the order of the constituent words; movement is achieved using the principles and parameters of syntactic theory. Psychological analysis of sentence production is usually either serial or connectionist. Psychological serial models do (...)
  14. Robert F. Hadley (1991). A Sense-Based, Process Model of Belief. Minds and Machines 1 (3):279-320. (score: 66.0)
    A process-oriented model of belief is presented which permits the representation of nested propositional attitudes within first-order logic. The model (NIM, for nested intensional model) is axiomatized, sense-based (via intensions), and sanctions inferences involving nested epistemic attitudes, with different agents and different times. Because NIM is grounded upon senses, it provides a framework in which agents may reason about the beliefs of another agent while remaining neutral with respect to the syntactic forms used to express the (...)
  15. W. Garrett Mitchener (2011). A Mathematical Model of Prediction-Driven Instability: How Social Structure Can Drive Language Change. [REVIEW] Journal of Logic, Language and Information 20 (3):385-396. (score: 66.0)
    I discuss a stochastic model of language learning and change. During a syntactic change, each speaker makes use of constructions from two different idealized grammars at variable rates. The model incorporates regularization in that speakers have a slight preference for using the dominant idealized grammar. It also includes incrementation: The population is divided into two interacting generations. Children can detect correlations between age and speech. They then predict where the population’s language is moving and speak according to (...)
  16. Judith Gaspers & Philipp Cimiano (2014). A Computational Model for the Item-Based Induction of Construction Networks. Cognitive Science 38 (2):439-488. (score: 66.0)
    According to usage-based approaches to language acquisition, linguistic knowledge is represented in the form of constructions—form-meaning pairings—at multiple levels of abstraction and complexity. The emergence of syntactic knowledge is assumed to be a result of the gradual abstraction of lexically specific and item-based linguistic knowledge. In this article, we explore how the gradual emergence of a network consisting of constructions at varying degrees of complexity can be modeled computationally. Linguistic knowledge is learned by observing natural language utterances in an (...)
  17. Dale Jacquette (1990). Intentionality and Stich's Theory of Brain Sentence Syntax. Philosophical Quarterly 40 (159):169-82. (score: 60.0)
  18. Spyros Galanis (2011). Syntactic Foundations for Unawareness of Theorems. Theory and Decision 71 (4):593-614. (score: 60.0)
    We provide a syntactic model of unawareness. By introducing multiple knowledge modalities, one for each sub-language, we specifically model agents whose only mistake in reasoning (other than their unawareness) is to underestimate the knowledge of more aware agents. We show that the model is a complete and sound axiomatization of the set-theoretic model of Galanis (University of Southampton Discussion paper 709, 2007) and compare it with other unawareness models in the literature.
  19. Stephen P. Stich (1983). From Folk Psychology to Cognitive Science. MIT Press. (score: 60.0)
  20. Marcello Guarini (2001). A Defence of Connectionism Against the "Syntactic" Argument. Synthese 128 (3):287-317. (score: 54.0)
    In "Representations without Rules, Connectionism and the Syntactic Argument'', Kenneth Aizawa argues against the view that connectionist nets can be understood as processing representations without the use of representation-level rules, and he provides a positive characterization of how to interpret connectionist nets as following representation-level rules. He takes Terry Horgan and John Tienson to be the targets of his critique. The present paper marshals functional and methodological considerations, gleaned from the practice of cognitive modelling, to argue against Aizawa's characterization (...)
  21. David Pesetsky, Cyclic Linearization of Syntactic Structure. (score: 54.0)
    This paper proposes an architecture for the mapping between syntax and phonology — in particular, that aspect of phonology that determines ordering. In Fox and Pesetsky (in prep.), we will argue that this architecture, when combined with a general theory of syntactic domains ("phases"), provides a new understanding of a variety of phenomena that have received diverse accounts in the literature. This shorter paper focuses on two processes, both drawn from Scandinavian: the familiar process of Object Shift and the (...)
  22. Willem J. M. Levelt, Antje S. Meyer & Ardi Roelofs (2004). Relations of Lexical Access to Neural Implementation and Syntactic Encoding. Behavioral and Brain Sciences 27 (2):299-301. (score: 54.0)
    How can one conceive of the neuronal implementation of the processing model we proposed in our target article? In his commentary (Pulvermüller 1999, reprinted here in this issue), Pulvermüller makes various proposals concerning the underlying neural mechanisms and their potential localizations in the brain. These proposals demonstrate the compatibility of our processing model and current neuroscience. We add further evidence on details of localization based on a recent meta-analysis of neuroimaging studies of word production (Indefrey & Levelt 2000). (...)
  23. Holly P. Branigan & Martin J. Pickering (2004). Syntactic Representation in the Lemma Stratum. Behavioral and Brain Sciences 27 (2):296-297. (score: 54.0)
    Levelt, Roelofs, & Meyer (henceforth Levelt et al. 1999) propose a model of production incorporating a lemma stratum, which is concerned with the syntactic characteristics of lexical entries. We suggest that syntactic priming experiments provide evidence about how such syntactic information is represented, and that this evidence can be used to extend Levelt et al.'s model. Evidence from syntactic priming experiments also supports Levelt et al.'s conjecture that the lemma stratum is shared between the (...)
  24. Luca Bellotti (2007). Formalization, Syntax and the Standard Model of Arithmetic. Synthese 154 (2):199-229. (score: 54.0)
    I make an attempt at the description of the delicate role of the standard model of arithmetic for the syntax of formal systems. I try to assess whether the possible instability in the notion of finiteness deriving from the nonstandard interpretability of arithmetic affects the very notions of syntactic metatheory and of formal system. I maintain that the crucial point of the whole question lies in the evaluation of the phenomenon of formalization. The ideas of Skolem, Zermelo, Beth (...)
  25. Danny Fox, Cyclic Linearization of Syntactic Structure. (score: 54.0)
    This paper proposes an architecture for the mapping between syntax and phonology — in particular, that aspect of phonology that determines ordering. In Fox and Pesetsky (in prep.), we will argue that this architecture, when combined with a general theory of syntactic domains ("phases"), provides a new understanding of a variety of phenomena that have received diverse accounts in the literature. This shorter paper focuses on two processes, both drawn from Scandinavian: the familiar process of Object Shift and the (...)
  26. George E. Weaver (1994). Syntactic Features and Synonymy Relations: A Unified Treatment of Some Proofs of the Compactness and Interpolation Theorems. Studia Logica 53 (2):325-342. (score: 54.0)
    This paper introduces the notion of syntactic feature to provide a unified treatment of earlier model theoretic proofs of both the compactness and interpolation theorems for a variety of two valued logics including sentential logic, first order logic, and a family of modal sentential logics including M, B, S4 and S5. The compactness papers focused on providing a proof of the consequence formulation which exhibited the appropriate finite subset. A unified presentation of these proofs is given by isolating their essential (...)
  27. Christopher Manning, Natural Language Grammar Induction Using a Constituent-Context Model. (score: 54.0)
    This paper presents a novel approach to the unsupervised learning of syntactic analyses of natural language text. Most previous work has focused on maximizing likelihood according to generative PCFG models. In contrast, we employ a simpler probabilistic model over trees based directly on constituent identity and linear context, and use an EM-like iterative procedure to induce structure. This method produces much higher quality analyses, giving the best published results on the ATIS dataset.
  28. Thomas R. Shultz & Alan C. Bale (2006). Neural Networks Discover a Near-Identity Relation to Distinguish Simple Syntactic Forms. Minds and Machines 16 (2):107-139. (score: 54.0)
    Computer simulations show that an unstructured neural-network model [Shultz, T. R., & Bale, A. C. (2001). Infancy, 2, 501–536] covers the essential features of infant learning of simple grammars in an artificial language [Marcus, G. F., Vijayan, S., Bandi Rao, S., & Vishton, P. M. (1999). Science, 283, 77–80], and generalizes to examples both outside and inside of the range of training sentences. Knowledge-representation analyses confirm that these networks discover that duplicate words in the sentences are nearly identical and that (...)
  29. Nicholas Asher & Hajime Wada (1988). A Computational Account of Syntactic, Semantic and Discourse Principles for Anaphora Resolution. Journal of Semantics 6 (1):309-344. (score: 54.0)
    We present a unified framework for the computational implementation of syntactic, semantic, pragmatic and even “stylistic” constraints on anaphora. We build on our BUILDERS implementation of Discourse Representation (DR) Theory and Lexical Functional Grammar (LFG) discussed in Wada & Asher (1986). We develop and argue for a semantically based processing model for anaphora resolution that exploits a number of desirable features: (1) the partial semantics provided by the discourse representation structures (DRSs) of DR theory, (2) the use of (...)
  30. Jeremy Avigad & Richard Sommer (1997). A Model-Theoretic Approach to Ordinal Analysis. Bulletin of Symbolic Logic 3 (1):17-52. (score: 54.0)
    We describe a model-theoretic approach to ordinal analysis via the finite combinatorial notion of an α-large set of natural numbers. In contrast to syntactic approaches that use cut elimination, this approach involves constructing finite sets of numbers with combinatorial properties that, in nonstandard instances, give rise to models of the theory being analyzed. This method is applied to obtain ordinal analyses of a number of interesting subsystems of first- and second-order arithmetic.
  31. Shimon Edelman, Automatic Acquisition and Efficient Representation of Syntactic Structures. (score: 54.0)
    The distributional principle according to which morphemes that occur in identical contexts belong, in some sense, to the same category [1] has been advanced as a means for extracting syntactic structures from corpus data. We extend this principle by applying it recursively, and by using mutual information for estimating category coherence. The resulting model learns, in an unsupervised fashion, highly structured, distributed representations of syntactic knowledge from corpora. It also exhibits promising behavior in tasks usually thought to (...)
  32. Christopher Manning, A Generative Model for Semantic Role Labeling. (score: 54.0)
    Determining the semantic role of sentence constituents is a key task in determining sentence meanings lying behind a veneer of variant syntactic expression. We present a model of natural language generation from semantics using the FrameNet semantic role and frame ontology. We train the model using the FrameNet corpus and apply it to the task of automatic semantic role and frame identification, producing results competitive with previous work (about 70% role labeling accuracy). Unlike previous models used for (...)
  33. Christopher Manning, Fast Exact Inference with a Factored Model for Natural Language Parsing. (score: 54.0)
    We present a novel generative model for natural language tree structures in which semantic (lexical dependency) and syntactic (PCFG) structures are scored with separate models. This factorization provides conceptual simplicity, straightforward opportunities for separately improving the component models, and a level of performance comparable to similar, non-factored models. Most importantly, unlike other modern parsing models, the factored model admits an extremely effective A* parsing algorithm, which enables efficient, exact inference.
  34. Gianluca Giorgolo, Shalom Lappin & Alexander Clark, Towards a Statistical Model of Grammaticality. (score: 54.0)
    The question of whether it is possible to characterise grammatical knowledge in probabilistic terms is central to determining the relationship of linguistic representation to other cognitive domains. We present a statistical model of grammaticality which maps the probabilities of a statistical model for sentences in parts of the British National Corpus (BNC) into grammaticality scores, using various functions of the parameters of the model. We test this approach with a classifier on test sets containing different levels of (...)
  35. Christopher D. Manning & Bill MacCartney, An Extended Model of Natural Logic. (score: 54.0)
    We propose a model of natural language inference which identifies valid inferences by their lexical and syntactic features, without full semantic interpretation. We extend past work in natural logic, which has focused on semantic containment and monotonicity, by incorporating both semantic exclusion and implicativity. Our model decomposes an inference problem into a sequence of atomic edits linking premise to hypothesis; predicts a lexical semantic relation for each edit; propagates these relations upward through a semantic composition tree according (...)
  36. Dan Klein & Christopher D. Manning, Fast Exact Inference with a Factored Model for Natural Language Parsing. (score: 54.0)
    We present a novel generative model for natural language tree structures in which semantic (lexical dependency) and syntactic (PCFG) structures are scored with separate models. This factorization provides conceptual simplicity, straightforward opportunities for separately improving the component models, and a level of performance comparable to similar, non-factored models. Most importantly, unlike other modern parsing models, the factored model admits an extremely effective A* parsing algorithm, which enables efficient, exact inference.
  37. Francis Jeffry Pelletier, The Effect of Syntactic Form on Simple. (score: 54.0)
    In this paper we report preliminary results on how people revise or update a previously held set of beliefs. When intelligent agents learn new things which conflict with their current belief set, they must revise their belief set. When the new information does not conflict, they merely must update their belief set. Various AI theories have been proposed to achieve these processes. There are two general dimensions along which these theories differ: whether they are syntactic-based or model-based, and (...)
  38. Cynthia A. Thompson, Roger Levy & Christopher D. Manning, A Generative Model for Semantic Role Labeling. (score: 54.0)
    Determining the semantic role of sentence constituents is a key task in determining sentence meanings lying behind a veneer of variant syntactic expression. We present a model of natural language generation from semantics using the FrameNet semantic role and frame ontology. We train the model using the FrameNet corpus and apply it to the task of automatic semantic role and frame identification, producing results competitive with previous work (about 70% role labeling accuracy). Unlike previous models used for (...)
  39. Dan Klein & Christopher D. Manning, Natural Language Grammar Induction Using a Constituent-Context Model. (score: 54.0)
    This paper presents a novel approach to the unsupervised learning of syntactic analyses of natural language text. Most previous work has focused on maximizing likelihood according to generative PCFG models. In contrast, we employ a simpler probabilistic model over trees based directly on constituent identity and linear context, and use an EM-like iterative procedure to induce structure. This method produces much higher quality analyses, giving the best published results on the ATIS dataset.
  40. Matthew W. Crocker, Afra Alishahi, Afsaneh Fazly & Judith Koehne (2012). Sentence-Based Attentional Mechanisms in Word Learning: Evidence From a Computational Model. Frontiers in Psychology 3. (score: 48.0)
    When looking for the referents of nouns, adults and young children are sensitive to cross-situational statistics (Yu & Smith, 2007; Smith & Yu, 2008). In addition, the linguistic context that a word appears in has been shown to act as a powerful attention mechanism for guiding sentence processing and word learning (Landau & Gleitman, 1985; Altmann & Kamide, 1999; Kako & Trueswell, 2000). Koehne & Crocker (2010, 2011) investigate the interaction between cross-situational evidence and guidance from the sentential context (...)
  41. Afra Alishahi, Afsaneh Fazly, Judith Koehne & Matthew W. Crocker (2012). Sentence-Based Attentional Mechanisms in Word Learning: Evidence From a Computational Model. Frontiers in Psychology 3. (score: 48.0)
    When looking for the referents of nouns, adults and young children are sensitive to cross-situational statistics (Yu & Smith, 2007; Smith & Yu, 2008). In addition, the linguistic context that a word appears in has been shown to act as a powerful attention mechanism for guiding sentence processing and word learning (Landau & Gleitman, 1985; Altmann & Kamide, 1999; Kako & Trueswell, 2000). Koehne & Crocker (2010, 2011) investigate the interaction between cross-situational evidence and guidance from the sentential context (...)
  42. D. C. Gooding & T. R. Addis (2008). Modelling Experiments as Mediating Models. Foundations of Science 13 (1):17-35. (score: 46.0)
    Syntactic and structural models specify relationships between their constituents but cannot show what outcomes their interaction would produce over time in the world. Simulation consists in iterating the states of a model, so as to produce behaviour over a period of simulated time. Iteration enables us to trace the implications and outcomes of inference rules and other assumptions implemented in the models that make up a theory. We apply this method to experiments which we treat as models of (...)
  43. Edward P. Stabler (2013). Two Models of Minimalist, Incremental Syntactic Analysis. Topics in Cognitive Science 5 (3):611-633. (score: 42.0)
    Minimalist grammars (MGs) and multiple context-free grammars (MCFGs) are weakly equivalent in the sense that they define the same languages, a large mildly context-sensitive class that properly includes context-free languages. But in addition, for each MG, there is an MCFG which is strongly equivalent in the sense that it defines the same language with isomorphic derivations. However, the structure-building rules of MGs but not MCFGs are defined in a way that generalizes across categories. Consequently, MGs can be exponentially more succinct (...)
  44. Ruth Kempson & Ronnie Cann, Dynamic Syntax and Dialogue Modelling: Preliminaries for a Dialogue-Driven Account of Syntactic Change. (score: 40.0)
  45. Tomasz Skura (1994). Syntactic Refutations Against Finite Models in Modal Logic. Notre Dame Journal of Formal Logic 35 (4):595-605. (score: 40.0)
  46. Kathleen M. Carbary, Ellen E. Frohning & Michael K. Tanenhaus (2010). Context, Syntactic Priming, and Referential Form in an Interactive Dialogue Task: Implications for Models of Alignment. In S. Ohlsson & R. Catrambone (eds.), Proceedings of the 32nd Annual Conference of the Cognitive Science Society. Cognitive Science Society. 109-114. (score: 40.0)
  47. David Chan & Fookkee Chua (1994). Suppression of Valid Inferences: Syntactic Views, Mental Models, and Relative Salience. Cognition 53 (3):217-238. (score: 40.0)
  48. Robert Schwartz (2004). Goodman and the Demise of Syntactic and Semantic Models. In Dov M. Gabbay, John Woods & Akihiro Kanamori (eds.), Handbook of the History of Logic. Elsevier. 10--391. (score: 40.0)
  49. Piotr Kulicki (2013). On Minimal Models for Pure Calculi of Names. Logic and Logical Philosophy 22 (4):429-443. (score: 38.0)
    By pure calculus of names we mean a quantifier-free theory, based on the classical propositional calculus, which defines predicates known from Aristotle’s syllogistic and Leśniewski’s Ontology. For a large fragment of the theory, decision procedures defined by a combination of simple syntactic operations and models in two-membered domains can be used. We compare the system which employs ‘ε’ as the only specific term with the system enriched with functors of Syllogistic. In the former, we do not need an empty (...)
  50. Björn Kralemann & Claas Lattmann (2013). Models as Icons: Modeling Models in the Semiotic Framework of Peirce's Theory of Signs. Synthese 190 (16):3397-3420. (score: 38.0)
    In this paper, we try to shed light on the ontological puzzle pertaining to models and to contribute to a better understanding of what models are. Our suggestion is that models should be regarded as a specific kind of signs according to the sign theory put forward by Charles S. Peirce, and, more precisely, as icons, i.e. as signs which are characterized by a similarity relation between sign (model) and object (original). We argue for this (1) by analyzing from (...)
Results 1–50 of 1000+