Results for 'Semantic vector models'

992 found
  1. Semantic Vector Models and Functional Models for Pregroup Grammars. Anne Preller & Mehrnoosh Sadrzadeh - 2011 - Journal of Logic, Language and Information 20 (4):419-443.
    We show that vector space semantics and functional semantics in two-sorted first order logic are equivalent for pregroup grammars. We present an algorithm that translates functional expressions to vector expressions and vice-versa. The semantics is compositional, variable free and invariant under change of order or multiplicity. It includes the semantic vector models of Information Retrieval Systems and has an interior logic admitting a comprehension schema. A sentence is true in the interior logic if and only (...)
  2. Vector space semantics: A model-theoretic analysis of locative prepositions. [REVIEW] Joost Zwarts & Yoad Winter - 2000 - Journal of Logic, Language and Information 9 (2):169-211.
    This paper introduces a compositional semantics of locative prepositional phrases which is based on a vector space ontology. Model-theoretic properties of prepositions like monotonicity and conservativity are defined in this system in a straightforward way. These notions are shown to describe central inferences with spatial expressions and to account for the grammaticality of preposition modification. Model-theoretic constraints on the set of possible prepositions in natural language are specified, similar to the semantic universals of Generalized Quantifier Theory.
    17 citations
  3. Computational Exploration of Metaphor Comprehension Processes Using a Semantic Space Model. Akira Utsumi - 2011 - Cognitive Science 35 (2):251-296.
    Recent metaphor research has revealed that metaphor comprehension involves both categorization and comparison processes. This finding has triggered the following central question: Which property determines the choice between these two processes for metaphor comprehension? Three competing views have been proposed to answer this question: the conventionality view (Bowdle & Gentner, 2005), aptness view (Glucksberg & Haught, 2006b), and interpretive diversity view (Utsumi, 2007); these views, respectively, argue that vehicle conventionality, metaphor aptness, and interpretive diversity determine the choice between the categorization (...)
    3 citations
  4. What Model-Theoretic Semantics Cannot Do. Ernest Lepore - 1997 - In Peter Ludlow (ed.), Readings in the Philosophy of Language. MIT Press.
    1 citation
  5. A data-driven computational semiotics: The semantic vector space of Magritte’s artworks. Jean-François Chartier, Davide Pulizzotto, Louis Chartrand & Jean-Guy Meunier - 2019 - Semiotica 2019 (230):19-69.
    The rise of big digital data is changing the framework within which linguists, sociologists, anthropologists, and other researchers are working. Semiotics is not spared by this paradigm shift. A data-driven computational semiotics is the study, through intensive use of computational methods, of patterns in human-created content related to semiotic phenomena. One of the most promising frameworks in this research program is that of Semantic Vector Space (SVS) models and their methods. The objective of this article is to (...)
    1 citation
  6. Static and dynamic vector semantics for lambda calculus models of natural language. Mehrnoosh Sadrzadeh & Reinhard Muskens - 2018 - Journal of Language Modelling 6 (2):319-351.
    Vector models of language are based on the contextual aspects of language, the distributions of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, compositional properties of words and how they compose to form sentences. In the truth conditional approach, the denotation of a sentence determines its truth conditions, which can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In (...)
    2 citations
  7. Reasoning with vectors: A continuous model for fast robust inference. Dominic Widdows & Trevor Cohen - 2015 - Logic Journal of the IGPL 23 (2):141-173.
    This article describes the use of continuous vector space models for reasoning with a formal knowledge base. The practical significance of these models is that they support fast, approximate but robust inference and hypothesis generation, which is complementary to the slow, exact, but sometimes brittle behaviour of more traditional deduction engines such as theorem provers. The article explains the way logical connectives can be used in semantic vector models, and summarizes the development of Predication-based (...) Indexing, which involves the use of Vector Symbolic Architectures to represent the concepts and relationships from a knowledge base of subject-predicate-object triples. Experiments show that the use of continuous models for formal reasoning is not only possible, but already demonstrably effective for some recognized informatics tasks, and showing promise in other traditional problem areas. Examples described in this article include: predicting new uses for existing drugs in biomedical informatics; removing unwanted meanings from search results in information retrieval and concept navigation; type inference from attributes; comparing words based on their orthography; and representing tabular data, including modelling numerical values. The algorithms and techniques described in this article are all publicly released and freely available in the Semantic Vectors open-source software package.
    1 citation
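    One of the continuous logical connectives this abstract alludes to, removing an unwanted meaning from a vector, is often realized as negation by orthogonal projection. The sketch below is a toy illustration of that operation with made-up 3-d vectors; the `bank`/`river` example and all values are hypothetical, not data from the article.

    ```python
    import numpy as np

    def vector_not(a, b):
        """Subtract from `a` its projection onto `b`, leaving a vector
        orthogonal to `b`. Toy sketch of a continuous 'NOT' connective;
        real systems apply this to high-dimensional trained vectors."""
        b_hat = b / np.linalg.norm(b)
        return a - np.dot(a, b_hat) * b_hat

    bank = np.array([0.7, 0.7, 0.1])   # hypothetical vector for 'bank'
    river = np.array([0.0, 1.0, 0.0])  # hypothetical unwanted sense

    financial_bank = vector_not(bank, river)
    print(np.dot(financial_bank, river))  # 0.0: the 'river' component is gone
    ```

    The result stays close to the original vector in every other direction, which is what makes this useful for filtering search results rather than discarding them.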
  8. Vector space models of lexical meaning. Stephen Clark - 2015 - In Shalom Lappin & Chris Fox (eds.), Handbook of Contemporary Semantic Theory. Wiley-Blackwell.
     
    4 citations
  9. Lambek vs. Lambek: Functorial vector space semantics and string diagrams for Lambek calculus. Bob Coecke, Edward Grefenstette & Mehrnoosh Sadrzadeh - 2013 - Annals of Pure and Applied Logic 164 (11):1079-1100.
    The Distributional Compositional Categorical model is a mathematical framework that provides compositional semantics for meanings of natural language sentences. It consists of a computational procedure for constructing meanings of sentences, given their grammatical structure in terms of compositional type-logic, and given the empirically derived meanings of their words. For the particular case that the meaning of words is modelled within a distributional vector space model, its experimental predictions, derived from real large scale data, have outperformed other empirically validated methods (...)
    4 citations
  10. Composition in Distributional Models of Semantics. Jeff Mitchell & Mirella Lapata - 2010 - Cognitive Science 34 (8):1388-1429.
    Vector-based models of word meaning have become increasingly popular in cognitive science. The appeal of these models lies in their ability to represent meaning simply by using distributional information under the assumption that words occurring within similar contexts are semantically similar. Despite their widespread use, vector-based models are typically directed at representing words in isolation, and methods for constructing representations for phrases or sentences have received little attention in the literature. This is in marked contrast (...)
    24 citations
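    Two of the simplest composition functions compared in this line of work, vector addition and component-wise multiplication, can be sketched in a few lines. The vectors below are illustrative toy values, not corpus-derived distributional vectors.

    ```python
    import numpy as np

    # Toy distributional vectors for two words (made-up values).
    practical = np.array([0.2, 0.8, 0.1])
    difficulty = np.array([0.6, 0.3, 0.9])

    # Additive composition: p = u + v
    additive = practical + difficulty

    # Multiplicative composition: p_i = u_i * v_i (component-wise product,
    # which emphasizes dimensions on which both words score highly)
    multiplicative = practical * difficulty

    print(additive)
    print(multiplicative)
    ```

    The multiplicative model acts as a soft intersection of the two words' contexts, while the additive model acts as a union; which works better is an empirical question the paper investigates.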
  11. A Type-Driven Vector Semantics for Ellipsis with Anaphora Using Lambek Calculus with Limited Contraction. Gijs Wijnholds & Mehrnoosh Sadrzadeh - 2019 - Journal of Logic, Language and Information 28 (2):331-358.
    We develop a vector space semantics for verb phrase ellipsis with anaphora using type-driven compositional distributional semantics based on the Lambek calculus with limited contraction of Jäger. Distributional semantics has a lot to say about the statistical collocation based meanings of content words, but provides little guidance on how to treat function words. Formal semantics on the other hand, has powerful mechanisms for dealing with relative pronouns, coordinators, and the like. Type-driven compositional distributional semantics brings these two models (...)
    1 citation
  12. The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth. Mark Steyvers & Joshua B. Tenenbaum - 2005 - Cognitive Science 29 (1):41-78.
    We present statistical analyses of the large‐scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small‐world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of connections follow power laws that indicate a scale‐free pattern of connectivity, with most nodes having relatively few connections joined together through a small number of hubs with many connections. These (...)
    80 citations
  15. Identifying the Correlations Between the Semantics and the Phonology of American Sign Language and British Sign Language: A Vector Space Approach. Aurora Martinez del Rio, Casey Ferrara, Sanghee J. Kim, Emre Hakgüder & Diane Brentari - 2022 - Frontiers in Psychology 13.
    Over the history of research on sign languages, much scholarship has highlighted the pervasive presence of signs whose forms relate to their meaning in a non-arbitrary way. The presence of these forms suggests that sign language vocabularies are shaped, at least in part, by a pressure toward maintaining a link between form and meaning in wordforms. We use a vector space approach to test the ways this pressure might shape sign language vocabularies, examining how non-arbitrary forms are distributed within (...)
  16. Items Outperform Adjectives in a Computational Model of Binary Semantic Classification. Evgeniia Diachek, Sarah Brown-Schmidt & Sean M. Polyn - 2023 - Cognitive Science 47 (9):e13336.
    Semantic memory encompasses one's knowledge about the world. Distributional semantic models, which construct vector spaces with embedded words, are a proposed framework for understanding the representational structure of human semantic knowledge. Unlike some classic semantic models, distributional semantic models lack a mechanism for specifying the properties of concepts, which raises questions regarding their utility for a general theory of semantic knowledge. Here, we develop a computational model of a binary (...) classification task, in which participants judged target words for the referent's size or animacy. We created a family of models, evaluating multiple distributional semantic models, and mechanisms for performing the classification. The most successful model constructed two composite representations for each extreme of the decision axis (e.g., one averaging together representations of characteristically big things and another of characteristically small things). Next, the target item was compared to each composite representation, allowing the model to classify more than 1,500 words with human‐range performance and to predict response times. We propose that when making a decision on a binary semantic classification task, humans use task prompts to retrieve instances representative of the extremes on that semantic dimension and compare the probe to those instances. This proposal is consistent with the principles of the instance theory of semantic memory.
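    The decision rule described in this abstract, averaging exemplars at each extreme of the decision axis and comparing the probe to each composite, can be sketched as follows. All vectors and exemplar words are hypothetical toy values, not the embeddings or stimuli used in the study.

    ```python
    import numpy as np

    def cosine(u, v):
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    def classify(target, big_exemplars, small_exemplars):
        """Average exemplar vectors at each pole of the decision axis,
        then return the pole whose composite is closer to the target."""
        big_proto = np.mean(big_exemplars, axis=0)
        small_proto = np.mean(small_exemplars, axis=0)
        return "big" if cosine(target, big_proto) > cosine(target, small_proto) else "small"

    # Hypothetical 2-d vectors for characteristically big / small things.
    big = [np.array([1.0, 0.1]), np.array([0.9, 0.2])]
    small = [np.array([0.1, 1.0]), np.array([0.2, 0.9])]
    probe = np.array([0.8, 0.3])

    print(classify(probe, big, small))
    ```

    The margin between the two cosine scores could also serve as a proxy for decision difficulty, in line with the response-time predictions the model makes.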
  17. Investigating the Extent to which Distributional Semantic Models Capture a Broad Range of Semantic Relations. Kevin S. Brown, Eiling Yee, Gitte Joergensen, Melissa Troyer, Elliot Saltzman, Jay Rueckl, James S. Magnuson & Ken McRae - 2023 - Cognitive Science 47 (5):e13291.
    Distributional semantic models (DSMs) are a primary method for distilling semantic information from corpora. However, a key question remains: What types of semantic relations among words do DSMs detect? Prior work typically has addressed this question using limited human data that are restricted to semantic similarity and/or general semantic relatedness. We tested eight DSMs that are popular in current cognitive and psycholinguistic research (positive pointwise mutual information; global vectors; and three variations each of Skip-gram (...)
  18. Probing the Representational Structure of Regular Polysemy via Sense Analogy Questions: Insights from Contextual Word Vectors. Jiangtian Li & Blair C. Armstrong - 2024 - Cognitive Science 48 (3):e13416.
    Regular polysemes are sets of ambiguous words that all share the same relationship between their meanings, such as CHICKEN and LOBSTER both referring to an animal or its meat. To probe how a distributional semantic model, here exemplified by bidirectional encoder representations from transformers (BERT), represents regular polysemy, we analyzed whether its embeddings support answering sense analogy questions similar to “is the mapping between CHICKEN (as an animal) and CHICKEN (as a meat) similar to that which maps between LOBSTER (...)
    1 citation
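    A sense analogy of the kind described (animal-to-meat mapping shared by CHICKEN and LOBSTER) is often probed with the familiar vector-offset method. The sketch below uses hypothetical static sense vectors with made-up values; the article itself works with contextual BERT embeddings.

    ```python
    import numpy as np

    def nearest(query, vocab):
        """Return the vocabulary entry whose vector is closest to query (cosine)."""
        def cos(u, v):
            return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        return max(vocab, key=lambda w: cos(vocab[w], query))

    # Hypothetical sense vectors: animal senses vs. meat senses (toy values).
    vocab = {
        "chicken_animal": np.array([1.0, 0.0, 0.5]),
        "chicken_meat":   np.array([1.0, 1.0, 0.0]),
        "lobster_animal": np.array([0.0, 0.0, 1.0]),
        "lobster_meat":   np.array([0.0, 1.0, 0.5]),
    }

    # If the animal->meat relation is regular, the same offset should transfer:
    # chicken_meat - chicken_animal + lobster_animal ~ lobster_meat
    offset = vocab["chicken_meat"] - vocab["chicken_animal"]
    query = vocab["lobster_animal"] + offset
    print(nearest(query, vocab))  # lobster_meat
    ```

    Whether such offsets transfer across word pairs is exactly what distinguishes regular polysemy from idiosyncratic ambiguity.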
  19. Context Update for Lambdas and Vectors. Reinhard Muskens & Mehrnoosh Sadrzadeh - 2016 - In Maxime Amblard, Philippe de Groote, Sylvain Pogodalla & Christian Rétoré (eds.), Logical Aspects of Computational Linguistics. Celebrating 20 Years of LACL (1996–2016). Berlin, Germany: Springer. pp. 247-254.
    Vector models of language are based on the contextual aspects of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, the denotations of phrases, and their compositional properties. In the latter approach the denotation of a sentence determines its truth conditions and can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In this short paper, we develop a vector (...)
    2 citations
  20. Probing Lexical Ambiguity: Word Vectors Encode Number and Relatedness of Senses. Barend Beekhuizen, Blair C. Armstrong & Suzanne Stevenson - 2021 - Cognitive Science 45 (5):e12943.
    Lexical ambiguity—the phenomenon of a single word having multiple, distinguishable senses—is pervasive in language. Both the degree of ambiguity of a word (roughly, its number of senses) and the relatedness of those senses have been found to have widespread effects on language acquisition and processing. Recently, distributional approaches to semantics, in which a word's meaning is determined by its contexts, have led to successful research quantifying the degree of ambiguity, but these measures have not distinguished between the ambiguity of words (...)
    2 citations
  21. Exploring What Is Encoded in Distributional Word Vectors: A Neurobiologically Motivated Analysis. Akira Utsumi - 2020 - Cognitive Science 44 (6):e12844.
    The pervasive use of distributional semantic models or word embeddings for both cognitive modeling and practical application is because of their remarkable ability to represent the meanings of words. However, relatively little effort has been made to explore what types of information are encoded in distributional word vectors. Knowing the internal knowledge embedded in word vectors is important for cognitive modeling using distributional semantic models. Therefore, in this paper, we attempt to identify the knowledge encoded in (...)
    3 citations
  22. Evolutionary Semantics of Anthropogenesis and Bioethics of NBIC-Technologies. Valentin Cheshko, Yulia Kosova & Valery Glazko - 2015 - Biogeosystem Technique 5 (3):256-266.
    The co-evolutionary concept of a tri-modal stable evolutionary strategy (SESH) of Homo sapiens is developed. The concept is based on the principle of evolutionary complementarity of anthropogenesis: the value of evolutionary risk and the evolutionary path of human evolution are defined simultaneously by descriptive (evolutionary efficiency) and creative-teleological (evolutionary correctness) parameters, which cannot be instrumentally reduced to one another. The resulting values of both parameters define the vectors of human evolution through a two-gear mechanism, genetic and cultural co-evolution, and techno-humanitarian balance. Explanatory model and (...)
  23. Neuromimetic Semantics: Coordination, Quantification, and Collective Predicates. Harry Howard - 2004 - Elsevier.
    This book attempts to marry truth-conditional semantics with cognitive linguistics in the church of computational neuroscience. To this end, it examines the truth-conditional meanings of coordinators, quantifiers, and collective predicates as neurophysiological phenomena that are amenable to a neurocomputational analysis. Drawing inspiration from work on visual processing, and especially the simple/complex cell distinction in early vision (V1), we claim that a similar two-layer architecture is sufficient to learn the truth-conditional meanings of the logical coordinators and logical quantifiers. As a prerequisite, (...)
  24. Holographic Declarative Memory: Distributional Semantics as the Architecture of Memory. M. A. Kelly, Nipun Arora, Robert L. West & David Reitter - 2020 - Cognitive Science 44 (11):e12904.
    We demonstrate that the key components of cognitive architectures (declarative and procedural memory) and their key capabilities (learning, memory retrieval, probability judgment, and utility estimation) can be implemented as algebraic operations on vectors and tensors in a high‐dimensional space using a distributional semantics model. High‐dimensional vector spaces underlie the success of modern machine learning techniques based on deep learning. However, while neural networks have an impressive ability to process data to find patterns, they do not typically model high‐level cognition, (...)
    1 citation
  25. A Cognition Knowledge Representation Model Based on Multidimensional Heterogeneous Data. Dong Zhong, Yi-An Zhu, Lanqing Wang, Junhua Duan & Jiaxuan He - 2020 - Complexity 2020:1-17.
    The information in the working environment of industrial Internet is characterized by diversity, semantics, hierarchy, and relevance. However, the existing representation methods of environmental information mostly emphasize the concepts and relationships in the environment and have an insufficient understanding of the items and relationships at the instance level. There are also some problems such as low visualization of knowledge representation, poor human-machine interaction ability, insufficient knowledge reasoning ability, and slow knowledge search speed, which cannot meet the needs of intelligent and (...)
    1 citation
  26. Automatic semantic edge labeling over legal citation graphs. Ali Sadeghian, Laksshman Sundaram, Daisy Zhe Wang, William F. Hamilton, Karl Branting & Craig Pfeifer - 2018 - Artificial Intelligence and Law 26 (2):127-144.
    A large number of cross-references to various bodies of text are used in legal texts, each serving a different purpose. It is often necessary for authorities and companies to look into certain types of these citations. Yet, there is a lack of automatic tools to aid in this process. Recently, citation graphs have been used to improve the intelligibility of complex rule frameworks. We propose an algorithm that builds the citation graph from a document and automatically labels each edge according (...)
    2 citations
  27. Some comments on Lehrer semantics. Alfred Schramm - 2012 - Philosophical Studies 161 (1):109-117.
    Lehrer Semantics, as it was devised by Adrienne and Keith Lehrer, is imbedded in a comprehensive web of thought and observations of language use and development, communication, and social interaction, all these as empirical phenomena. Rather than for a theory, I take it for a ‘‘model’’ of the kind which gives us guidance in how to organize linguistic and language-related phenomena. My comments on it are restricted to three aspects: In 2 I deal with the question of how Lehrerian sense (...)
  28. Integrating Semantic Zoning Information with the Prediction of Road Link Speed Based on Taxi GPS Data. He Bing, Xu Zhifeng, Xu Yangjie, Hu Jinxing & Ma Zhanwu - 2020 - Complexity 2020:1-14.
    Road link speed is one of the important indicators for traffic states. In order to incorporate the spatiotemporal dynamics and correlation characteristics of road links into speed prediction, this paper proposes a method based on LDA and GCN. First, we construct a trajectory dataset from map-matched GPS location data of taxis. Then, we use the LDA algorithm to extract the semantic function vectors of urban zones and quantify the spatial dynamic characteristics of road links based on taxi trajectories. Finally, (...)
  29. Emotional Valence Precedes Semantic Maturation of Words: A Longitudinal Computational Study of Early Verbal Emotional Anchoring. José Á. Martínez-Huertas, Guillermo Jorge-Botana & Ricardo Olmos - 2021 - Cognitive Science 45 (7):e13026.
    We present a longitudinal computational study on the connection between emotional and amodal word representations from a developmental perspective. In this study, children's and adult word representations were generated using the latent semantic analysis (LSA) vector space model and Word Maturity methodology. Some children's word representations were used to set a mapping function between amodal and emotional word representations with a neural network model using ratings from 9‐year‐old children. The neural network was trained and validated in the child (...)
    1 citation
  30. Incremental Composition in Distributional Semantics. Matthew Purver, Mehrnoosh Sadrzadeh, Ruth Kempson, Gijs Wijnholds & Julian Hough - 2021 - Journal of Logic, Language and Information 30 (2):379-406.
    Despite the incremental nature of Dynamic Syntax, its semantic grounding remains that of predicate logic, itself grounded in set theory, and so it is poorly suited to expressing the rampantly context-relative nature of word meaning and related phenomena such as the incremental judgements of similarity needed for modelling disambiguation. Here, we show how DS can be assigned a compositional distributional semantics which enables such judgements and makes it possible to incrementally disambiguate language constructs using vector space semantics. (...)
  31. Similarity Judgment Within and Across Categories: A Comprehensive Model Comparison. Russell Richie & Sudeep Bhatia - 2021 - Cognitive Science 45 (8):e13030.
    Similarity is one of the most important relations humans perceive, arguably subserving category learning and categorization, generalization and discrimination, judgment and decision making, and other cognitive functions. Researchers have proposed a wide range of representations and metrics that could be at play in similarity judgment, yet have not comprehensively compared the power of these representations and metrics for predicting similarity within and across different semantic categories. We performed such a comparison by pairing nine prominent vector semantic representations (...)
    3 citations
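    Two of the similarity metrics such model comparisons typically pit against each other, cosine similarity and an inverse transform of Euclidean distance, can be sketched as follows. All vectors are illustrative toy values, not the trained semantic representations the paper evaluates.

    ```python
    import numpy as np

    def cosine_sim(u, v):
        """Angle-based similarity, insensitive to vector magnitude."""
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    def euclidean_sim(u, v):
        """Distance-based similarity, mapped to (0, 1] so larger = more similar."""
        return 1.0 / (1.0 + float(np.linalg.norm(u - v)))

    # Toy vectors: two same-category items and one cross-category item.
    apple = np.array([1.0, 0.2, 0.0])
    pear = np.array([0.9, 0.3, 0.1])
    car = np.array([0.0, 0.1, 1.0])

    print(cosine_sim(apple, pear), cosine_sim(apple, car))
    print(euclidean_sim(apple, pear), euclidean_sim(apple, car))
    ```

    On toy data the two metrics agree, but on real representations they can rank pairs differently, which is why comparing representation-metric combinations within and across categories is informative.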
  32. The Role of Negative Information in Distributional Semantic Learning. Brendan T. Johns, Douglas J. K. Mewhort & Michael N. Jones - 2019 - Cognitive Science 43 (5):e12730.
    Distributional models of semantics learn word meanings from contextual co‐occurrence patterns across a large sample of natural language. Early models, such as LSA and HAL (Landauer & Dumais, 1997; Lund & Burgess, 1996), counted co‐occurrence events; later models, such as BEAGLE (Jones & Mewhort, 2007), replaced counting co‐occurrences with vector accumulation. All of these models learned from positive information only: Words that occur together within a context become related to each other. A recent class of (...)
    2 citations
  33. Verb Metaphoric Extension Under Semantic Strain. Daniel King & Dedre Gentner - 2022 - Cognitive Science 46 (5):e13141.
    Cognitive Science, Volume 46, Issue 5, May 2022.
    1 citation
  34. A vector model for psychophysical judgment. John Ross & Vincent di Lollo - 1968 - Journal of Experimental Psychology 77 (3p2):1.
  35. Network Pseudohealth Information Recognition Model: An Integrated Architecture of Latent Dirichlet Allocation and Data Block Update. Jie Zhang, Pingping Sun, Feng Zhao, Qianru Guo & Yue Zou - 2020 - Complexity 2020:1-12.
    The wanton dissemination of network pseudohealth information has brought great harm to people’s health, life, and property. It is important to detect and identify network pseudohealth information. Based on this, this paper defines the concepts of pseudohealth information, data block, and data block integration, designs an architecture that combines the latent Dirichlet allocation algorithm and data block update integration, and proposes the combination algorithm model. In addition, crawler technology is used to crawl the pseudohealth information transmitted on the Sina Weibo (...)
  36. Research on Quantitative Model of Brand Recognition Based on Sentiment Analysis of Big Data. Lichun Zhou - 2022 - Frontiers in Psychology 13.
    This paper takes laptops as an example to carry out research on quantitative model of brand recognition based on sentiment analysis of big data. The basic idea is to use web crawler technology to obtain the most authentic and direct information of different laptop brands from first-line consumers from public spaces such as buyer reviews of major e-commerce platforms, including review time, text reviews, satisfaction ratings and relevant user information, etc., and then analyzes consumers’ sentimental tendencies and recognition status of (...)
  37. Towards a Cognitive Model of Genre. Vlastimil Zuska - 2000 - Theoria: Revista de Teoría, Historia y Fundamentos de la Ciencia 15 (3):481-495.
    The paper offers a new model of genre. The model employs Deleuze and Guattari's concepts of plane of immanence, chaos, and, in particular, concepts and approaches of cognitive science. Genre in general and the film genre in particular are modelled as a multidimensional space with a network of vector sequences, as a plane of immanence with individual works in the role of concepts, as a cluster category without a centre. That genre model provides more explanatory power than the recent (...)
  38. Using the Skip-Gram Model for Japanese Keyword Extraction Based on News Reports. Miao Teng - 2021 - Complexity 2021:1-9.
    In this paper, we conduct an in-depth study of Japanese keyword extraction from news reports, train external computer document word sets from text preprocessing into word vectors using the Skip-gram model in the deep learning tool Word2Vec, and calculate the cosine distance between word vectors. In this paper, the sliding window in TextRank is designed to connect internal document information to improve the in-text semantic coherence. The main idea is to use not only the statistical and structural features of (...)
  39. Multidimensional vector model of stimulus–response compatibility. Motonori Yamaguchi & Robert W. Proctor - 2012 - Psychological Review 119 (2):272-303.
  40. Semantics as Model-Based Science. Seth Yalcin - 2018 - In Derek Ball & Brian Rabern (eds.), The Science of Meaning: Essays on the Metatheory of Natural Language Semantics. Oxford University Press. pp. 334-360.
    This paper critiques a number of standard ways of understanding the role of the metalanguage in a semantic theory for natural language, including the idea that disquotation plays a nontrivial role in any explanatory natural language semantics. It then proposes that the best way to understand the role of a semantic metalanguage involves recognizing that semantics is a model-based science. The metalanguage of semantics is language for articulating features of the theorist's model. Models are understood as mediating (...)
    Direct download  
     
    Export citation  
     
    Bookmark   14 citations  
  41.  18
    Enhanced Twitter Sentiment Analysis Using Hybrid Approach and by Accounting Local Contextual Semantic.Nisheeth Joshi & Itisha Gupta - 2019 - Journal of Intelligent Systems 29 (1):1611-1625.
    This paper addresses the problem of Twitter sentiment analysis through a hybrid approach in which a SentiWordNet (SWN)-based feature vector acts as input to the classification model Support Vector Machine. Our main focus is to handle the lexical modifier negation during SWN score calculation for the improvement of classification performance. Thus, we present a naive and novel shift approach in which negation acts as both a sentiment-bearing word and a modifier, and then we shift the score of words from SWN based on their (...)
    No categories
    Direct download  
     
    Export citation  
     
    Bookmark   1 citation  
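    The negation-shift idea in entry 41 above can be sketched as a toy scorer in which a negator inverts the polarity of the next sentiment-bearing word. The lexicon values here are made-up illustrations, not actual SentiWordNet scores, and simple sign inversion stands in for the paper's shift rule:

```python
# Hypothetical polarity scores for illustration only (real SWN
# scores are per-synset positive/negative/objective triples).
LEXICON = {"good": 0.8, "bad": -0.7, "happy": 0.9}
NEGATORS = {"not", "never", "no"}

def sentence_score(tokens):
    # Sum word polarities; a pending negator shifts (here: inverts)
    # the score of the next sentiment-bearing word it modifies.
    score = 0.0
    negate = False
    for tok in tokens:
        if tok in NEGATORS:
            negate = True
            continue
        s = LEXICON.get(tok, 0.0)
        if negate and s != 0.0:
            s = -s
            negate = False
        score += s
    return score
```

    A feature vector built from such scores could then be passed to an SVM classifier, as the hybrid approach describes.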
  42. EVOLUTIONARY RISK OF HIGH HUME TECHNOLOGIES. Article 3. EVOLUTIONARY SEMANTICS AND BIOETHICS.V. T. Cheshko, L. V. Ivanitskaya & V. I. Glazko - 2016 - Integrative Anthropology (1):21-27.
    The co-evolutionary concept of a three-modal stable evolutionary strategy of Homo sapiens is developed. The concept is based on the principle of evolutionary complementarity of anthropogenesis: the value of evolutionary risk and the evolutionary path of human evolution are defined simultaneously by descriptive (evolutionary efficiency) and creative-teleological (evolutionary correctness) parameters, which cannot be instrumentally reduced to one another. The resulting values of both parameters define the vectors of biological, social, cultural and techno-rationalistic human evolution through a two-gear mechanism of genetic and cultural co-evolution and techno-humanitarian (...)
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark   1 citation  
  43.  3
    Fuzzy Generalised Quantifiers for Natural Language in Categorical Compositional Distributional Semantics.Matěj Dostál, Mehrnoosh Sadrzadeh & Gijs Wijnholds - 2021 - In Mojtaba Mojtahedi, Shahid Rahman & MohammadSaleh Zarepour (eds.), Mathematics, Logic, and their Philosophies: Essays in Honour of Mohammad Ardeshir. Springer. pp. 135-160.
    Recent work on compositional distributional models shows that bialgebras over finite dimensional vector spaces can be applied to treat generalised quantifiers for natural language. That technique requires one to construct the vector space over powersets, and therefore is computationally costly. In this paper, we overcome this problem by considering fuzzy versions of quantifiers along the lines of Zadeh, within the category of many valued relations. We show that this category is a concrete (...)
    No categories
    Direct download  
     
    Export citation  
     
    Bookmark  
  44.  2
    Pregroup Grammars, Their Syntax and Semantics.Mehrnoosh Sadrzadeh - 2021 - In Claudia Casadio & Philip J. Scott (eds.), Joachim Lambek: The Interplay of Mathematics, Logic, and Linguistics. Springer Verlag. pp. 347-376.
    Pregroup grammars were developed in 1999 and remained Lambek’s preferred algebraic model of grammar. The set-theoretic semantics of pregroups, however, faces an ambiguity problem. In his latest book, Lambek suggests that this problem might be overcome using finite dimensional vector spaces rather than sets. What is the right notion of composition in this setting, direct sum or tensor product of spaces?
    No categories
    Direct download  
     
    Export citation  
     
    Bookmark  
  45. Expanding the vector model for dispositionalist approaches to causation.Joseph A. Baltimore - 2019 - Synthese 196 (12):5083-5098.
    Neuron diagrams are heavily employed in academic discussions of causation. Stephen Mumford and Rani Lill Anjum, however, offer an alternative approach employing vector diagrams, which this paper attempts to develop further. I identify three ways in which dispositionalists have taken the activities of powers to be related: stimulation, mutual manifestation, and contribution combination. While Mumford and Anjum do provide resources for representing contribution combination, which might be sufficient for their particular brand of dispositionalism, I argue that those resources are (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark   3 citations  
  46. Discovering Binary Codes for Documents by Learning Deep Generative Models.Geoffrey Hinton & Ruslan Salakhutdinov - 2011 - Topics in Cognitive Science 3 (1):74-91.
    We describe a deep generative model in which the lowest layer represents the word-count vector of a document and the top layer represents a learned binary code for that document. The top two layers of the generative model form an undirected associative memory and the remaining layers form a belief net with directed, top-down connections. We present efficient learning and inference procedures for this type of generative model and show that it allows more accurate and much faster retrieval than (...)
    No categories
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark   1 citation  
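    The retrieval idea behind entry 46 above (binarize a learned top-layer code, then compare documents by Hamming distance) can be sketched as follows. The threshold, code length, and function names are hypothetical; the paper's actual model learns the codes with a deep generative network, which is not reproduced here:

```python
def to_binary_code(activations, threshold=0.5):
    # Binarize a top-layer activation vector into a document code.
    return tuple(1 if a > threshold else 0 for a in activations)

def hamming(c1, c2):
    # Hamming distance: retrieval compares codes bit by bit.
    return sum(b1 != b2 for b1, b2 in zip(c1, c2))

def nearest(query_code, corpus_codes):
    # Index of the stored document whose code is closest to the query.
    return min(range(len(corpus_codes)),
               key=lambda i: hamming(query_code, corpus_codes[i]))
```

    Because short binary codes can be used as memory addresses or compared with bitwise operations, this kind of lookup is what makes retrieval "much faster" than scanning full word-count vectors.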
  47.  10
    Changes in the midst of a construction network: a diachronic construction grammar approach to complex prepositions denoting internal location.Guillaume Desagulier - 2022 - Cognitive Linguistics 33 (2):339-386.
    Linguists have debated whether complex prepositions deserve a constituent status, but none have proposed a dynamic model that can both predict what construal a given pattern imposes and account for the emergence of non-spatial readings. This paper reframes the debate on constituency as a justification of the constructional status of complex prepositional patterns from a historical perspective. It focuses on the [Prep NP_il of NP_lm] construction, which denotes a relation of internal location between a located entity and a (...)
    No categories
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark  
  48.  5
    Commodity Image Classification Based on Improved Bag-of-Visual-Words Model.Huadong Sun, Xu Zhang, Xiaowei Han, Xuesong Jin & Zhijie Zhao - 2021 - Complexity 2021:1-10.
    With the increasing scale of e-commerce, the complexity of image content makes commodity image classification face great challenges. Image feature extraction often determines the quality of the final classification results. At present, the image feature extraction part mainly includes the underlying visual feature and the intermediate semantic feature. The intermediate semantics of the image acts as a bridge between the underlying features and the advanced semantics of the image, which can make up for the semantic gap to a (...)
    Direct download (2 more)  
     
    Export citation  
     
    Bookmark  
  49. The Semantic or Model-Theoretic View of Theories and Scientific Realism.Anjan Chakravartty - 2001 - Synthese 127 (3):325-345.
    The semantic view of theories is one according to which theories are construed as models of their linguistic formulations. The implications of this view for scientific realism have been little discussed. Contrary to the suggestion of various champions of the semantic view, it is argued that this approach does not make support for a plausible scientific realism any less problematic than it might otherwise be. Though a degree of independence of theory from language may ensure safety from pitfalls associated with logical empiricism, realism cannot be entertained unless (...) or (abstracted and/or idealized) aspects thereof are spelled out in terms of linguistic formulations (such as mathematical equations), which can be interpreted in terms of correspondence with the world. The putative advantage of the semantic approach – its linguistic independence – is thus of no help to the realist. I consider recent treatments of the model-theoretic view (Suppe, Giere, Smith), and find that although some of these accounts harbour the promise of realism, this promise is deceptive. (shrink)
    Direct download (6 more)  
     
    Export citation  
     
    Bookmark   56 citations  
  50.  27
    Algebraic semantics and model completeness for Intuitionistic Public Announcement Logic.Minghui Ma, Alessandra Palmigiano & Mehrnoosh Sadrzadeh - 2014 - Annals of Pure and Applied Logic 165 (4):963-995.
    In the present paper, we start studying epistemic updates using the standard toolkit of duality theory. We focus on public announcements, which are the simplest epistemic actions, and hence on Public Announcement Logic without the common knowledge operator. As is well known, the epistemic action of publicly announcing a given proposition is semantically represented as a transformation of the model encoding the current epistemic setup of the given agents; the given current model being replaced with its submodel relativized to the (...)
    Direct download (3 more)  
     
    Export citation  
     
    Bookmark   9 citations  
1 — 50 / 992