77 found
Profile: Shimon Edelman (Cornell University)
  1.  42 DLs
    Shimon Edelman (1995). Representation, Similarity, and the Chorus of Prototypes. Minds and Machines 5 (1):45-68.
    It is proposed to conceive of representation as an emergent phenomenon that is supervenient on patterns of activity of coarsely tuned and highly redundant feature detectors. The computational underpinnings of the outlined concept of representation are (1) the properties of collections of overlapping graded receptive fields, as in the biological perceptual systems that exhibit hyperacuity-level performance, and (2) the sufficiency of a set of proximal distances between stimulus representations for the recovery of the corresponding distal contrasts between stimuli, as in (...)
  2.  41 DLs
    Shimon Edelman (1998). Representation is Representation of Similarities. Behavioral and Brain Sciences 21 (4):449-467.
    Intelligent systems are faced with the problem of securing a principled (ideally, veridical) relationship between the world and its internal representation. I propose a unified approach to visual representation, addressing both the needs of superordinate and basic-level categorization and of identification of specific instances of familiar categories. According to the proposed theory, a shape is represented by its similarity to a number of reference shapes, measured in a high-dimensional space of elementary features. This amounts to embedding the stimulus in a (...)
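    A minimal sketch of the idea described above: a stimulus is represented by its graded similarities to a handful of reference shapes, yielding a low-dimensional embedding (the Gaussian similarity measure, feature dimensionality, and function names are illustrative assumptions, not the paper's actual implementation).

```python
import numpy as np

def similarity_space_embedding(stimulus, references, sigma=1.0):
    """Represent a stimulus by its graded similarities to reference shapes.

    stimulus   : 1-D array of raw (high-dimensional) shape features
    references : 2-D array, one reference shape per row
    Returns one similarity value per reference shape (a low-dim embedding).
    """
    d = np.linalg.norm(references - stimulus, axis=1)   # distances to references
    return np.exp(-(d ** 2) / (2 * sigma ** 2))         # graded (Gaussian) similarities

# Toy usage: three reference shapes in a 5-dimensional feature space
rng = np.random.default_rng(0)
references = rng.normal(size=(3, 5))
novel_shape = rng.normal(size=5)
print(similarity_space_embedding(novel_shape, references))   # 3-D similarity vector
```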
  3.  27 DLs
    Shimon Edelman & Nathan Intrator (2002). Visual Processing of Object Structure. In M. Arbib (ed.), The Handbook of Brain Theory and Neural Networks. MIT Press
  4.  22 DLs
    Shimon Edelman & Elise M. Breen (1999). On the Virtues of Going All the Way. Behavioral and Brain Sciences 22 (4):614.
    Representational systems need to use symbols as internal stand-ins for distal quantities and events. Barsalou's ideas go a long way towards making the symbol system theory of representation more appealing, by delegating one critical part of the representational burden to image-like entities. The target article, however, leaves the other critical component of any symbol system theory underspecified. We point out that the binding problem can be alleviated if a perceptual symbol system is made to rely on image-like entities not only (...)
  5.  22 DLs
    Shimon Edelman, Complex Cells and Object Recognition.
    Nearest-neighbor correlation-based similarity computation in the space of outputs of complex-type receptive fields can support robust recognition of 3D objects. Our experiments with four collections of objects resulted in mean recognition rates between 84% (for subordinate-level discrimination among 15 quadruped animal shapes) and 94% (for basic-level recognition of 20 everyday objects), over a 40 × 40 range of viewpoints, centered on a stored canonical view and related to it by rotations in depth. This result has interesting implications for the design of (...)
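    A toy sketch of nearest-neighbor, correlation-based recognition over stored feature vectors (the random "receptive-field outputs", object labels, and noise level are illustrative assumptions; the paper's complex-cell filtering stage is omitted).

```python
import numpy as np

def correlation(a, b):
    """Pearson correlation between two vectors of receptive-field outputs."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

def recognize(probe, stored_views, labels):
    """Label a probe view by its most correlated stored canonical view."""
    scores = [correlation(probe, v) for v in stored_views]
    return labels[int(np.argmax(scores))]

# Toy usage: responses of 100 complex-like filters to three stored objects
rng = np.random.default_rng(1)
stored = rng.normal(size=(3, 100))
labels = ["dog", "cup", "chair"]
probe = stored[1] + 0.3 * rng.normal(size=100)   # a perturbed view of object 1
print(recognize(probe, stored, labels))          # -> "cup"
```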
  6.  22 DLs
    Shimon Edelman, Similarity-Based Word Sense Disambiguation.
    We describe a method for automatic word sense disambiguation using a text corpus and a machine-readable dictionary (MRD). The method is based on word similarity and context similarity measures. Words are considered similar if they appear in similar contexts; contexts are similar if they contain similar words. The circularity of this definition is resolved by an iterative, converging process, in which the system learns from the corpus a set of typical usages for each of the senses of the polysemous (...)
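    A toy sketch of how the circular definition can be resolved iteratively: word similarities are recomputed from context similarities and vice versa until the values settle (the tiny corpus, the best-match averaging rule, and the fixed iteration count are illustrative assumptions, not the published algorithm).

```python
import numpy as np

# Toy corpus: each "context" is a bag of word indices
contexts = [[0, 1, 2], [0, 1, 3], [4, 5], [4, 5, 6]]
n_words = 7

W = np.eye(n_words)                    # word-word similarity, initially identity

def context_sim(c1, c2, W):
    """Contexts are similar if they contain similar words (best-match average)."""
    return np.mean([max(W[w1, w2] for w2 in c2) for w1 in c1])

for _ in range(5):                     # iterate; values settle after a few passes
    C = np.array([[context_sim(a, b, W) for b in contexts] for a in contexts])
    W_new = np.eye(n_words)
    for w1 in range(n_words):
        for w2 in range(n_words):
            # words are similar if they appear in similar contexts
            pairs = [C[i, j] for i, a in enumerate(contexts) if w1 in a
                             for j, b in enumerate(contexts) if w2 in b]
            if pairs:
                W_new[w1, w2] = max(pairs)
    W = W_new

print(np.round(W, 2))   # words 0-3 end up mutually similar; so do words 4-6
```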
  7.  21 DLs
    Shimon Edelman, Bridging Computational, Formal and Psycholinguistic Approaches to Language.
    We compare our model of unsupervised learning of linguistic structures, ADIOS [1, 2, 3], to some recent work in computational linguistics and in grammar theory. Our approach resembles the Construction Grammar in its general philosophy (e.g., in its reliance on structural generalizations rather than on syntax projected by the lexicon, as in the current generative theories), and the Tree Adjoining Grammar in its computational characteristics (e.g., in its apparent affinity with Mildly Context Sensitive Languages). The representations learned by our algorithm (...)
  8.  21 DLs
    Shimon Edelman, Tomer Fekete & Neta Zach (eds.) (2012). Being in Time: Dynamical Models of Phenomenal Experience. John Benjamins Pub. Co.
    The chapters comprising this book represent a collective attempt on the part of their authors to redress this aberration.
  9.  20 DLs
    Shimon Edelman, Evolution of Language Diversity: The Survival of the Fitness.
    We examined the role of fitness, commonly assumed without proof to be conferred by the mastery of language, in shaping the dynamics of language evolution. To that end, we introduced island migration (a concept borrowed from population genetics) into the shared lexicon model of communication (Nowak et al., 1999). The effect of fitness linear in language coherence was compared to a control condition of neutral drift. We found that in the neutral condition (no coherence-dependent fitness) even a small migration rate (...)
  10.  20 DLs
    Shimon Edelman & Erich D. Jarvis, Evolution of Dynamic Coordination.
    What insights does comparative biology provide for furthering scientific understanding of the evolution of dynamic coordination? Our discussions covered three major themes: (a) the fundamental unity in functional aspects of neurons, neural circuits, and neural computations across the animal kingdom; (b) brain organization-behavior relationships across animal taxa; and (c) the need for broadly comparative studies of the relationship of neural structures, neural functions, and behavioral coordination. Below we present an overview of neural machinery and computations that are shared (...)
  11.  19 DLs
    Shimon Edelman (1997). Computational Theories of Object Recognition. Trends in Cognitive Sciences 1 (8):296-304.
  12.  18 DLs
    Tomer Fekete & Shimon Edelman (2011). Towards a Computational Theory of Experience. Consciousness and Cognition 20 (3):807-827.
    A standing challenge for the science of mind is to account for the datum that every mind faces in the most immediate – that is, unmediated – fashion: its phenomenal experience. The complementary tasks of explaining what it means for a system to give rise to experience and what constitutes the content of experience (qualia) in computational terms are particularly challenging, given the multiple realizability of computation. In this paper, we identify a set of conditions that a computational theory must (...)
  13.  18 DLs
    Shimon Edelman, Unsupervised Statistical Learning in Vision: Computational Principles, Biological Evidence.
    Unsupervised statistical learning is the standard setting for the development of the only advanced visual system that is both highly sophisticated and versatile, and extensively studied: that of monkeys and humans. In this extended abstract, we invoke philosophical observations, computational arguments, behavioral data and neurobiological findings to explain why computer vision researchers should care about (1) unsupervised learning, (2) statistical inference, and (3) the visual brain. We then outline a neuromorphic approach to structural primitive learning motivated by these considerations, survey (...)
  14.  17 DLs
    Shimon Edelman, (Object Recognition/Multidimensional Scaling/Computational Model).
    differentially rated pairwise similarity when confronted with two pairs of objects, each revolving in a separate window on a computer screen. Subject data were pooled using individually weighted MDS (ref. 11; in all the experiments, the solutions were consistent among subjects). In each trial, the subject had to select among two pairs of shapes the one consisting of the most similar shapes. The subjects were allowed to respond at will; most responded within 10 sec. Proximity (that is, perceived similarity) tables (...)
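    A rough sketch of recovering a low-dimensional configuration from pooled proximity tables via metric multidimensional scaling (scikit-learn is assumed here, and simple averaging of subject tables stands in for the individually weighted MDS used in the study).

```python
import numpy as np
from sklearn.manifold import MDS

# Toy stand-in for the experiment: each subject contributes a table of perceived
# dissimilarities between the same six objects.
rng = np.random.default_rng(2)
true_positions = rng.normal(size=(6, 2))                 # latent 2-D "shape space"
true_dist = np.linalg.norm(true_positions[:, None] - true_positions[None, :], axis=-1)

subjects = [true_dist + 0.1 * np.abs(rng.normal(size=true_dist.shape)) for _ in range(5)]
pooled = sum(subjects) / len(subjects)                    # simple pooling (not INDSCAL)
pooled = (pooled + pooled.T) / 2                          # enforce symmetry
np.fill_diagonal(pooled, 0.0)

# Recover a 2-D configuration whose inter-point distances approximate the pooled data
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
config = mds.fit_transform(pooled)
print(np.round(config, 2))
```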
  15.  17 DLs
    Shimon Edelman (2008). On the Nature of Minds, Or: Truth and Consequences. Journal of Experimental and Theoretical Ai 20:181-196.
    Are minds really dynamical or are they really symbolic? Because minds are bundles of computations, and because computation is always a matter of interpretation of one system by another, minds are necessarily symbolic. Because minds, along with everything else in the universe, are physical, and insofar as the laws of physics are dynamical, minds are necessarily dynamical systems. Thus, the short answer to the opening question is “yes.” It makes sense to ask further whether some of the computations that constitute (...)
  16.  17 DLs
    Shimon Edelman (2011). The Metaphysics of Embodiment. International Journal of Machine Consciousness 3 (2):321-.
    Shanahan’s eloquently argued version of the global workspace theory fits well into the emerging understanding of consciousness as a computational phenomenon. His disinclination toward metaphysics notwithstanding, Shanahan’s book can also be seen as supportive of a particular metaphysical stance on consciousness — the computational identity theory.
  17.  16 DLs
    Shimon Edelman, A New Vision of Language.
    A metaphor that has dominated linguistics for the entire duration of its existence as a discipline views sentences as edifices consisting of Lego-like building blocks. It is assumed that each sentence is constructed (and, on the receiving end, parsed) ab novo, starting (ending) with atomic constituents, to logical semantic specifications, in a recursive process governed by a few precise algebraic rules. The assumptions underlying the Lego metaphor, as it is expressed in generative grammar theories, are: (1) perfect regularity of what (...)
  18.  16 DLs
    Shimon Edelman & Sharon Duvdevani-Bar (1997). Similarity-Based Viewspace Interpolation and the Categorization of 3D Objects. In Proc. Edinburgh Workshop on Similarity and Categorization.
    Visual objects can be represented by their similarities to a small number of reference shapes or prototypes. This method yields low-dimensional (and therefore computationally tractable) representations, which support both the recognition of familiar shapes and the categorization of novel ones. In this note, we show how such representations can be used in a variety of tasks involving novel objects: viewpoint-invariant recognition, recovery of a canonical view, estimation of pose, and prediction of an arbitrary view. The unifying principle in all these (...)
  19.  15 DLs
    Shimon Edelman (2008). Computing the Mind: How the Mind Really Works. Oxford University Press.
    The account that Edelman gives in this book is accessible, yet unified and rigorous, and the big picture he presents is supported by evidence ranging from ...
  20.  15 DLs
    Shimon Edelman, How Seriously Should We Take Minimalist Syntax? A Comment on Lasnik.
    Lasnik’s review of the Minimalist program in syntax [1] offers cognitive scientists help in navigating some of the arcana of the current theoretical thinking in transformational generative grammar. One may observe, however, that this journey is more like a taxi ride gone bad than a free tour: it is the driver who decides on the itinerary, and questioning his choice may get you kicked out. Meanwhile, the meter in the cab of the generative theory of grammar is running, and (...)
  21.  13 DLs
    Shimon Edelman, On the Virtues of Going All the Way A Commentary on Barsalou.
    Supposing the symbol system postulated by Barsalou is perceptual through and through -- what then? The target article outlines an intriguing and exciting theory of cognition in which (1) well-specified, event- or object-linked percepts assume the role traditionally allotted to abstract and arbitrary symbols, and (2) perceptual simulation is substituted for processes traditionally believed to require symbol manipulation, such as deductive reasoning. We take a more extreme stance on the role of perception (in particular, vision) in shaping cognition, and propose, (...)
  22.  13 DLs
    Shimon Edelman, http://kybele.psych.cornell.edu/~edelman.
    The computational program for theoretical neuroscience initiated by Marr and Poggio (1977) calls for a study of biological information processing on several distinct levels of abstraction. At each of these levels — computational (defining the problems and considering possible solutions), algorithmic (specifying the sequence of operations leading to a solution) and implementational — significant progress has been made in the understanding of cognition. In the past three decades, computational principles have been discovered that are common to a wide range of (...)
  23.  13 DLs
    Shimon Edelman, Unsupervised Learning of Visual Structure.
    To learn a visual code in an unsupervised manner, one may attempt to capture those features of the stimulus set that would contribute significantly to a statistically efficient representation (as dictated, e.g., by the Minimum Description Length principle). Paradoxically, all the candidate features in this approach need to be known before statistics over them can be computed. This paradox may be circumvented by confining the repertoire of candidate features to actual scene fragments, which resemble the “what+where” receptive fields found in (...)
  24.  13 DLs
    Shimon Edelman (1995). How Representation Works is More Important Than What Representations Are. Behavioral and Brain Sciences 18 (4):630.
  25.  13 DLs
    Shimon Edelman & Morten H. Christiansen (2003). How Seriously Should We Take Minimalist Syntax? Trends in Cognitive Sciences 7 (2):60-61.
  26.  12 DLs
    Shimon Edelman, Learn the Source and Target Languages. [REVIEW]
    (a) Learn a grammar G_A for the source language (A). (b) Estimate a structural statistical language model SSLM_A for (A). Given a grammar (consisting of terminals and nonterminals) and a partial sentence (a sequence of terminals t_1 ... t_i), an SSLM assigns probabilities to the possible choices of the next terminal t_{i+1}.
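    A minimal sketch of what "assigns probabilities to the possible choices of the next terminal" could look like, with a plain bigram model standing in for the grammar-based structural SLM described in the entry (the toy corpus and function names are illustrative assumptions).

```python
from collections import Counter, defaultdict

def train_bigram_lm(corpus):
    """Estimate P(next terminal | previous terminal) from a toy corpus."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        for prev, nxt in zip(["<s>"] + sentence, sentence + ["</s>"]):
            counts[prev][nxt] += 1
    return {prev: {w: c / sum(nxts.values()) for w, c in nxts.items()}
            for prev, nxts in counts.items()}

def next_terminal_probs(lm, partial_sentence):
    """Given a partial sentence (t_1 ... t_i), return a distribution over t_{i+1}."""
    prev = partial_sentence[-1] if partial_sentence else "<s>"
    return lm.get(prev, {})

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"], ["the", "cat", "ran"]]
lm = train_bigram_lm(corpus)
print(next_terminal_probs(lm, ["the"]))   # cat: 2/3, dog: 1/3
```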
  27.  12 DLs
    Shimon Edelman & Nathan Intrator (2003). Towards Structural Systematicity in Distributed, Statically Bound Visual Representations. Cognitive Science 23 (1):73-110.
    The problem of representing the spatial structure of images, which arises in visual object processing, is commonly described using terminology borrowed from propositional theories of cognition, notably, the concept of compositionality. The classical propositional stance mandates representations composed of symbols, which stand for atomic or composite entities and enter into arbitrarily nested relationships. We argue that the main desiderata of a representational system — productivity and systematicity — can (indeed, for a number of reasons, should) be achieved without recourse to (...)
  28.  12 DLs
    Shimon Edelman, Generalization to Novel Images in Upright and Inverted Faces.
    An image of a face depends not only on its shape, but also on the viewpoint, illumination conditions, and facial expression. A face recognition system must overcome the changes in face appearance induced by these factors. This paper investigates two related questions: the capacity of the human visual system to generalize the recognition of faces to novel images, and the level at which this generalization occurs. We approach these problems by comparing the identification and generalization capacity for upright and (...)
  29.  12 DLs
    Shimon Edelman, Characterizing Motherese: On the Computational Structure of Child-Directed Language.
    We report a quantitative analysis of the cross-utterance coordination observed in child-directed language, where successive utterances often overlap in a manner that makes their constituent structure more prominent, and describe the application of a recently published unsupervised algorithm for grammar induction to the largest available corpus of such language, producing a grammar capable of accepting and generating novel wellformed sentences. We also introduce a new corpus-based method for assessing the precision and recall of an automatically acquired generative grammar without recourse (...)
  30.  12 DLs
    Shimon Edelman, In Theoretical and Philosophical Psychology.
    By what empirical means can a person determine whether he or she is presently awake or dreaming? Any conceivable test addressing this question, which is a special case of the classical metaphysical doubting of reality, must be statistical (for the same reason that empirical science is, as noted by Hume). Subjecting the experienced reality to any kind of statistical test (for instance, a test for bizarreness) requires, however, that a set of baseline measurements be available. In a dream, or in (...)
  31.  12 DLs
    Shimon Edelman, Vision, Reanimated and Reimagined.
    The publication in 1982 of David Marr’s Vision has delivered a singular boost and a course correction to the science of vision. Thirty years later, cognitive science is being transformed by the new ways of thinking about what it is that the brain computes, how it does that, and, most importantly, why cognition requires these computations and not others. This ongoing process still owes much of its impetus and direction to the sound methodology, engaging style, and unique voice of Marr’s (...)
  32.  12 DLs
    Shimon Edelman (2003). Generative Grammar with a Human Face? Behavioral and Brain Sciences 26 (6):675-676.
    The theoretical debate in linguistics during the past half-century bears an uncanny parallel to the politics of the (now defunct) Communist Bloc. The parallels are not so much in the revolutionary nature of Chomsky's ideas as in the Bolshevik manner of his takeover of linguistics (Koerner 1994) and in the Trotskyist (“permanent revolution”) flavor of the subsequent development of the doctrine of Transformational Generative Grammar (TGG) (Townsend & Bever 2001, pp. 37–40). By those standards, Jackendoff is quite a party faithful (...)
  33.  12 DLs
    Shimon Edelman (2008). A Swan, a Pike, and a Crawfish Walk Into a Bar. Journal of Experimental and Theoretical Ai 20:261-268.
    The three commentaries of Van Orden, Spivey and Anderson, and Dietrich (with Markman’s as a backdrop) form a tableau that reminds me of a fable by Ivan Andreevich Krylov (1769 - 1844), in which a swan, a pike, and a crawfish undertake jointly to move a cart laden with goods. What transpires then is not unexpected: the swan strives skyward, the pike pulls toward the river, and the crawfish scrambles backward. The call for papers for the present ecumenically minded special (...)
  34.  11 DLs
    Shimon Edelman, Trade-off Between Capacity and Generalization in a Model of Memory.
    Although computational considerations suggest that a resource-limited memory system may have to trade off capacity for generalization ability, such a trade-off has not been demonstrated in the past. We describe a simple model of memory that exhibits this trade-off and describe its performance in a variety of tasks.
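    A toy illustration of such a capacity/generalization trade-off in a resource-limited memory: when its slots run out, the model merges the two most similar traces, sacrificing exact recall while retaining prototype-like generalization (the slot-merging scheme and all parameters here are assumptions for illustration, not the paper's model).

```python
import numpy as np

class SlotMemory:
    """Toy resource-limited memory: when full, the two closest traces are merged."""
    def __init__(self, n_slots):
        self.n_slots = n_slots
        self.traces = []

    def store(self, item):
        self.traces.append(np.asarray(item, dtype=float))
        if len(self.traces) > self.n_slots:
            # find and merge the two most similar traces
            d = [(np.linalg.norm(a - b), i, j)
                 for i, a in enumerate(self.traces)
                 for j, b in enumerate(self.traces) if i < j]
            _, i, j = min(d)
            merged = (self.traces[i] + self.traces[j]) / 2
            self.traces = [t for k, t in enumerate(self.traces) if k not in (i, j)]
            self.traces.append(merged)

    def recall(self, probe):
        """Return the stored trace nearest to the probe."""
        probe = np.asarray(probe, dtype=float)
        return min(self.traces, key=lambda t: np.linalg.norm(t - probe))

# Toy usage: 8 items from two tight clusters forced into 3 memory slots
rng = np.random.default_rng(3)
mem = SlotMemory(n_slots=3)
items = np.concatenate([rng.normal(0, 0.1, size=(4, 2)), rng.normal(5, 0.1, size=(4, 2))])
for x in items:
    mem.store(x)
print(mem.recall(items[0]))      # likely a blurred, merged trace, not items[0] itself
print(mem.recall([5.0, 5.0]))    # a trace near the second cluster: generalization survives
```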
  35.  11 DLs
    Shimon Edelman, Unsupervised Efficient Learning and Representation of Language Structure.
    We describe a linguistic pattern acquisition algorithm that learns, in an unsupervised fashion, a streamlined representation of corpus data. This is achieved by compactly coding recursively structured constituent patterns, and by placing strings that have an identical backbone and similar context structure into the same equivalence class. The resulting representations constitute an efficient encoding of linguistic knowledge and support systematic generalization to unseen sentences.
  36.  11 DLs
    Shimon Edelman (2003). But Will It Scale Up? Not Without Representations. Adaptive Behavior 11:273-275.
  37.  11 DLs
    Shimon Edelman, Unsupervised Context Sensitive Language Acquisition From a Large Corpus.
    We describe a pattern acquisition algorithm that learns, in an unsupervised fashion, a streamlined representation of linguistic structures from a plain natural-language corpus. This paper addresses the issues of learning structured knowledge from a large-scale natural language data set, and of generalization to unseen text. The implemented algorithm represents sentences as paths on a graph whose vertices are words (or parts of words). Significant patterns, determined by recursive context-sensitive statistical inference, form new vertices. Linguistic constructions are represented by (...)
  38.  11 DLs
    Shimon Edelman, Automatic Acquisition and Efficient Representation of Syntactic Structures.
    The distributional principle according to which morphemes that occur in identical contexts belong, in some sense, to the same category [1] has been advanced as a means for extracting syntactic structures from corpus data. We extend this principle by applying it recursively, and by using mutual information for estimating category coherence. The resulting model learns, in an unsupervised fashion, highly structured, distributed representations of syntactic knowledge from corpora. It also exhibits promising behavior in tasks usually thought to require representations anchored (...)
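    A small sketch of the distributional idea paired with a mutual-information coherence score: words that occur in identical (left, right) contexts yield near-zero mutual information between word identity and context, marking a coherent candidate category (the toy corpus, the context definition, and the low-MI-equals-coherence reading are illustrative assumptions, not the paper's algorithm).

```python
import math
from collections import Counter

corpus = [
    "the cat sat on the mat",
    "the dog sat on the mat",
    "the cat ran to the door",
    "the dog ran to the door",
]

def context_counts(words):
    """Count (left, right) contexts for each target word in the toy corpus."""
    counts = {w: Counter() for w in words}
    for sentence in corpus:
        toks = sentence.split()
        for i, tok in enumerate(toks[1:-1], start=1):
            if tok in counts:
                counts[tok][(toks[i - 1], toks[i + 1])] += 1
    return counts

def category_mi(words):
    """Mutual information between word identity and context within a candidate
    category; near-zero MI means the words are interchangeable (coherent category)."""
    counts = context_counts(words)
    total = sum(sum(c.values()) for c in counts.values())
    mi = 0.0
    for w, ctxs in counts.items():
        p_w = sum(ctxs.values()) / total
        for ctx, n in ctxs.items():
            p_wc = n / total
            p_c = sum(counts[v][ctx] for v in counts) / total
            mi += p_wc * math.log2(p_wc / (p_w * p_c))
    return mi

print(category_mi(["cat", "dog"]))   # 0.0: interchangeable -> coherent category
print(category_mi(["cat", "sat"]))   # 1.0: disjoint contexts -> not a coherent category
```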
  39.  11 DLs
    Shimon Edelman, Representing 3D Objects by Sets of Activities of Receptive Fields.
    Idealized models of receptive fields (RFs) can be used as building blocks for the creation of powerful distributed computation systems. The present report concentrates on investigating the utility of collections of RFs in representing 3D objects under changing viewing conditions. The main requirement in this task is that the pattern of activity of RFs vary as little as possible when the object and the camera move relative to each other. I propose a method for representing (...)
  40.  11 DLs
    Shimon Edelman, The Role of Hierarchy in Learning to Categorize Images.
    Converging evidence from anatomical studies (Maunsell, 1983) and functional analyses (Hubel & Wiesel, 1968) of the nervous system suggests that the feed-forward pathway of the mammalian perceptual system follows a largely hierarchic organization scheme. This may be because hierarchic structures are intrinsically more viable and thus more likely to evolve (Simon, 2002). But it may also be because objects in our environment have a hierarchic structure and the perceptual system has evolved to match it. We conducted a behavioral experiment to (...)
  41.  10 DLs
    Shimon Edelman, Variation Sets Facilitate Artificial Language Learning.
    Variation set structure — partial alignment of successive utterances in child-directed speech — has been shown to correlate with progress in the acquisition of syntax by children. The present study demonstrates that arranging a certain proportion of utterances in a training corpus in variation sets facilitates word segmentation and phrase structure learning in miniature artificial languages by adults. Our findings have implications for understanding the mechanisms of L1 acquisition by children, and for the development of more efficient algorithms for (...)
  42.  10 DLs
    Shimon Edelman, On Look-Ahead in Language: Navigating a Multitude of Familiar Paths.
    Language is a rewarding field if you are in the prediction business. A reader who is fluent in English and who knows how academic papers are typically structured will readily come up with several possible guesses as to where the title of this section could have gone, had it not been cut short by the ellipsis. Indeed, in the more natural setting of spoken language, anticipatory processing is a must: performance of machine systems for speech interpretation depends critically on the (...)
  43.  10 DLs
    Shimon Edelman, Learning Syntactic Constructions From Raw Corpora.
    Construction-based approaches to syntax (Croft, 2001; Goldberg, 2003) posit a lexicon populated by units of various sizes, as envisaged by (Langacker, 1987). Constructions may be specified completely, as in the case of simple morphemes or idioms such as take it to the bank, or partially, as in the expression what’s X doing Y?, where X and Y are slots that admit fillers of particular types (Kay and Fillmore, 1999). Constructions offer an intriguing alternative to traditional rule-based syntax by hinting at (...)
  44.  9 DLs
    Shimon Edelman, Some Tests of an Unsupervised Model of Language Acquisition.
    We outline an unsupervised language acquisition algorithm and offer some psycholinguistic support for a model based on it. Our approach resembles the Construction Grammar in its general philosophy, and the Tree Adjoining Grammar in its computational characteristics. The model is trained on a corpus of transcribed child-directed speech (CHILDES). The model’s ability to process novel inputs makes it capable of taking various standard tests of English that rely on forced-choice judgment and on magnitude estimation of linguistic acceptability. We report encouraging (...)
  45.  9 DLs
    Shimon Edelman (2014). How to Write a 'How-to-Build-a-Brain' Book. Trends in Cognitive Sciences 18 (3):118-119.
  46.  9 DLs
    Oren Kolodny, Arnon Lotem & Shimon Edelman (2014). Learning a Generative Probabilistic Grammar of Experience: A Process‐Level Model of Language Acquisition. Cognitive Science 38 (4):227-267.
    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this (...)
  47.  8 DLs
    Shimon Edelman, Vision Reanimated.
    Computer vision systems are, on most counts, poor performers, when compared to their biological counterparts. The reason for this may be that computer vision is handicapped by an unreasonable assumption regarding what it means to see, which became prevalent as the notions of intrinsic images and of representation by reconstruction took over the field in the late 1970’s. Learning from biological vision may help us to overcome this handicap.
  48.  8 DLs
    Catherine Caldwell-Harris & Shimon Edelman, Tracks in the Mind: Differential Entrenchment of Common and Rare Liturgical and Everyday Multiword Phrases in Religious and Secular Hebrew Speakers.
    We tested the hypothesis that more frequent exposure to multiword phrases results in deeper entrenchment of their representations, by examining the performance of subjects of different religiosity in the recognition of briefly presented liturgical and secular phrases drawn from several frequency classes. Three of the sources were prayer texts that religious Jews are required to recite on a daily, weekly, and annual basis, respectively; two others were common and rare expressions encountered in the general secular Israeli culture. As expected, linear (...)
  49.  8 DLs
    Shimon Edelman & Tomer Fekete (2012). Being in Time. In Shimon Edelman, Tomer Fekete & Neta Zach (eds.), Being in Time: Dynamical Models of Phenomenal Experience. John Benjamins. 81-88.
  50.  8 DLs
    Catherine Caldwell-Harris & Shimon Edelman, Measuring Mental Entrenchment of Phrases with Perceptual Identification, Familiarity Ratings, and Corpus Frequency Statistics.
    Word recognition is the Petri dish of the cognitive sciences. The processes hypothesized to govern naming, identifying and evaluating words have shaped this field since its origin in the 1970s. Techniques to measure lexical processing are not just the back-bone of the typical experimental psychology laboratory, but are now routinely used by cognitive neuroscientists to study brain processing and increasingly by social and clinical psychologists (Eder, Hommel, and De Houwer 2007). Models developed to explain lexical processing have also aspired to (...)
1 — 50 / 77