13 found
  1. Seeing to hear better: evidence for early audio-visual interactions in speech identification. Jean-Luc Schwartz, Frédéric Berthommier & Christophe Savariaux - 2004 - Cognition 93 (2):69-78.
    Lip reading is the ability to partially understand speech by looking at the speaker's lips. It improves the intelligibility of speech in noise when audio-visual perception is compared with audio-only perception. A recent set of experiments showed that seeing the speaker's lips also enhances sensitivity to acoustic information, decreasing the auditory detection threshold of speech embedded in noise [J. Acoust. Soc. Am. 109 (2001) 2272; J. Acoust. Soc. Am. 108 (2000) 1197]. However, detection is different from comprehension, and it remains (...)
  2. Transfer of sensorimotor learning reveals phoneme representations in preliterate children. Tiphaine Caudrelier, Lucie Ménard, Pascal Perrier, Jean-Luc Schwartz, Silvain Gerber, Camille Vidou & Amélie Rochet-Capellan - 2019 - Cognition 192 (C):103973.
  3. Non-local mind from the perspective of social cognition. Jonas Chatel-Goldman, Jean-Luc Schwartz, Christian Jutten & Marco Congedo - 2013 - Frontiers in Human Neuroscience 7.
  4. The complementary roles of auditory and motor information evaluated in a Bayesian perceptuo-motor model of speech perception. Raphaël Laurent, Marie-Lou Barnaud, Jean-Luc Schwartz, Pierre Bessière & Julien Diard - 2017 - Psychological Review 124 (5):572-602.
  5. The shadow of a doubt? Evidence for perceptuo-motor linkage during auditory and audiovisual close-shadowing. Lucie Scarbel, Denis Beautemps, Jean-Luc Schwartz & Marc Sato - 2014 - Frontiers in Psychology 5.
  6. Multisensory and sensorimotor interactions in speech perception. Kaisa Tiippana, Riikka Möttönen & Jean-Luc Schwartz - 2015 - Frontiers in Psychology 6.
  7. Attention-based maintenance of speech forms in memory: The case of verbal transformations. Christian Abry, Marc Sato, Jean-Luc Schwartz, Hélène Loevenbruck & Marie-Agnès Cathiard - 2003 - Behavioral and Brain Sciences 26 (6):728-729.
    One of the fundamental questions raised by Ruchkin, Grafman, Cameron, and Berndt's (Ruchkin et al.'s) interpretation that there are no distinct specialized neural networks for short-term storage buffers and long-term memory systems is that of the link between perception and memory processes. In this framework, we take the opportunity in this commentary to discuss a specific working memory task involving percept formation, temporary retention, auditory imagery, and the attention-based maintenance of information, that is, the verbal transformation effect.
  8. A new puzzle for the evolution of speech? Christian Abry, Louis-Jean Boë, Rafael Laboissière & Jean-Luc Schwartz - 1998 - Behavioral and Brain Sciences 21 (4):512-513.
    We agree with MacNeilage's claim that speech stems from a volitional vocalization pathway between the cingulate and the supplementary motor area (SMA). We add the vocal self-monitoring system as the first recruitment of the Broca-Wernicke circuit. SMA control for “frames” is supported by wrong consonant-vowel recurring utterance aphasia and an imaging study of quasi-reiterant speech. The role of Broca's area is questioned in the emergence of “content,” because a primary motor mapping, embodying peripheral constraints, seems sufficient. Finally, we reject (...)
  9. (1 other version) Introduction: Vocalize to Localize? A call for better crosstalk between auditory and visual communication systems researchers: From meerkats to humans. Christian Abry, Anne Vilain & Jean-Luc Schwartz - 2005 - Interaction Studies 5 (3):313-325.
  10. Integrate, yes, but what and how? A computational approach of sensorimotor fusion in speech. Raphaël Laurent, Clément Moulin-Frier, Pierre Bessière, Jean-Luc Schwartz & Julien Diard - 2013 - Behavioral and Brain Sciences 36 (4):364-365.
    We consider a computational model comparing the possible roles of association and simulation in phonetic decoding, demonstrating that these two routes can contain similar information in some communication situations and highlighting situations where their decoding performance differs. We conclude that optimal decoding should involve some sort of fusion of association and simulation in the human brain.
  11. Orofacial somatosensory inputs modulate word segmentation in lexical decision. Rintaro Ogane, Jean-Luc Schwartz & Takayuki Ito - 2020 - Cognition 197 (C):104163.
  12. Phonology grounded in sensorimotor speech: Elements of a morphogenesis theory. Jean-Luc Schwartz - 2007 - Interaction Studies 5:313-324.
  13. (2 other versions) Building a talking baby robot. Jihène Serkhane, Jean-Luc Schwartz & Pierre Bessière - 2005 - Interaction Studies 6 (2):253-286.
    Speech is a perceptuo-motor system. A natural computational modeling framework is provided by cognitive robotics, or more precisely speech robotics, which is also based on embodiment, multimodality, development, and interaction. This paper describes the bases of a virtual baby robot, which consists of an articulatory model that integrates the non-uniform growth of the vocal tract, a set of sensors, and a learning model. The articulatory model delivers sagittal contour, lip shape and acoustic formants from seven input parameters that characterize the (...)