Cognitive Science 36 (7):1204-1223 (2012)
Most everyday tasks involve multiple modalities, which raises the question of how the processing of these modalities is coordinated by the cognitive system. In this paper, we focus on the coordination of visual attention and linguistic processing during speaking. Previous research has shown that objects in a visual scene are fixated before they are mentioned, leading us to hypothesize that a participant's scan pattern can be used to predict what he or she will say. We test this hypothesis using a data set of cued descriptions of photo-realistic scenes. We demonstrate that similar scan patterns are correlated with similar sentences, both within and between visual scenes, and that this correlation holds for three phases of the language production process (target identification, sentence planning, and speaking). We also present a simple algorithm that uses scan patterns to accurately predict associated sentences via similarity-based retrieval.
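The similarity-based retrieval algorithm described in the abstract can be sketched roughly as follows. This is an illustrative reconstruction, not the authors' implementation: it assumes scan patterns are represented as sequences of fixated object labels, uses a generic sequence-matching ratio as the similarity measure (the paper's own measure may differ), and returns the sentence paired with the most similar training scan pattern.

```python
from difflib import SequenceMatcher

def scan_similarity(a, b):
    """Similarity of two scan patterns, each a sequence of fixated
    object labels, via a normalized matching-subsequence ratio."""
    return SequenceMatcher(None, a, b).ratio()

def predict_sentence(query_scan, training_data):
    """Retrieve the sentence paired with the training scan pattern
    most similar to the query scan pattern."""
    _, best_sentence = max(
        training_data,
        key=lambda pair: scan_similarity(query_scan, pair[0]))
    return best_sentence

# Hypothetical training pairs of (scan pattern, produced sentence).
training = [
    (["man", "bench", "dog"], "A man sits on a bench next to a dog."),
    (["car", "tree", "house"], "A car is parked by a tree near a house."),
]

print(predict_sentence(["man", "dog", "bench"], training))
# prints "A man sits on a bench next to a dog."
```

A speaker's eye movements during target identification or planning would thus serve as a query against previously recorded scan-pattern/sentence pairs.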
Keywords: scan patterns; scene understanding; cross-modal processing; similarity measures; language production; eye movements
Similar books and articles
Shaun P. Vecera (2000). Toward a Biased Competition Account of Object-Based Segregation and Attention. Brain and Mind 1 (3):353-384.
A. J. Greene, R. D. Easton & L. S. R. LaShell (2001). Visual-Auditory Events: Cross-Modal Perceptual Priming and Recognition Memory. Consciousness and Cognition 10 (3):425-435.
Max Louwerse & Louise Connell (2011). A Taste of Words: Linguistic Context and Perceptual Simulation Predict the Modality of Words. Cognitive Science 35 (2):381-398.
Valerie Gray Hardcastle (2001). Visual Perception is Not Visual Awareness. Behavioral and Brain Sciences 24 (5):985-985.
Ruud Koolen, Martijn Goudbeek & Emiel Krahmer (2013). The Effect of Scene Variation on the Redundant Use of Color in Definite Reference. Cognitive Science 37 (2):395-411.
Michael Kubovy & Michael Schutz (2010). Audio-Visual Objects. Review of Philosophy and Psychology 1 (1):41-61.
Anuenue Kukona & Whitney Tabor (2011). Impulse Processing: A Dynamical Systems Model of Incremental Eye Movements in the Visual World Paradigm. Cognitive Science 35 (6):1009-1051.
David A. Leopold, Melanie Wilke, Alexander Maier & Nikos K. Logothetis (2002). Stable Perception of Visually Ambiguous Patterns. Nature Neuroscience 5 (6):605-609.
Jason S. McCarley & Gregory J. DiGirolamo (2001). One Visual System with Two Interacting Visual Streams. Behavioral and Brain Sciences 25 (1):112-113.
R. Rensink (2000). Visual Search for Change: A Probe Into the Nature of Attentional Processing. Visual Cognition 7:345-376.
Athanasios Raftopoulos (2009). Reference, Perception, and Attention. Philosophical Studies 144 (3):339-360.