
Cognition

Volume 206, January 2021, 104487

Vividness of recollection is supported by eye movements in individuals with high, but not low trait autobiographical memory

https://doi.org/10.1016/j.cognition.2020.104487

Abstract

There are marked individual differences in the recollection of personal past events, or autobiographical memory (AM). Theory concerning the relationship between mnemonic and visual systems suggests that eye movements promote retrieval of spatiotemporal details from memory, yet assessment of this prediction within naturalistic AM has been limited. We examined the relationship of eye movements to free recall of naturalistic AM and how this relationship is modulated by individual differences in AM capacity. Participants freely recalled past episodes while viewing a blank screen under free and fixed viewing conditions. Memory performance was quantified with the Autobiographical Interview, which separates internal (episodic) and external (non-episodic) details. In Study 1, as a proof of concept, fixation rate was predictive of the number of internal (but not external) details recalled across both free and fixed viewing. In Study 2, using an experimenter-controlled staged event (a museum-style tour), the effect of fixations on free recall of internal (but not external) details was again observed. In this second study, however, the fixation-recall relationship was modulated by individual differences in autobiographical memory, such that the coupling between fixations and internal details was greater for those endorsing higher than lower episodic AM. These results suggest that those with congenitally strong AM rely on the visual system to produce episodic details, whereas those with lower AM retrieve such details via other mechanisms.

Introduction

There is strong evidence from neuroimaging studies that the visual system supports recollection, particularly when vivid, perceptual details regarding scenes must be recalled from memory (Greenberg & Knowlton, 2014; Hassabis & Maguire, 2007; Rubin & Umanath, 2015). There are marked individual differences in episodic AM, with some having vivid re-experiencing, and others for whom personal experiences lack visual richness even to the point where they are experienced as generic semantic facts (Palombo et al., 2018). Variation in episodic AM has been linked to variability in functional connectivity of the medial temporal lobe (MTL) memory network with posterior regions responsible for visual perception and imagery (Petrican, Palombo, Sheldon, & Levine, 2020; Sheldon, Farb, Palombo, & Levine, 2016). Individuals with aphantasia (i.e., lack of visual imagery abilities) report impaired AM (Dawes, Keogh, Andrillon, & Pearson, 2020; Greenberg & Knowlton, 2014; Zeman, Dewar, & Sala, 2015).

Relatedly, eye tracking research using laboratory materials suggests that the way in which we explore the world with our eyes is important for encoding details into memory (e.g., Henderson, Williams, & Falk, 2005), and that eye movements are instrumental for subsequently retrieving those details from memory (e.g., Johansson, Holsanova, Dewhurst, & Holmqvist, 2012). Recall of past personal events has been related to the number of saccadic eye movements (El Haj & Lenoble, 2018), especially when event cues evoke vivid and emotional memories (El Haj, Nandrino, Antoine, Boucart, & Lenoble, 2017).

Eye movements may benefit recollection through the reinstatement of spatiotemporal context (Wynn, Shen, & Ryan, 2019) that in turn enables vivid re-experiencing (Ryan, Shen, & Liu, 2020). The oculomotor system may focus internal attention within memorized space (van Ede, Chekroud, & Nobre, 2019), as mediated by the extensive structural and functional connections between the oculomotor and MTL systems (Ryan et al., 2020; Shen, Bezgin, Selvam, MacIntosh, & Ryan, 2016). Yet little research has explored the relationship between visual exploration and quantified episodic AM. Prior studies of eye movements in AM have relied on subjective reports to measure AM (including those using subjective ratings of vividness and detail, e.g., El Haj et al., 2017; Lenoble, Janssen, & El Haj, 2018), and most have manipulated eye movements via task instructions without directly measuring eye movements (for exception, see Lenoble et al., 2018). Moreover, it is unclear how visual exploration behaviour relates to individual differences in AM.

We monitored eye movements in conjunction with a standardized method for evoking and assessing AMs, the Autobiographical Interview (AI; Levine, Svoboda, Hay, Winocur, & Moscovitch, 2002). In this widely used procedure, freely recalled memories are segmented into internal (episodic) and external (non-episodic) details. Internal (but not external) details reflect episodic recollection and are associated with brain structure and function in networks supporting episodic AM, particularly the medial temporal lobes (e.g., Hodgetts et al., 2017; Miller et al., 2020). Simultaneous monitoring of eye movements during free recall of AMs using the AI allowed us to test the prediction that episodic richness, as measured by internal (but not external) details, would be related to increased visual exploration. There is some evidence that fixating one's eyes compromises the construction and vividness of both laboratory memories (Johansson et al., 2012; Johansson & Johansson, 2014) and AMs (Lenoble et al., 2018) compared to free viewing on a blank screen, which would be consistent with the notion that eye movements facilitate visual imagery in recollection. In our initial ‘proof-of-concept’ Study 1, we predicted that free viewing (specifically, the number or rate of gaze fixations) would be associated with increased internal detail production relative to fixed viewing, with no effect on external details, for events occurring up to one year prior to testing in healthy adults. Gaze fixations are well known to correspond to informative regions of the visual environment that are characterized by particular features or meaning contained within the external stimulus (Henderson & Hayes, 2017; Yarbus, 1967). Even blank regions can be informative; information previously presented in the now-blank space is retrieved and used to support task performance (Brandt & Stark, 1997; Laeng & Teodorescu, 2002; Richardson & Spivey, 2000).
Gaze fixations are the discrete stops made by the eyes for the purpose of extracting information. The number of gaze fixations has been consistently shown to predict subsequent recognition memory (Damiano & Walther, 2019; Loftus, 1972) and is positively correlated with functional activity in the hippocampus (Liu, Shen, Olsen, & Ryan, 2017; for review, see Ryan et al., 2020), a region which has a critical role in episodic autobiographical memory retrieval (Miller et al., 2020; Rosenbaum et al., 2008). For these reasons, we used gaze fixations as our metric of interest in the present study to explore the relationship between visual exploration and autobiographical memory retrieval.
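To make the metric concrete, here is a minimal sketch (not the authors' analysis code) of how fixation rate can be computed from detected fixation events; the function name and the millisecond units are assumptions for illustration only:

```python
def fixation_rate(fixation_onsets_ms, recall_duration_ms):
    """Fixations per second during a recall period.

    fixation_onsets_ms: onset time (ms) of each detected fixation,
        as produced by an eye tracker's fixation-detection algorithm.
    recall_duration_ms: total length of the recall period (ms).
    """
    if recall_duration_ms <= 0:
        raise ValueError("recall duration must be positive")
    return len(fixation_onsets_ms) / (recall_duration_ms / 1000.0)

# 240 fixations over a 2-minute recall period -> 2.0 fixations per second
print(fixation_rate(list(range(240)), 120_000))
```

Normalizing by recall duration rather than using raw counts lets trials of different lengths be compared on a common scale.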

The predicted relationship between the amount of visual exploration (here, gaze fixations) and freely recalled internal (but not external) details held, yet our prediction that this would be specific to the free viewing condition was not supported. In Study 2, we sought to replicate this effect with a staged event that controlled for encoding characteristics and delay interval, and to incorporate individual differences in a larger sample (n = 91). In the staged event, participants experienced a museum-style audio-guided tour, and then freely recalled the tour two days later. Critically, we assessed individual differences in trait episodic AM capacities using a previously validated questionnaire (Palombo, Williams, Abdi, & Levine, 2013), recruiting participants representing the full range of trait episodic AM. We predicted that the relationship between visual exploration (as assessed by eye movements) and vivid recollection (as assessed by AI internal details) would be moderated by individual differences in trait episodic AM, such that individuals with high trait episodic AM would show a greater degree of coupling between eye movements and internal details than individuals with low trait episodic AM. Such a finding would suggest that eye movements support the degree to which one experiences rich recollection during AM.

Section snippets

Study 1 – Method

Forty-seven young adults (24 females, mean age = 23.3 years, mean education = 16.6 years) were recruited from the Rotman Research Institute volunteer database or from social media postings. Participants were healthy, between the ages of 18 and 31, and had no history of significant psychiatric or neurologic disease, significant traumatic brain injury, or use of medications affecting cognition. Participants selected autobiographical events from within the past year (but not the past week) prior to coming

Study 1 – Results

To verify that the samples from Study 1a and Study 1b were suitably homogeneous to be merged into a combined sample, sex, age, and years of education were included as unstructured covariates. Our model yielded a significant main effect of detail type (b = 8.98, SE = 0.58, F(1, 132.33) = 515.22, p < .001, R2 = 0.796), showing that participants produced more internal than external details across viewing conditions. The main effect of fixation rate fell short of significance (b = 0.00, SE = 0.01, F

Study 2 – Method

Ninety-one healthy young adults (66 females, mean age = 24.77 years [range: 18–35], mean education = 16.56 years) participated. These participants were drawn from a pool of 683 who responded to social media postings to complete online surveys, including the Survey of Autobiographical Memory (SAM), a self-report measure that includes ratings for episodic memory, semantic memory, spatial memory, and future thinking abilities (Palombo et al., 2013), the Object-Spatial Imagery Questionnaire (OSIQ;

Study 2 – Results

To test the hypothesis that individual differences in episodic AM would moderate the effect of fixations on memory details observed in Study 1, we ran a linear mixed-effects model testing the effect of fixation rate on details as described above, but with episodic SAM score included in the model. As predicted, episodic SAM score significantly interacted with fixation rate and detail type in predicting the number of details (b = 5.23 × 10⁻⁴, SE = 4.57 × 10⁻⁴, F(1, 259.88) = 4.41, p = .037, R² = 0.017; see
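An interaction like this is typically unpacked with the simple-slopes technique (Bauer & Curran, 2005, cited below): the slope of detail count on fixation rate is evaluated at representative values of the moderator. A minimal sketch of that arithmetic follows; the coefficient values and SAM scores here are illustrative assumptions, not the paper's fitted estimates:

```python
def simple_slope(b_fix, b_inter, sam_score):
    # Slope of detail count on fixation rate at a given episodic SAM score,
    # for a model containing a fixation-rate x SAM interaction term:
    #   details = b0 + b_fix*rate + b_sam*SAM + b_inter*rate*SAM + ...
    # Differentiating with respect to rate gives b_fix + b_inter*SAM.
    return b_fix + b_inter * sam_score

# Illustrative (hypothetical) coefficients and moderator values:
b_fix, b_inter = 0.01, 5.0e-4
low_sam, high_sam = 80, 120
print(simple_slope(b_fix, b_inter, low_sam))   # shallower fixation-detail coupling
print(simple_slope(b_fix, b_inter, high_sam))  # steeper fixation-detail coupling
```

With a positive interaction coefficient, the fixation-recall slope grows with episodic SAM score, which is the pattern the study reports: stronger coupling for high than low trait episodic AM.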

Discussion

What are the mechanisms by which contextually rich AMs are constructed? Converging anatomical (e.g., Ryan et al., 2020), experimental (e.g., Lenoble et al., 2018), and clinical (e.g., Zeman et al., 2015) evidence suggests a key role for processing of visual images evoked during AM retrieval, although there is little empirical research testing this hypothesis in a naturalistic context. We investigated this hypothesis from the oculomotor perspective, quantifying episodic richness from free recall

Declaration of Competing Interest

The authors declare no conflicts of interest.

Acknowledgements

Thanks to Alissa Papadopoulos, Laura Oliva, Douglas McQuiggan, Arber Kacollja, Malcom Binns, Raluca Petrican, Carina Fan, Yushu Wang, and Ryan Aloysius for their contributions to the research described here. This research was supported by a CIHR grant awarded to Brian Levine and an NSERC grant awarded to Jennifer D. Ryan.

References (59)

  • D.C. Richardson et al. (2000). Representation, space and Hollywood squares: Looking at things that aren’t there anymore. Cognition.
  • S. Sheldon et al. (2016). Intrinsic medial temporal lobe connectivity relates to individual differences in episodic autobiographical remembering. Cortex.
  • A. Zeman et al. (2015). Lives without imagery – Congenital aphantasia. Cortex.
  • I. Anusic et al. (2009). The nature and structure of correlations among big five ratings: The halo-alpha-beta model. Journal of Personality and Social Psychology.
  • M.J. Armson et al. (2017). Bridging naturalistic and laboratory assessment of memory: The Baycrest mask fit test. Memory.
  • M.J. Armson et al. (2019). Maintaining fixation does not increase demands on working memory relative to free viewing. PeerJ.
  • D. Bates et al. (2012). lme4: Linear mixed-effects models using S4 classes (R package).
  • D.J. Bauer et al. (2005). Probing interactions in fixed and multilevel regression: Inferential and graphical techniques. Multivariate Behavioral Research.
  • O. Blajenkova et al. (2006). Object-spatial imagery: A new self-report imagery questionnaire. Applied Cognitive Psychology.
  • M.B. Bone et al. (2018). Eye movement reinstatement and neural reactivation during mental imagery. Cerebral Cortex.
  • S. Brandt et al. (1997). Spontaneous eye movements during visual imagery reflect the content of the visual scene. Journal of Cognitive Neuroscience.
  • C. Cherici et al. (2018). Precision of sustained fixation in trained and untrained observers. Journal of Vision.
  • L.J. Cronbach (1957). The two disciplines of scientific psychology. American Psychologist.
  • P.J. Curran et al. Testing and probing interactions in hierarchical linear growth models.
  • A.J. Dawes et al. (2020). A cognitive profile of multi-sensory imagery, memory and dreaming in aphantasia. Scientific Reports.
  • F. van Ede et al. (2019). Human gaze tracks attentional focusing in memorized visual space. Nature Human Behaviour.
  • N.B. Diamond et al. (2020). Linking detail to temporal structure in naturalistic event recall. Psychological Science.
  • C.L. Fan et al. (2020). Episodic and spatial memory abilities are independent. Memory & Cognition.
  • D.L. Greenberg et al. (2014). The role of visual imagery in autobiographical memory. Memory & Cognition.