
ORIGINAL RESEARCH article

Front. Psychol., 11 June 2019
Sec. Human-Media Interaction
This article is part of the Research Topic The Impact of Virtual and Augmented Reality on Individuals and Society.

Age-Related Differences With Immersive and Non-immersive Virtual Reality in Memory Assessment

  • 1National Institute of Mental Health, Klecany, Czechia
  • 2Third Faculty of Medicine, Charles University, Prague, Czechia

Memory decline associated with physiological aging and age-related neurological disorders has a direct impact on quality of life for seniors. With demographic aging, the assessment of cognitive functions is gaining importance, as early diagnosis can lead to more effective cognitive interventions. In comparison to classic paper-and-pencil approaches, virtual reality (VR) could offer an ecologically valid environment for assessment and remediation of cognitive deficits. Despite the rapid development and application of new technologies, the results of studies aimed at the role of VR immersion in assessing cognitive performance and the use of VR in aging populations are often ambiguous. VR can be presented in a less immersive form, with a desktop platform, or with more advanced technologies like head-mounted displays (HMDs). Both these VR platforms are associated with certain advantages and disadvantages. In this study, we investigated age-related differences related to the use of desktop and HMD platforms during memory assessment using an intra-subject design. Groups of seniors (N = 36) and young adults (N = 25) completed a virtual Supermarket Shopping task using desktop and HMD platforms in a counterbalanced order. Our results show that the senior performances were superior when using the non-immersive desktop platform. The ability to recall a shopping list in the young adult group remained stable regardless of the platform used. With the HMD platform, the performance of the subjects of both groups seemed to be more influenced by fatigue. The evaluated user experiences did not differ between the two platforms, and only minimal and rare side effects were reported by seniors. This implies that highly immersive technology has good acceptance among aging adults. These findings might have implications for the further use of HMD in cognitive assessment and remediation.

Introduction

Cognitive functions play an important role in our everyday lives, governing our thoughts and actions and enabling successful adaptation to changes occurring in the surrounding environment (Sternberg et al., 2012). Our cognitive abilities can be affected during aging by common physiological processes and by neuropsychiatric and neurological disorders such as Alzheimer’s disease (AD) and vascular impairments. In the context of demographic aging, with adults over 65 years of age forming 15% of the entire United States population (United States Census Bureau, 2018) and 19.2% of the European Union population (Eurostat, 2018), the problems associated with older age are gaining in importance. Physiological aging is typically accompanied by decline across all cognitive domains, mainly in processing speed, divided attention, language, visuospatial abilities, memory, and executive functions (Harada et al., 2013). The most robust and visible manifestation of physiological aging is memory decline (Rönnlund et al., 2005), which is also subjectively the most relevant for seniors (Harada et al., 2013). Episodic memory plays an important role in AD diagnostics. The deficit in episodic memory in seniors is strongly pronounced and can be demonstrated both in errors of recent autobiographical memory and in laboratory assessments using recall and recognition tasks (Rönnlund et al., 2005). The deficit in episodic memory is detectable using neuropsychological measurements up to 10 years before the diagnosis of AD; it could therefore possibly be used as a marker for early diagnosis (Bäckman et al., 2001; Boraxbekk et al., 2015). Early diagnosis can result in better-timed and more effective interventions, which might delay further progression of the cognitive decline (Naqvi et al., 2013). Thus, in the light of increasing life expectancy, the assessment of age-related memory changes is growing in relevance.

Memory deficit is usually assessed using classic paper-and-pencil neuropsychological methods; such methods have been questioned for their lack of ecological validity since 1978 (Neisser, 1978). Ecological validity can be understood as the degree to which experimental conditions approximate conditions in the real-world environment (Tupper and Cicerone, 1990) or the extent to which the test performance or study results can be generalized to real-life settings (Franzen, 1997). Classic neuropsychological tests fail to resemble real-world demands, and there has been increasing interest in neuroscience in the use of advanced technology (Parsons, 2015). Computer technologies enable precise test administration, stimulus presentation, and automatic response recording. Virtual reality (VR) is gaining in popularity due to its ability to present three-dimensional objects and create complex virtual environments (VE) that might be realistic and ecologically valid while also being precisely controllable (Parsons, 2015).

An important term linked to VR is immersion. Immersion was defined by Slater (2009) as a characteristic of the technology used for VE presentation; essentially, the higher the quality of the system, the higher the level of immersion (for example, in terms of tracking latency, the size of the field of view, or the visual quality of the scene and images). Immersion is also determined by the ability of the system to support sensorimotor contingencies, i.e., how the technology responds to the actions the user performs in order to perceive the environment, e.g., turning the head to change the gaze direction (O’Regan and Noë, 2001).

Despite the obvious benefits of HMD technology (multisensory stimulation, tracking of head and body movements, a higher sense of presence), the results of previous studies are not conclusive regarding the advantages of HMD in assessing cognitive performance, nor regarding its usability in the senior population. Previous studies have shown superior performance either using HMD (Bowman et al., 2009; Murcia-López and Steed, 2016) or using less immersive technology, such as desktop or large-screen platforms (Ruddle et al., 1999; Mania and Chalmers, 2001; Sousa Santos et al., 2009). Moreover, the majority of the studies comparing HMD and less immersive technologies in terms of cognitive performance have focused on navigation or spatial memory (Ruddle et al., 1999; Bowman et al., 2009; Sousa Santos et al., 2009; Murcia-López and Steed, 2016); few studies have investigated other cognitive domains (Mania and Chalmers, 2001; Rand et al., 2005). The findings concerning preference and usability of HMD seem to be more consistent, showing a preference for higher-immersion technologies, mainly in terms of increased motivation (e.g., Moreno and Mayer, 2004; Richards and Taylor, 2015; Parong and Mayer, 2018), more intuitive action control, and greater enjoyment associated with task fulfillment (e.g., Sousa Santos et al., 2009). Most of these studies (except Rand et al., 2005) were conducted on young subjects; their findings cannot be easily generalized to the senior population. There is not enough evidence indicating the applicability and acceptance of HMD for cognitive assessment and training in seniors.

The aims of our study are:

• To evaluate the possible effects of immersion level on episodic memory performance for diagnostic purposes;

• To evaluate user experiences of immersive and non-immersive technology across different age groups; and

• To test the validity of a memory task designed in a complex ecologically valid virtual environment in young adults and seniors in terms of the applied immersion level.

We used an intra-subject design to investigate the role of the level of immersion in performance and user experience during memory assessment. We were interested in the difference in acceptance as evaluated by seniors (60 years and older) and by young adults (up to 40 years old). HMD has previously been considered more intuitive and motivating (Martínez-Arán et al., 2004; Richards and Taylor, 2015; Parong and Mayer, 2018). We therefore hypothesized that the platform used would affect user experience. We also expected to find differences between the platforms in memory performance, as the more immersive technology is seen as more engaging and thus might result in better cognitive outcomes. This hypothesis is in contrast with some previous findings that associate the HMD platform with lower cognitive performance. We speculate that recent innovations in the technology of virtual glasses might lead to a different outcome.

Materials and Methods

Participants

Thirty-six seniors (13 males and 23 females, mean age = 69.47; SD = 7.39; age range = 60–91) and 25 young adults (9 males and 16 females, mean age = 25.4; SD = 5.13; age range = 19–39) voluntarily participated in this study. All participants signed an informed consent form containing information about the experiment procedure and exclusion criteria. The study was approved by the ethics committee of the NIMH in Klecany. Seniors were recruited from the database of the Department of Cognitive Disorders (NIMH) where they were neuropsychologically evaluated and classified as cognitively healthy. Young adults were recruited from the NIMH database of healthy volunteers to be matched in sex and education level to the group of seniors. Participants were not included in the study if they had major neurological disorders, diagnosed psychiatric illness, recent traumatic brain injury, brain surgery, or another illness involving major visual or movement impairment that would prevent them from participating in the experiment. The groups did not differ in demographic characteristics (apart from age). Detailed characteristics of the groups of seniors and young adults are presented in Table 1. Figure 1 presents group-specific distributions of characteristics related to the computer/videogame experience obtained from the usability questionnaire (see section “Usability Questionnaire”).

Table 1. Summary table of demographic characteristics for individual age groups.

Figure 1. Distribution of group characteristics related to their experience with computers and virtual reality. The graphs show the frequency of the answers to the specific statements from the usability questionnaire part I (see Table 2).

Cognitive Evaluation

All participants were assessed using standard neuropsychological methods to briefly evaluate their cognitive performance, particularly learning and declarative memory, psychomotor speed, and mental flexibility.

The Czech version of the Rey Auditory Verbal Learning Test (RAVLT) (Rey, 1964; Preiss, 1999) was used as a standard measure of episodic memory (Pause et al., 2013) evaluating verbal learning and delayed recall. For the group comparison we used the total number of recalled words (RAVLT I-V) and the number of words correctly recalled after a 30-min delay (RAVLT delayed).

The Czech version of the Trail Making Test (TMT) (Reitan and Wolfson, 1985; Preiss and Preiss, 2006) was used as a standard measure of psychomotor speed and attention. Part A (TMT-A) evaluates psychomotor speed and visual attention; part B (TMT-B) is focused on visuospatial working memory and mental flexibility.

The Virtual Supermarket Shopping Task

The virtual Supermarket Shopping Task (vSST) was specifically designed using Unity Engine software1 for assessing episodic memory in an ecologically valid environment. The desktop version of the task was tested on patients with chronic schizophrenia and on healthy young adults (Plechatá, 2017; Plechatá et al., 2017). Other than feasibility testing in a pilot study using both desktop and HMD platforms, no sample of seniors has previously been assessed using the vSST task. The task was originally created in order to assess everyday functioning in a virtual environment that reflects real-world situations. The task is similar to neuropsychological multiple errand tasks, but it is performed in virtual reality, which ensures a safe environment and complete control over the presented stimuli (Parsons, 2015). A similar fully immersive shopping task was recently validated as a measure of episodic memory performance (Corriveau Lecavalier et al., 2018).

The virtual environment of the vSST resembles a grocery store in which the subject is supposed to remember a shopping list and later find and collect the recalled items in the virtual shop. Prior to the beginning of the testing, the participant has time to explore the VE and to become familiar with the control system. The length of the exploration phase differed according to the platform used (10 min for HMD and 4 min for desktop). Each trial of the vSST consists of two phases: the acquisition phase (presentation of the shopping list) and the recall phase (testing the recall of the shopping list by direct collection of the individual items in the virtual supermarket). Between the acquisition and recall phases, participants were instructed to play a visuospatial game, the LEU Brain Stimulator2, for 3 min as a distraction task. The length of the delay was directly controlled by the vSST application, and the countdown was displayed on the screen.

The vSST had four consecutive levels of increasing difficulty (requiring remembering three, five, seven, and nine items on the shopping list). The first trial, with three items, was meant as a pretraining trial and its results were not further analyzed. The length of the acquisition phase increased automatically by 5 s for each item added to the list (i.e., 15 s for three items; 25 s for five items; 35 s for seven items; 45 s for nine items). After completing each recall phase, the results (number of errors, trial time, and trajectory) were presented to the participant. The beginning of the next acquisition phase was controlled by the participant, who could start off the next trial by pressing a confirmation button with the mouse or with the HTC VIVE controller.
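
The timing rule described above can be summarized in a short sketch (illustrative only; the actual task logic is implemented in the Unity application, and the names used here are hypothetical):

```python
# Minimal sketch of the vSST trial schedule described above; illustrative only,
# the actual logic runs inside the Unity application.
TRIAL_ITEMS = [3, 5, 7, 9]      # list lengths per trial; the 3-item trial is pretraining
DISTRACTION_SECONDS = 3 * 60    # LEU Brain Stimulator played between acquisition and recall

def acquisition_seconds(n_items: int) -> int:
    """Shopping-list presentation time: 5 s per item (15 s for 3 items, ..., 45 s for 9)."""
    return 5 * n_items

for n in TRIAL_ITEMS:
    print(f"{n} items: acquisition {acquisition_seconds(n)} s, "
          f"distraction {DISTRACTION_SECONDS} s, then recall")
```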

In order to allow for repeated assessment using the vSST, two task variants of the shopping list were created for each difficulty level (variant A and variant B). Both variants were demonstrated to be comparable in terms of difficulty in the previous study (Plechatá, 2017).

The vSST makes it possible to evaluate three main variables: errors (omissions – missing items, and intrusions – additional items) committed while recalling individual items from the shopping list, time spent solving the task (recalling and picking up the item) and trajectory length (distance traveled in VE). For the purposes of this study, we report only the number of errors directly related to memory recall. Moreover, the movement control was different across the platforms (teleportation in HMD together with free real-world movements vs. walking using a keyboard in the desktop platform); therefore, platforms are not fully comparable in terms of trajectory traveled and solving time.
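
For clarity, the error definitions above can be expressed as simple set operations; the sketch below is illustrative only, and the function and item names are not taken from the vSST implementation.

```python
# Sketch of the vSST error definitions: omissions are list items that were not
# collected, intrusions are collected items that were not on the list.
def score_recall(shopping_list, collected_items):
    target, collected = set(shopping_list), set(collected_items)
    omissions = len(target - collected)
    intrusions = len(collected - target)
    return {"omissions": omissions, "intrusions": intrusions,
            "errors": omissions + intrusions}

# Hypothetical 5-item trial: one forgotten item and one extra item -> 2 errors
print(score_recall(["milk", "bread", "eggs", "apples", "tea"],
                   ["milk", "bread", "eggs", "apples", "coffee"]))
```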

Usability Questionnaire

For this study, we developed a 55-item usability questionnaire inspired by previous usability studies (Lewis, 1995; Kaufmann and Dünser, 2007). The questionnaire has four main parts, which are summarized in Table 2. Responses concerning user experience with the platforms and comparison of the platforms were recorded using a five-point Likert scale (ranging from “strongly disagree,” designated as 1, to “strongly agree,” designated as 5). In the analysis of the questionnaire, we worked with cumulative raw scores for each platform. The cumulative score was computed by summing the scores of 14 items. From the UQ II HMD and UQ II D, we extracted nine questions (three of these items were reverse-scored); five more questions were obtained from UQ III. Adverse effects and pleasantness of the platform were analyzed separately based on individual items of the questionnaire. For more information, please see the Supplementary Material.
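
As an illustration of this scoring scheme, the cumulative usability score for one platform could be computed as in the sketch below; which specific items are reverse-scored here is hypothetical (see the Supplementary Material for the actual item assignment).

```python
# Sketch of the cumulative usability score: 14 Likert items (1 = strongly disagree,
# 5 = strongly agree) per platform, with the reversed items recoded as 6 - rating.
def cumulative_usability(responses, reversed_items):
    """responses: dict mapping item id -> rating (1-5)."""
    return sum((6 - r) if item in reversed_items else r
               for item, r in responses.items())

# Hypothetical example: 14 items all rated 4, with items 3, 5 and 8 reverse-scored
answers = {i: 4 for i in range(1, 15)}
print(cumulative_usability(answers, reversed_items={3, 5, 8}))  # 11*4 + 3*2 = 50
```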

Table 2. Structure of usability questionnaire.

Materials

The experiment was conducted in the NIMH VR lab, a 7 m long × 5 m wide × 3.5 m high open space. An HTC VIVE was used as the HMD platform, with a display resolution of 1080 × 1200 pixels per eye. The motor activity of the participants was tracked using the HTC VIVE headset and controller. Movement in the VE was enabled by teleportation using the HTC VIVE controller trackpad and also by physically walking around the room (walking was limited by the room dimensions). The controller trigger was used for the selection of objects. For the desktop platform, a 24-inch monitor with a display resolution of 1920 × 1080 pixels was used. The participants controlled their movement and pick-up/drop actions using the keyboard arrows and a computer mouse.

Procedure

To compare platform usability and platform influence on measured performance, we used an intra-subject design with a counterbalanced order. The participants performed vSST in two conditions with different levels of immersion according to the platform applied: HMD and desktop. During the experiment, we counterbalanced both the order of the platforms (HMD/desktop) and the two vSST task variants (A/B – sets of the lists to remember) to minimize the practice effect on repeatedly measured performance.

After performing the vSST using the first platform selected according to the counterbalanced order (HMD/desktop, see Figure 2), the participants completed the first two parts of the Usability Questionnaire (UQ I and UQ II HMD/desktop). After performing the vSST using the second platform, participants completed the remaining two parts of the questionnaire (UQ II HMD/desktop and UQ III). Seniors completed a neurocognitive evaluation in a separate session prior to the experiment; young adults were assessed at the end of the experimental procedure.

Figure 2. The experimental design of the task. Panel (A) shows the scheme of the intra-subject design with the counterbalanced order of the VR platforms. Panels (B,C) show a respondent performing the vSST using the desktop (B) and HMD (C) platforms. The images were obtained with the participant’s consent. The participant signed an informed consent form regarding their publication.

Statistical Analysis

The statistical analysis was performed using IBM SPSS Statistics 19. Group differences in the standard cognitive assessment were analyzed with the Mann-Whitney U test. Differences in vSST performance and user experience in terms of platform, group, and platform order were examined for statistical significance using repeated-measures ANOVA followed by Tukey post hoc tests. Individual vSST errors and individual questions from the usability questionnaire were analyzed using the Wilcoxon signed-rank test.
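
For illustration, the non-parametric tests named above can be reproduced with SciPy; the sketch below runs them on synthetic placeholder data, not the study data.

```python
# Sketch of the non-parametric tests named above, on synthetic placeholder data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Between-group comparison (e.g., normative percentile scores): Mann-Whitney U test
seniors = rng.uniform(10, 90, 36)
young = rng.uniform(10, 90, 25)
u_stat, p_u = stats.mannwhitneyu(seniors, young, alternative="two-sided")

# Paired within-subject comparison (e.g., HMD vs. desktop errors): Wilcoxon signed-rank test
hmd = rng.integers(0, 15, 36)
desktop = rng.integers(0, 15, 36)
w_stat, p_w = stats.wilcoxon(hmd, desktop)

print(f"Mann-Whitney U = {u_stat:.1f} (p = {p_u:.3f}); "
      f"Wilcoxon W = {w_stat:.1f} (p = {p_w:.3f})")
```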

Results

Results of the Cognitive Evaluation

In order to compare the two tested groups in terms of cognitive functioning while controlling for the effect of age, the raw scores acquired from the standard neuropsychological methods were transformed to percentiles according to Czech normative data (Preiss et al., 2012) prior to the statistical analysis. We used the non-parametric Mann-Whitney U test to compare the two groups (seniors and young adults). The norm-referenced cognitive performance of the seniors in the RAVLT and TMT did not differ from that of the young adults. The evaluated variables and statistical data for the group comparison can be found in Table 3.
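
As a minimal illustration of this normative transformation, a raw score can be mapped to a percentile by interpolation over a normative lookup table; the anchor values below are invented placeholders, not the Czech norms from Preiss et al. (2012).

```python
# Sketch: convert a raw test score to a percentile via linear interpolation over a
# normative table. The table values are invented placeholders for illustration only.
import numpy as np

norm_raw = np.array([20, 30, 40, 50, 60])          # hypothetical raw-score anchors
norm_percentile = np.array([5, 25, 50, 75, 95])    # corresponding percentiles

def to_percentile(raw_score: float) -> float:
    return float(np.interp(raw_score, norm_raw, norm_percentile))

print(to_percentile(45))   # -> 62.5 with these placeholder norms
```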

Table 3. Results of the cognitive assessment.

The Virtual Supermarket Shopping Task Performance

In the vSST, we were mainly interested in the number of errors, as recall accuracy is the parameter most relevant for assessing memory abilities.

Cumulative vSST Errors

In the statistical comparison, we analyzed cumulative errors, i.e., the combined omission and intrusion errors made during the three analyzed levels of task difficulty (five, seven, and nine items on the list). We used a general linear model (GLM) with repeated-measures ANOVA, with platform as a within-subject factor and group and platform order as between-subject factors, to analyze vSST errors (see Figures 3, 4). The analysis revealed a main effect of platform: the difference between the mean HMD errors, 8.31 (SD = 5.21), and the mean desktop errors, 6.98 (SD = 4.88), was significant, F(1,57) = 7.474, p = 0.008. A significant main effect was also found for group, F(1,57) = 45.814, p < 0.001, with a mean of 20.5 errors (SD = 8.03) for seniors and 7.8 errors (SD = 5.02) for young adults. Furthermore, the GLM analysis revealed two interaction effects, platform*group, F(1,57) = 4.219, p = 0.045, and platform*order, F(1,57) = 6.091, p = 0.017.
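
Outside SPSS, a comparable (though simplified) analysis can be sketched with open-source tools; the example below assumes a hypothetical long-format data file and uses the pingouin package, modeling only the group (between-subject) and platform (within-subject) factors and using Bonferroni-corrected pairwise tests instead of the Tukey procedure reported above.

```python
# Sketch of a simplified mixed-design ANOVA on the vSST errors: group as the
# between-subject factor, platform as the within-subject factor. The file and
# column names are assumptions, and the platform-order factor is omitted here.
import pandas as pd
import pingouin as pg  # recent pingouin assumed (pairwise_tests was pairwise_ttests in older versions)

# long format: one row per participant x platform, e.g. columns id, group, platform, errors
df = pd.read_csv("vsst_errors_long.csv")

aov = pg.mixed_anova(data=df, dv="errors", within="platform",
                     subject="id", between="group")
print(aov)

# Follow-up pairwise comparisons for the platform*group interaction
posthoc = pg.pairwise_tests(data=df, dv="errors", within="platform",
                            subject="id", between="group", padjust="bonf")
print(posthoc)
```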

Figure 3. Boxplot for cumulative vSST errors (group/platform). The vSST errors are presented separately for specific age groups and according to the used platform. Boxplots represent the following information: the line is plotted at the median, the box extends from the 25th to 75th percentiles, the whiskers are drawn up/down to the 10th and 90th percentile, and points represent the outliers. The results of statistical analysis are visualized as follows: full line markers represent the group effect and group*platform interaction; significance levels are presented as ∗∗∗ p-value < 0.001; n.s., p-value > 0.05.

Figure 4. Boxplot for cumulative vSST errors (group/platform/order). The vSST errors are presented for specific age groups and according to the platform. The platform order is displayed by separate graphs. Boxplots represent the following information – the line is plotted at the median, the box extends from the 25th to 75th percentiles, the whiskers are drawn up/down to the 10th and 90th percentile, and points represent the outliers. The results of statistical analysis are visualized as follows: full line markers represent the platform *order interaction effect presented separately for each platform order; significance levels are presented as ∗∗∗ p-value < 0.001; n.s., p-value > 0.05.

The Tukey post hoc test was used to examine these interactions. It revealed a significant difference between the HMD errors (mean 11.43, SD = 4.23) and the desktop errors (mean 9.08, SD = 4.64) in seniors, p = 0.001. The performance of the group of young adults did not differ across the platforms (p = 0.998). Furthermore, the post hoc test showed a difference between HMD errors (mean 9.34, SD = 5.17) and desktop errors (mean 6.69, SD = 4.68) when HMD was performed second (platform*order), p < 0.001, whereas the vSST errors did not differ across the platforms when HMD was applied first (p = 0.997). No effect of platform order was found for the desktop platform.

vSST Errors in Individual Trials

Using the Wilcoxon signed rank test, we analyzed particular vSST errors in individual trials for each tested group to further investigate the variance between the platforms. After applying Bonferroni correction for repeated statistical tests, the difference between the two platforms was not significant in terms of individual vSST errors. Table 4 shows the specific values for each platform and group with appropriate statistics.
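
The correction step can be illustrated with the statsmodels multiple-testing helper; the p-values below are placeholders rather than those reported in Table 4.

```python
# Sketch of Bonferroni correction across the repeated Wilcoxon tests; p-values are placeholders.
from statsmodels.stats.multitest import multipletests

p_values = [0.021, 0.048, 0.110, 0.300, 0.007, 0.090]   # e.g., one per trial and group
reject, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="bonferroni")
for p, p_adj, rej in zip(p_values, p_adjusted, reject):
    print(f"p = {p:.3f} -> adjusted {p_adj:.3f} ({'significant' if rej else 'n.s.'})")
```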

Table 4. Number of errors in individual trials of vSST for each platform and group.

Usability Questionnaire

Cumulative Score

We applied a general linear model (GLM) with repeated-measures ANOVA, with platform as a within-subject factor and group and platform order as between-subject factors, to analyze the summary results for the usability of the individual platforms (for details, see Figure 5).

Figure 5. Boxplots of cumulative scores of the Usability questionnaire. Boxplots represent the following information – the line is plotted at the median, box extends from the 25th to 75th percentiles, the whiskers are drawn up/down to the 10th and 90th percentile, and points represent the outliers. The results of statistical analysis are visualized as follows: full line markers represent the group effect, dashed line markers represent group*platform interaction effects, significance levels are presented as ∗∗∗ p-value < 0.001; ∗∗ p-value < 0.01; n.s., p-value > 0.05.

The analysis revealed a main effect of group with the mean usability score 105.29 (SD = 11.71) for seniors and 114.64 (SD = 6.40) for young adults [F(1,56) = 10.986, p = 0.002]. Furthermore, the analysis revealed only one interaction effect for platform*group F(1,56) = 6.148, p = 0.016.

For further analysis of this interaction effect, we used the Tukey post hoc test, which revealed a significant difference (p < 0.001) between HMD scores in seniors (mean 50.49, SD = 11.29) and HMD scores in young adults (mean 59.72, SD = 5.86); the user experience with the desktop platform showed no group effect (p = 0.999). There was no significant difference between the platforms’ usability scores in either of the age groups.

Individual Questions

In addition to the cumulative scores calculated for individual platforms and groups, we analyzed the results for individual items from sections UQ II HMD and UQ II D. Because a Likert scale was used, we investigated the differences between the platforms with the non-parametric Wilcoxon signed-rank test. After Bonferroni correction for repeated statistical comparisons (α = 0.01), we observed a significant difference between the platforms only in the group of young adults. Specifically, the young adults preferred HMD (mean 4.2, SD = 1.11) over the desktop platform (mean 2.04, SD = 0.97), Z = −3.42, p < 0.001. The young adults also enjoyed HMD (mean 4.32, SD = 0.9) significantly more than the desktop (mean 2, SD = 0.81), Z = −3.98, p < 0.001. For details, see Table 5.

Table 5. Mean score of individual questions.

Side Effects

In the usability questionnaire sections UQ II HMD and UQ II D, we asked participants about the adverse effects of the specific platform. The participants were asked about unpleasant feelings connected with the task; if they reported the presence of unpleasant feelings, they were asked to specify them (Was the unpleasant feeling connected with experienced discomfort? Select one or more options from the list of possible adverse effects…). The incidence of the side effects, including their specific characteristics, is reported in Table 6. Importantly, the reported side effects were mild, and no participant asked to terminate their participation in the study.

Table 6. The incidence of reported side effects associated with VR experience.

Discussion

The main findings of the presented study are the significant age-related differences across the tested VR platforms (HMD vs. desktop) that were identified not only in terms of assessed performance but also in user experience. This age-related effect is not surprising as the addressed groups typically differ in experience with new technologies, of which HMD is an example.

Memory Recall

The study aimed to evaluate possible effects of the immersion level (desktop vs. HMD platform) on the ability to recall items from a presented shopping list (participant accuracy was expressed as the number of errors in the vSST task). According to our results, the seniors made significantly more errors when using the HMD platform than when using the desktop platform. The vSST recall performance of the young adults was stable regardless of the platform used. Our findings for the senior group are in accordance with some previous studies investigating navigation and spatial memory (Sousa Santos et al., 2009) that associated the desktop platform with superior performance. Similar findings were reported in a study by Mania and Chalmers (2001) that investigated the ability to recall information from a seminar presented in four conditions: a real-world environment, desktop, HMD, and audio-only. According to that study, memory performance was best in the real-world scenario and worst with the HMD platform. Moreover, memory recall was significantly better with the desktop platform than with HMD.

Other studies favor the HMD platform in terms of spatial memory recall (Ruddle et al., 1999; Bowman et al., 2009; Murcia-López and Steed, 2016). A possible explanation for such contradictory results is that the benefits of HMD, such as the active movement control and rotation controlled by head movements, are highlighted in studies that assess spatial navigation abilities. This potential of HMD might be overshadowed by different factors in non-spatial memory tasks.

We speculate that the presentation of the recall tasks in HMD can lead to perceptual or cognitive overload; the participants are present “inside” a virtual environment with possibly higher perceptual stimulation (Richards and Taylor, 2015). The possibility that higher immersion is a distracting factor while learning a task has been investigated. Despite the motivational potential of HMD, the higher immersion can distract participants from the studied material (Moreno and Mayer, 2004; Richards and Taylor, 2015; Parong and Mayer, 2018). Makransky et al. (2019) pointed out a possible effect of higher levels of cognitive load (measured by EEG) associated with more immersive technology. These findings may explain the inferior HMD performance observed in the seniors, considering the goal of the task (remembering a shopping list). The difference between the young adult and senior subjects in our study could thus be related to the lower ability of seniors to inhibit distracting information (Moreno and Mayer, 2004).

On the other hand, the higher stimulation and distraction of the HMD platform might in some way reflect its higher ecological validity in comparison to the desktop platform. For this reason, it would be beneficial to add an extra measure of ecological validity in future comparative studies.

Importantly, most of the mentioned studies did not investigate age-related differences. Such a comparison, in terms of acceptance of new technologies and memory assessment, is important, as memory decline is typical in older adults (Small, 2001). A comparison of the different platforms and two age groups (young adults ages 16–35; seniors ages 60–75) was conducted by Rand et al. (2005). The authors used the “Virtual Office” environment, which was developed to assess attention and memory performance (Rizzo et al., 2002). Based on the obtained results, the performance of both age groups was significantly lower when using the HMD platform. These findings are only partially in accordance with our results as the authors observed an inferior HMD performance also in young adults. This difference in the obtained results could be explained by technological progress in HMD devices in recent years.

Regardless of the observed effect of platform on memory task performance in seniors, the fact that the group of seniors performed worse than the group of young adults on both platforms supports the validity of the vSST for memory assessment. The validity of the task was also indicated in previous studies conducted on healthy young adults and patients with chronic schizophrenia (Plechatá, 2017; Plechatá et al., 2017).

By counterbalancing the order of the platforms and the task variants applied, we controlled for possible effects of fatigue and practice. A similar approach was applied in other studies (Ruddle et al., 1999; Sousa Santos et al., 2009). Additionally, in our study the platform order was included as a factor in the presented GLM analysis to control for its potentially confounding influence. We expected that previous experience with the task using the desktop platform would improve consecutive HMD performance. Surprisingly, when using the desktop platform first, the participants from both age groups made more errors using HMD than they did using the desktop platform. In contrast, if the HMD platform was presented first, the performance was comparable between the two platforms.

Several possible factors might have induced this interaction effect. We argue that the HMD performance might have been influenced by the fatigue of the subjects (due to the repeated measurement); the results would differ for the desktop platform, as most of the participants had previous experience with the desktop but not with the HMD platform. Higher sensitivity to fatigue in seniors (Eldadah, 2010) can also be associated with the perceptual overload of HMD, mentioned above, which can increase the difficulty of the task itself. Unfortunately, to our knowledge none of the previous studies analyzed the effect of the order in which the platforms were applied (Ruddle et al., 1999; Sousa Santos et al., 2009).

User Experience

According to the results of the usability questionnaire, the user experience with HMD or desktop platforms is not comparable across the different age groups. The seniors evaluated the HMD experience differently than the young adult subjects. In general, the young adults evaluated the experience with higher scores than the seniors did. However, in the cumulative score of the questionnaire, we found no significant preference for HMD or desktop platform in the young adult or senior participants. The fact that the young adults scored higher in the usability questionnaire than seniors did regardless of the platform may reflect a difference in their attitude toward the specific task or toward computer technology in general.

With respect to the individual categories evaluated in the usability questionnaire, the participants in our study favored neither the HMD nor the desktop platform in terms of input controls or intelligibility of the task. Nevertheless, the younger adults stated that they liked the HMD platform more than the desktop platform. Similarly, the younger participants enjoyed the experience of using HMD more than using the desktop platform. Our findings are in line with the results of previous studies that favored the HMD platform over desktop and screen platforms (Adamo-Villani and Wilbur, 2008; Sousa Santos et al., 2009) in cognitive assessments of young adults. The participants of these studies preferred HMD in general; they considered it more intuitive (Sousa Santos et al., 2009) and more fun (Adamo-Villani and Wilbur, 2008). As both evaluated factors are closely related to motivation, these results might also be supported by studies focusing on the potential of HMD for educational purposes, which show that more immersive technology increased motivation to study (Moreno and Mayer, 2004; Richards and Taylor, 2015; Parong and Mayer, 2018).

On the other hand, the user experience evaluated by the seniors in our study did not reflect these findings, as the seniors preferred neither HMD nor the desktop platform. Unfortunately, to our knowledge, the existing studies comparing the two platforms in cognitive assessments did not involve older adults. The only exception is the study by Rand et al. (2005), which did not investigate the platform-dependent difference in user experience. None of the seniors recruited in our study had previous experience with HMD and virtual reality games, while most of the seniors were experienced with computers. As was demonstrated previously, repeated exposure to immersive VR can lead to a decrease in its adverse effects (Taylor et al., 2011); it could therefore be expected to also lead to improvements in other aspects of the user experience. The role of repeated exposure, either to HMD or to the task itself, should be further studied in order to evaluate its potential for cognitive training and remediation.

Considering the adverse effects of immersive virtual reality, the incidence of typical side effects associated with HMD was very low among seniors. Moreover, no cybersickness symptoms were reported in the group of young adults. The higher acceptance of immersive VR in this study, without negative side effects, could be associated with the design and navigation system used in the task (a combination of teleportation and active movement).

Limitations

Despite our effort to control for other confounding factors (e.g., by a counterbalanced order of the platforms), we admit that the differences observed in the task performance could have been influenced by other variables.

In particular, the inferior HMD performance observed in the group of seniors could be associated with a small but potentially important difference in the experimental procedure. In contrast to the desktop condition, during the HMD condition the participant was instructed first to take off the HMD and then to sit at a nearby table and play the visuospatial game LEU (used as a distractor in both conditions). Thus, with the HMD platform, there was an additional specific distractor in the form of removing the HMD glasses. Moreover, the participants were standing during the HMD condition and sitting while using the desktop platform. The different motor involvement in the task and the different control systems could have influenced task performance. This effect could be even stronger in a group of seniors with lower visuospatial coordination abilities (Hoogendam et al., 2014). In future studies, this difference in the experimental setting could be eliminated by adding the distraction task directly into the VR application, thus not requiring participants to take off the HMD glasses during the procedure.

Although we investigated the role of immersion, we did not study the sense of presence, which is typically measured with questionnaires administered after performing the VR task (Slater et al., 1994). As the level of presence was not a key variable in this study, it was not investigated, mainly because of the already high time demands of the experimental procedure on individual participants. It could, however, be beneficial to study the difference in the sense of presence, especially in seniors, as it might explain the age-related variance in platform performance and user experience in more detail. It was previously shown that the sense of presence is typically higher when using more immersive technology (Slater, 2018). A recent study (Corriveau Lecavalier et al., 2018) showed that both young and older adults experience a comparable level of presence in an immersive VR environment. However, this study also reported a positive correlation between performance in a Virtual Shop task aimed at episodic memory and the reported sense of presence in seniors. These results do not explain the negative effect of higher immersion on the performance of seniors found in our study. This discrepancy should therefore be addressed in future studies.

Finally, despite the reasonable number of participants recruited in this study, the small number of subjects with limited or no PC experience made it impossible to evaluate the possible benefits of HMD technology in such participants, especially in the group of seniors. Future studies should investigate the role of ecological validity in terms of the VR immersion level and the behavioral outcomes of the participants.

Conclusion

In the present study, we investigated the age-related differences between HMD and desktop platforms in memory assessment using an intra-subject design. Groups of seniors and young adults performed a virtual Supermarket Shopping task aimed at episodic memory using HMD and desktop platforms in a counterbalanced order. We focused on the role of the level of immersion in task performance and usability. According to our results, the performance of the seniors was inferior with HMD in contrast to the desktop platform. The measured performance of the young adults was stable and comparable regardless of the platform used. In the context of the diagnostic application of VR tasks in seniors, our results indicate that it is necessary to create separate normative data for the task, dependent on the VR platform used for the assessment. Furthermore, performance with the HMD platform was more influenced by participant fatigue, as it was lower in both groups when HMD was performed as the second platform. In general, the seniors evaluated their user experience lower than the young adults did, regardless of the platform used. We did not find any significant platform-related differences in overall user experience in either of the tested groups. However, according to the data obtained from individual items of the questionnaire, the young adults tended to prefer HMD over the desktop platform.

Our results indicate that performing the task with HMD may be more difficult than with the desktop platform; this difficulty may be associated with perceptual overload in the senior subjects. It might also indicate the superior ecological validity of the task presented in HMD; this possibility should be studied further. The fact that the user experience did not differ across the platforms used and that only minimal side effects were reported indicates that highly immersive technology may be well accepted by aging adults. This may have implications for the further use of HMD in cognitive remediation, as has been proposed in previous studies (Gamito et al., 2014). We hypothesize that with repeated HMD experience, seniors will find it more motivating and intuitive to use than the desktop platform. However, in the context of the diagnostic use of VR in a single session, the benefits of higher immersion are questionable.

Ethics Statement

This study was carried out in accordance with the recommendations of “NIMH CZ Ethics Committee” with written informed consent from all subjects. All subjects gave written informed consent in accordance with the Declaration of Helsinki. The protocol was approved by the “NIMH CZ Ethics Committee.”

Author Contributions

AP was responsible for the design of the experiment and data collection. VS developed the virtual supermarket shopping task. DF was responsible for recruiting the participants. IF supervised the whole study and together with AP was responsible for writing the manuscript.

Funding

This study was funded by the Charles University grant agency project no. 1832218, with financial support from the European Regional Development Fund project “PharmaBrain” no. CZ.02.1.01/0.0/0.0/16_025/0007444 and Technology Agency of the Czech Republic project no. TL01000309.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

We would like to thank Aleš Bartoš and his team at the Department of Cognitive Disorders NIMH, who were responsible for creating the database of healthy senior participants that allowed us to recruit this group of volunteers. We thank Jan Šeliga for preparing the cumulative dataset, and the students who participated in recruiting and assessing the volunteers, mainly Filip Havlík, Markéta Slezáková, and Hana Šrámková. We also thank Dr. Tereza Nekovářová for her feedback on the study design.

Supplementary Material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpsyg.2019.01330/full#supplementary-material

Footnotes

  1. ^ https://unity3d.com/
  2. ^ http://www.leubrainstimulator.com/

References

Adamo-Villani, N., and Wilbur, R. B. (2008). “Effects of platform (immersive versus non-immersive) on usability and enjoyment of a virtual learning environment for deaf and hearing children,” in Posters Presented Eurographics Symposium on Virtual Environments, eds B. van Liere and B. Mohler (Genova: The Eurographics Association).

Bäckman, L., Small, B. J., and Fratiglioni, L. (2001). Stability of the preclinical episodic memory deficit in Alzheimer’s disease. Brain 124, 96–102. doi: 10.1093/brain/124.1.96

Boraxbekk, C. -J., Lundquist, A., Nordin, A., Nyberg, L., Nilsson, L.-G., and Adolfsson, R. (2015). Free recall episodic memory performance predicts dementia ten years prior to clinical diagnosis: findings from the betula longitudinal study. Dement. Geriatr. Cogn. Dis. Extra. 5, 191–202. doi: 10.1159/000381535

Bowman, D. A., Sowndararajan, A., Ragan, E. D., and Kopper, R. (2009). “Higher levels of immersion improve procedure memorization performance,” in Proceedings of the 15th Joint Virtual Reality Eurographics Conference on Virtual Environments (Genova: The Eurographics Association), 121–128.

Corriveau Lecavalier, N., Ouellet, É., Boller, B., and Belleville, S. (2018). Use of immersive virtual reality to assess episodic memory: a validation study in older adults. Neuropsychol. Rehabil. doi: 10.1080/09602011.2018.1477684 [Epub ahead of print].

Eldadah, B. A. (2010). Fatigue and fatigability in older adults. PM&R 2, 406–413. doi: 10.1016/j.pmrj.2010.03.022

Eurostat (2018). People in the EU - Statistics on an Ageing Society - Statistics Explained Available at: https://ec.europa.eu/eurostat/statistics-explained/index.php/People_in_the_EU_-_statistics_on_an_ageing_society (accessed October 29, 2018).

Franzen, M. D. (1997). “The Validity of Neuropsychological Assessment Procedures,” in Biological and Neuropsychological Mechanisms: Life-Span Developmental Psychology - Conference on Life Span Developmental Psychology, ed. H. W. Reese (Morgantown, WV: Psychology Press), 51–69.

Gamito, P., Oliveira, J., Santos, N., Pacheco, J., Morais, D., Saraiva, T., et al. (2014). Virtual exercises to promote cognitive recovery in stroke patients: the comparison between head mounted displays versus screen exposure methods. Int. J. Disabil. Hum. Dev. 13, 337–342. doi: 10.1515/ijdhd-2014-0325

Harada, C. N., Natelson Love, M. C., and Triebel, K. L. (2013). Normal cognitive aging. Clin. Geriatr. Med. 29, 737–752. doi: 10.1016/j.cger.2013.07.002

Hoogendam, Y. Y., van der Lijn, F., Vernooij, M. W., Hofman, A., Niessen, W. J., van der Lugt, A., et al. (2014). Older age relates to worsening of fine motor skills: a population-based study of middle-aged and elderly persons. Front. Aging Neurosci. 6:259. doi: 10.3389/fnagi.2014.00259

Kaufmann, H., and Dünser, A. (2007). “Summary of Usability Evaluations of an Educational Augmented Reality Application,” in Virtual Reality ICVR 2007. Lecture Notes in Computer Science ed. R. Shumaker (Berlin: Springer), 660–669. doi: 10.1007/978-3-540-73335-5_71

Lewis, J. R. (1995). IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int. J. Hum. Comput. Interact. 7, 57–78. doi: 10.1080/10447319509526110

Makransky, G., Terkildsen, T. S., and Mayer, R. E. (2019). Adding immersive virtual reality to a science lab simulation causes more presence but less learning. Learn. Instr. 60, 225–236. doi: 10.1016/J.LEARNINSTRUC.2017.12.007

Mania, K., and Chalmers, A. (2001). The effects of levels of immersion on memory and presence in virtual environments: a reality centered approach. CyberPsychol. Behav. 4, 247–264. doi: 10.1089/109493101300117938

Martínez-Arán, A., Vieta, E., Reinares, M., Colom, F., Torrent, C., Sánchez-Moreno, J., et al. (2004). Cognitive function across manic or hypomanic, depressed, and euthymic states in bipolar disorder. Am. J. Psychiatry 161, 262–270. doi: 10.1176/appi.ajp.161.2.262

Moreno, R., and Mayer, R. E. (2004). Personalized messages that promote science learning in virtual environments. J. Educ. Psychol. 96, 165–173. doi: 10.1037/0022-0663.96.1.165

Murcia-López, M., and Steed, A. (2016). The effect of environmental features, self-avatar, and immersion on object location memory in virtual environments. Front. ICT 3:24. doi: 10.3389/fict.2016.00024

Naqvi, R., Liberman, D., Rosenberg, J., Alston, J., and Straus, S. (2013). Preventing cognitive decline in healthy older adults. Can. Med. Assoc. J. 185, 881–885. doi: 10.1503/cmaj.121448

Neisser, U. (1978). “Memory: What are the important questions?,” in Practical Aspects of Memory, eds M. Gruneberg, P. Morris, and R. Sykes (London: Academic Press), 3–24.

O’Regan, J. K., and Noë, A. (2001). A sensorimotor account of vision and visual consciousness. Behav. Brain Sci. 24, 939–973. doi: 10.1017/S0140525X01000115

Parong, J., and Mayer, R. E. (2018). Learning science in immersive virtual reality. J. Educ. Psychol. 110, 785–797. doi: 10.1037/edu0000241

Parsons, T. D. (2015). Virtual reality for enhanced ecological validity and experimental control in the clinical, affective and social neurosciences. Front. Hum. Neurosci. 9:660. doi: 10.3389/fnhum.2015.00660

Pause, B. M., Zlomuzica, A., Kinugawa, K., Mariani, J., Pietrowsky, R., and Dere, E. (2013). Perspectives on episodic-like and episodic memory. Front. Behav. Neurosci. 7:33. doi: 10.3389/fnbeh.2013.00033

Plechatá, A. (2017). Feasibility of Using Virtual Reality for Remediation Of Memory Deficit in Schizophrenia Patients. Available at: https://dspace.cuni.cz/handle/20.500.11956/93118 (accessed October 31, 2018).

Plechatá, A., Fajnerová, I., Hejtmánek, L., and Sahula, V. (2017). “Development of a virtual supermarket shopping task for cognitive remediation of memory and executive functions in schizophrenia,” in Proceedings of the 2017 International Conference on Virtual Rehabilitation (ICVR) (Montreal, QC: IEEE).

Preiss, M. (1999). Pamět’ový Test Učení. [Auditory Verbal Learning test. Manual]. Brno: Psychodiagnostika.

Preiss, M., and Preiss, J. (2006). Test Cesty [Trail Making Test]. Bratislava: Psychodiagnostika.

Preiss, M., Rodriguez, M., and Laing, H. (2012). Neuropsychological Battery - Neuropsychologická Baterie Psychiatrického Centra Praha: Klinické Vyšetření Základních Kognitivních Funkcí, 3rd ed. Prague: Psychiatrické centrum.

Rand, D., Kizony, R., Feintuch, U., Katz, N., Josman, N., Rizzo, A., et al. (2005). Comparison of two VR platforms for rehabilitation: video capture versus HMD. Pres. Teleoperat. Virt. Environ. 14, 147–160. doi: 10.1162/1054746053967012

Reitan, R. M., and Wolfson, D. (1985). The Halstead-Reitan Neuropsychological Test Battery: Theory and Clinical Interpretation. Tucson Ariz: Neuropsychology Press.

Rey, A. (1964). L’examen Clinique en Psychologie. 2e éd. Paris: Presses universitaires de France.

Richards, D., and Taylor, M. (2015). A Comparison of learning gains when using a 2D simulation tool versus a 3D virtual world: An experiment to find the right representation involving the marginal value theorem. Comput. Educ. 86, 157–171. doi: 10.1016/j.compedu.2015.03.009

Rizzo, A. A., Bowerly, T., Buckwalter, J. G., Schultheis, M., Matheis, R., Shahabi, C., et al. (2002). “Virtual Environments for the Assessment of Attention and Memory Processes: The Virtual Classroom and Office,” in Proceedings of the International Conference on Disability, Virtual Reality and Associated Technology 2002 (ICDVRAT2000), Veszprém.

Rönnlund, M., Nyberg, L., Bäckman, L., and Nilsson, L. -G. (2005). Stability, growth, and decline in adult life span development of declarative memory: cross-sectional and longitudinal data from a population-based study. Psychol. Aging 20, 3–18. doi: 10.1037/0882-7974.20.1.3

Ruddle, R. A., Payne, S. J., and Jones, D. M. (1999). Navigating large-scale virtual environments: what differences occur between helmet-mounted and desk-top displays? Pres. Teleoper. Virt. Environ. 8, 157–168. doi: 10.1162/105474699566143

Slater, M. (2009). Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philos. Trans. R. Soc. Lond. B. Biol. Sci. 364, 3549–3557. doi: 10.1098/rstb.2009.0138

Slater, M. (2018). Immersion and the illusion of presence in virtual reality. Br. J. Psychol. 109, 431–433. doi: 10.1111/bjop.12305

Slater, M., Usoh, M., and Steed, A. (1994). Depth of presence in virtual environments. Pres. Teleoperat. Virt. Environ. 3, 130–144. doi: 10.1162/pres.1994.3.2.130

Small, S. A. (2001). Age-related memory decline. Arch. Neurol. 58, 360–364. doi: 10.1001/archneur.58.3.360

Sousa Santos, B., Dias, P., Pimentel, A., Baggerman, J.-W., Ferreira, C., Silva, S., et al. (2009). Head-mounted display versus desktop for 3D navigation in virtual reality: a user study. Multimed. Tools Appl. 41, 161–181. doi: 10.1007/s11042-008-0223-2

Sternberg, R. J., Sternberg, K., and Mio, J. S. (2012). Cognitive Psychology. 6th ed. Wadsworth: Cengage Learning.

Taylor, L. C., Harm, D. L., Kennedy, R. S., Reschke, M. F., and Loftin, R. B. (2011). “Cybersickness Following Repeated Exposure to DOME and HMD Virtual Environments,” in Proceedings of the 3rd International Symposium on Visual Image Safety Las Vegas, NV.

Tupper, D. E., and Cicerone, K. D. (1990). “Introduction to the Neuropsychology of Everyday Life,” in The Neuropsychology of Everyday Life: Assessment and Basic Competencies, eds D. E. Tupper and K. D. Cicerone (Boston, MA: Springer) 3–18. doi: 10.1007/978-1-4613-1503-2_1

United States Census Bureau (2018). Projected Age Groups and Sex Composition of the Population: Main Projections Series for the United States, 2017-2060 Available at: https://www.census.gov/data/tables/2017/demo/popproj/2017-summary-tables.html (accessed October 29, 2018).

Keywords: virtual reality, memory assessment, aging, immersion, neurocognitive methods

Citation: Plechatá A, Sahula V, Fayette D and Fajnerová I (2019) Age-Related Differences With Immersive and Non-immersive Virtual Reality in Memory Assessment. Front. Psychol. 10:1330. doi: 10.3389/fpsyg.2019.01330

Received: 31 October 2018; Accepted: 22 May 2019;
Published: 11 June 2019.

Edited by:

Massimo Bergamasco, Sant’Anna School of Advanced Studies, Italy

Reviewed by:

Pedro Gamito, Universidade Lusófona, Portugal
Pascual Gonzalez, University of Castilla La Mancha, Spain

Copyright © 2019 Plechatá, Sahula, Fayette and Fajnerová. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Adéla Plechatá, adela.plechata@nudz.cz; Iveta Fajnerová, iveta.fajnerova@nudz.cz

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.