Front Psychol. 2017; 8: 2116.
Published online 2017 Dec 5. https://doi.org/10.3389/fpsyg.2017.02116
PMCID: PMC5723428
PMID: 29259571

A Public Database of Immersive VR Videos with Corresponding Ratings of Arousal, Valence, and Correlations between Head Movements and Self Report Measures

Abstract

Virtual reality (VR) has been proposed as a methodological tool to study the basic science of psychology and other fields. One key advantage of VR is that sharing of virtual content can lead to more robust replication and representative sampling. A database of standardized content will help fulfill this vision. There are two objectives to this study. First, we seek to establish and allow public access to a database of immersive VR video clips that can act as a potential resource for studies on emotion induction using virtual reality. Second, given the large sample of participants needed to obtain reliable valence and arousal ratings for each video, we were able to explore the possible links between the head movements of observers and the emotions they feel while viewing immersive VR. To accomplish our goals, we sourced and tested 73 immersive VR clips, which participants rated on valence and arousal dimensions using self-assessment manikins. We also tracked participants' rotational head movements as they watched the clips, allowing us to correlate head movements and affect. Based on past research, we predicted relationships between the standard deviation of head yaw and valence and arousal ratings. Results showed that the stimuli varied reasonably well along the dimensions of valence and arousal, with a slight underrepresentation of clips that are of negative valence and highly arousing. The standard deviation of yaw correlated positively with valence, and a significant positive relationship was found between head pitch and arousal. The immersive VR clips tested are available online as Supplementary Material.

Keywords: virtual reality, database, immersive VR clips, head movement, affective ratings

Introduction

Blascovich et al. (2002) proposed the use of virtual reality (VR) as a methodological tool to study the basic science of psychology and other fields. Since then, there has been a steady increase in the number of studies that use VR as a tool (Schultheis and Rizzo, 2001; Fox et al., 2009). Some studies use VR to examine how humans respond to virtual social interactions (Dyck et al., 2008; Schroeder, 2012; Qu et al., 2014) or as a tool for exposure therapy (Difede and Hoffman, 2002; Klinger et al., 2005), while others employ VR to study phenomena that might otherwise be impossible to recreate or manipulate in real life (Slater et al., 2006; Peck et al., 2013). In recent years, the cost of a typical hardware setup has decreased dramatically, allowing researchers to implement compelling VR for less than the price of a typical laptop. One of the key advantages of VR for the study of social science is that sharing of virtual content allows “not only for cross-sectional replication but also for more representative sampling” (Blascovich et al., 2002). What is needed to fulfill this vision is a database of standardized content.

The immersive video (or immersive VR clip) is one powerful and realistic form of VR. It shows a photorealistic video of a scene that updates based on head orientation but is not otherwise interactive (Slater and Sanchez-Vives, 2016). When viewers watch an immersive VR clip, they see a 360° view from where the video was originally recorded; changes in head orientation are rendered accurately, but these videos typically do not allow for head translation. A video is recorded using multiple cameras and stitched together through software to form a total surround scene. In this sense, creating content for immersive video is fairly straightforward, and consequently there is a wealth of content publicly available on social media sites (Multisilta, 2014).

To accomplish the goal of a VR content database, we sourced and created a library of immersive VR clips that can act as a resource for scholars, paralleling the design used in prior studies on affective picture viewing (e.g., the International Affective Picture System, IAPS; Lang et al., 2008). The IAPS is a large set of photographs developed to provide emotional stimuli for psychological and behavioral studies on emotion and mood induction. Participants are shown photographs and asked to rate each on the dimensions of valence and arousal. While the IAPS and its acoustic counterpart, the International Affective Digitized Sounds (IADS; Bradley and Lang, 1999), are well-established and used extensively in emotion research, to our knowledge no comparable database of emotion-inducing immersive VR content exists. As such, we sought to establish a database of immersive VR clips for emotion induction based on the affective responses of participants.

Most VR systems allow a user a full 360° range of head rotation, such that the content updates based on the particular orientation of the head. In this sense, the so-called field of regard is higher in VR than in traditional media such as television, where the image does not change when viewers move their heads away from the screen. This often allows VR to trigger strong emotions in individuals (Riva et al., 2007; Parsons and Rizzo, 2008). However, few studies have examined the relationship between head movements in VR and emotions. Darwin (1965) discussed the idea of head postures representing emotional states: a happy person holds the head up high, whereas a sad person's head tends to hang low. More recent research has provided empirical evidence for these relationships (Schouwstra and Hoogstraten, 1995; Wallbott, 1998; Tracy and Matsumoto, 2008).

An early study that investigated the influence of body movements on presence in virtual environments found a significant positive association between head yaw and reported presence (Slater et al., 1998). In a study on head movements in VR, participants saw themselves in a virtual classroom and participated in a learning experience (Won et al., 2016). Results showed a relationship between lateral head rotations and anxiety, where the standard deviation of head yaw significantly correlated with the awareness and concern individuals had regarding other virtual people in the room. Livingstone and Palmer (2016) tasked vocalists with speaking and singing passages of varying emotions (e.g., happy, neutral, sad) and tracked their head movements using motion capture technology. Findings revealed a significant relationship between head pitch and emotions: participants raised their heads when vocalizing passages that conveyed happiness and excitement and lowered their heads for those of a sad nature. Understanding the link between head movements in VR and emotions may be key to the development and implementation of VR in the study and treatment of psychological disorders (Wiederhold and Wiederhold, 2005; Parsons et al., 2007).

There are two objectives to this study. First, we seek to establish and allow public access to a database of immersive VR clips that can act as a potential resource for studies on emotion induction using virtual reality. Second, given the large sample of participants needed to obtain reliable valence and arousal ratings for each video, we are in a unique position to explore the possible links between head movements and the emotions one feels while viewing immersive VR. To accomplish our goals, we sourced and tested 73 immersive VR clips, which participants rated on valence and arousal dimensions using self-assessment manikins. These clips are available online as Supplementary Material. We also tracked participants' rotational head movements as they watched the clips, allowing us to correlate the observers' head movements and affect. Based on past research (Won et al., 2016), we predicted significant relationships between the standard deviation of head yaw and valence and arousal ratings.

Methods

Participants

Participants were 95 undergraduates (56 female) between the ages of 18 and 24, recruited from a medium-sized West Coast university, who received course credit for their participation.

Stimulus and measures

The authors spent 6 months searching for immersive VR clips that they thought would effectively induce emotions. Sources included personal contacts and internet searches on websites such as YouTube, Vrideo, and Facebook. In total, more than 200 immersive VR clips were viewed and assessed. From this collection, 113 were shortlisted and subjected to further analysis. The experimenters evaluated the video clips, and a subsequent round of selection was conducted based on the criteria employed by Gross and Levenson (1995). First, the clips had to be relatively short, as longer clips may induce fatigue and nausea among participants. Second, the VR clips had to be understandable on their own without the need for further explanation; as such, clips that were sequels or part of an episodic series were excluded. Third, the VR clips should be likely to induce valence and arousal, the aim being a good spread of videos that vary across both dimensions. A final 73 immersive VR clips were selected for the study. They ranged from 29 to 668 s in length, with an average of 188 s per clip.

Participants viewed the immersive VR clips through an Oculus Rift CV1 (Oculus VR, Menlo Park, CA) head-mounted display (HMD). The Oculus Rift has a resolution of 2,160 × 1,200 pixels, a 110° field of view, and a refresh rate of 90 Hz. Its low-latency tracking technology determines the relative position of the viewer's head and adjusts the view of the immersive video accordingly. Participants interacted with on-screen prompts and rated the videos using an Oculus Rift remote. Vizard 5 software (WorldViz, San Francisco, CA) was used to program the rating system. The software ran on a 3.6 GHz Intel i7 computer with an Nvidia GTX 1080 graphics card. The experimental setup is shown in Figure 1.

Figure 1. The experimental setup depicting a participant (A) wearing an Oculus Rift HMD to view the immersive VR clips, and (B) holding an Oculus Rift remote to select his affective responses to his viewing experience.
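Because Vizard experiments are written in Python, the Rift's rotational tracking stream can be read directly from the rendered viewpoint. The snippet below is a minimal sketch of such a logger, not the authors' actual code; it assumes the HMD drives viz.MainView and that sampling near the 90 Hz refresh rate is adequate.

```python
import viz
import vizact

viz.go()  # start the Vizard renderer; the Rift HMD drives viz.MainView

head_log = []  # one (time_s, yaw, pitch, roll) row per sample, in degrees

def log_head():
    yaw, pitch, roll = viz.MainView.getEuler()  # Vizard returns [yaw, pitch, roll]
    head_log.append((viz.tick(), yaw, pitch, roll))

vizact.ontimer(1.0 / 90.0, log_head)  # sample at roughly the display refresh rate
```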

The Oculus Rift HMD features a magnetometer, gyroscope, and accelerometer, which combine to allow tracking of rotational head movement. The data were digitally captured and comprised the pitch, yaw, and roll of the head. These are standard terms for rotations around the respective axes and are measured in degrees. Pitch refers to movement of the head around the X-axis, similar to a nodding movement. Yaw represents movement of the head around the Y-axis, similar to turning the head side-to-side to indicate “no.” Roll refers to movement of the head around the Z-axis, similar to tilting the head from one shoulder to the other. These movements are presented in Figure 2. As discussed earlier, Won et al. (2016) found a relationship between lateral head rotations and anxiety. They showed that scanning behavior, defined as the standard deviation of head yaw, significantly correlated with the awareness and concern people had of virtual others. In this study, we similarly assessed how much participants moved their heads by calculating the standard deviations of the pitch, yaw, and roll of their head movements while they watched each clip and included these as our variables.

Figure 2. Depiction of the three angles of rotational movement: pitch, yaw, and roll.
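Given such a log for one participant watching one clip, the movement measures used here reduce to descriptive statistics over the sampled angles. A minimal sketch with NumPy, assuming head_log is the list recorded in the sketch above:

```python
import numpy as np

angles = np.asarray(head_log)[:, 1:]  # drop timestamps; columns: yaw, pitch, roll

mean_yaw, mean_pitch, mean_roll = angles.mean(axis=0)
sd_yaw, sd_pitch, sd_roll = angles.std(axis=0, ddof=1)  # sample standard deviations

# sd_yaw is the "scanning" measure of Won et al. (2016); averaging each measure
# across every viewer of a clip yields the clip-level values analyzed below.
```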

Participants made their ratings using the self-assessment manikin (SAM; Lang, 1980). SAM shows a series of graphical figures that range along the dimensions of valence and arousal. The expressions of these figures vary across a continuous scale. The SAM scale for valence shows a sad and unhappy figure at one end and a smiling and happy figure at the other. For arousal, the SAM scale depicts a calm and relaxed figure at one end and an excited and interested figure at the other. A 9-point rating scale is presented at the bottom of each SAM. Participants selected one of the options while wearing the HMD, using the Oculus Rift remote to scroll among the options. Studies have shown that SAM ratings of valence and arousal are similar to those obtained from the verbal semantic differential scale (Lang, 1980; Ito et al., 1998). The SAM figures are presented in Figure 3.

Figure 3. SAM figures used to measure valence and arousal ratings.
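The nine-point SAM response therefore reduces to a small selection loop driven by the remote's button events. The sketch below is illustrative only: the mapping of remote buttons to Vizard key constants and the record helper are hypothetical.

```python
import viz

rating = 5  # start at the midpoint of the 9-point SAM scale

def record(value):
    print('SAM rating:', value)  # stand-in for writing the rating to the data file

def on_key(key):
    global rating
    if key == viz.KEY_LEFT:        # scroll toward 1 (hypothetical button mapping)
        rating = max(1, rating - 1)
    elif key == viz.KEY_RIGHT:     # scroll toward 9
        rating = min(9, rating + 1)
    elif key == viz.KEY_RETURN:    # confirm the highlighted option
        record(rating)

viz.callback(viz.KEYDOWN_EVENT, on_key)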

Procedure

Pretests were conducted to find out how long participants could comfortably watch immersive videos before experiencing fatigue or simulator sickness. Results revealed that some participants encountered fatigue and/or nausea if they watched for more than 15 min without a break, while most were at ease with a duration of around 12 min. The 73 immersive VR clips were therefore divided into clusters of approximately 12 min each, resulting in a total of 19 groups of videos. Based on the judgment of the experimenters, no more than two clips of a particular valence (negative/positive) or arousal (low/high) were shown consecutively (Gross and Levenson, 1995). This was to prevent participants from becoming too absorbed in any particular affective state, which could influence their subsequent ratings. Each video clip was viewed by a minimum of 15 participants.
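The division into roughly 12-minute clusters can be illustrated with a simple greedy pass over the clip list. This sketch only illustrates the duration constraint; the actual grouping (and the rule limiting consecutive clips of the same valence/arousal category) was applied by experimenter judgment rather than code.

```python
TARGET_S = 12 * 60  # target viewing time per cluster, in seconds

def pack_clusters(clips, target=TARGET_S):
    """Greedily fill clusters; clips is a list of (clip_id, length_s) tuples."""
    clusters, current, total = [], [], 0
    for clip_id, length in clips:
        if current and total + length > target:
            clusters.append(current)  # close the cluster once it would overflow
            current, total = [], 0
        current.append(clip_id)
        total += length
    if current:
        clusters.append(current)
    return clusters

# e.g., pack_clusters([(1, 120), (2, 199), (3, 50), ...])
```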

When participants first arrived, they were briefed by the experimenter that the purpose of the study was to examine how people respond to immersive videos. Participants were told that they would be wearing an HMD to view the immersive videos and that they could stop participating at any time if they felt discomfort, nausea, or any form of simulator sickness. Participants were then presented with a printout of the SAM measures for valence and arousal and told that they would be rating the immersive videos on these dimensions. Participants were then introduced to the Oculus Rift remote and its operation in order to rate the immersive VR clips.

The specific procedure was as follows. Participants sat on a swivel chair, which allowed them to turn a full 360° if they wished. They first watched a test immersive VR clip and did a mock rating to get accustomed to the viewing and rating process. They then watched and rated a total of three groups of video clips, with each group comprising between two and four clips. A 5 s preparation screen was presented before each clip. After the clip was shown, participants were presented with the SAM scale for valence. Once participants selected the corresponding rating using the Oculus Rift remote, the SAM scale for arousal was presented and participants made their ratings. Following this, the 5 s preparation screen was presented again to get participants ready for the next clip. After watching one group of immersive VR clips, participants were given a short break of about 5 min before continuing with the next group. This was done to minimize the chances of fatigue or nausea by allowing participants to rest between groups of videos. With each group of videos lasting about 12 min, the entire rating process took around 40 min.

Results

Affective ratings

Figure 4 shows the plot of the immersive video clips (labeled by their ID numbers) based on mean ratings of valence and arousal. There is a varied distribution of video clips above the midpoint (5) of valence that vary across arousal ratings. However, despite our efforts to locate and shortlist immersive VR clips for the study, there appears to be an underrepresentation of clips that both induce negative valence and are highly arousing. Table 1 lists all the clips in the database, together with a short description, length, and the corresponding valence and arousal ratings.

Figure 4. Scatterplot of immersive video clips defined by mean arousal and valence ratings.

Table 1

Comprehensive list of all immersive VR clips in the database.

ID | Title | Description | Length (s) | Valence | Arousal
1 | Abandoned building | Daytime shot of an alley in between two abandoned buildings, with ambient music | 120 | 4.39 | 2.77
2 | A Mumbai summer | Tour of Mumbai, India; various shots of urban and suburban locations | 199 | 5.87 | 4.60
3 | Abandoned city | Virtual environment of a post-apocalyptic abandoned city | 50 | 3.33 | 3.33
4 | Alaska's melting glaciers | Educational clip about the effects of climate change on Alaska's glaciers | 234 | 4.73 | 3.33
5 | Chernobyl | Educational clip on the effects of the Chernobyl nuclear disaster on the town of Pripyat | 548 | 3.06 | 4.18
6 | Sadness elicitation—lake valley | Virtual environment of a desolate valley | 119 | 5.36 | 2.64
7 | Fukushima | Journalistic feature on the effects of the Fukushima nuclear crisis | 560 | 2.69 | 4.63
8 | Happyland | Short documentary on a Manila dumpsite where 40,000 people call home | 611 | 3.33 | 3.40
9 | Homeless veterans | Educational clip on the plight of US veterans who are homeless | 164 | 3.63 | 4.50
10 | New York 2121 | Virtual environment of a post-apocalyptic and abandoned New York City | 120 | 4.00 | 3.93
11 | North Korea | Journalistic clip on the everyday life of North Koreans | 640 | 6.00 | 3.50
12 | The fight to save threatened species | Various shots of animals threatened by extinction | 124 | 7.00 | 4.60
13 | The margins | Journalistic clip on illegal border crossing | 137 | 4.92 | 4.08
14 | War zone | Journalistic clip of a war-torn city | 183 | 2.53 | 3.82
15 | Inside a bee hive | The clip gives the viewer an up-close view of the inside of a bee hive together with a large number of bees | 43 | 3.69 | 3.94
16 | Solitary confinement | Short film on the emotional trauma of solitary confinement | 221 | 2.38 | 4.25
17 | Survive a bear attack | Short film on campers handling a bear attack | 90 | 5.22 | 5.00
18 | The displaced | Journalistic feature on three homeless children | 668 | 2.18 | 4.73
19 | The Nepal earthquake aftermath | Short film on the effects of an earthquake in Nepal | 240 | 2.73 | 3.80
20 | War knows no nation | Short film on war that mixes computer graphics with live-action footage | 448 | 4.93 | 6.07
21 | Zombie apocalypse horror | Action film where the viewer follows a group of soldiers defending against a zombie attack | 265 | 3.20 | 5.60
22 | Great ocean road | Aerial shots over various scenic locations in Australia | 118 | 7.77 | 3.92
23 | Instant caribbean vacation | Promotional video of a Caribbean cruise liner | 150 | 7.20 | 3.20
24 | Blyde canyon | Promotional video introducing the features and scenic views of a large canyon | 157 | 4.82 | 3.09
25 | The most beautiful place in the world | Various scenic shots of a man's travels | 186 | 6.65 | 4.94
26 | Getting licked by a cow in Ireland | Viewer gets a close-up experience with a cow | 65 | 7.07 | 3.21
27 | Seagulls | Various video clips of seagulls at a quiet seaside | 120 | 6.00 | 1.60
28 | Maldives beach and resort | Various clips filmed at a beach in the Maldives | 138 | 6.69 | 3.50
29 | Fallen Trees | Atmospheric clip of various fallen trees in the forest | 146 | 6.50 | 2.50
30 | Haleakala national park sunrise | Timelapse clip showing the sun rising over a forest | 37 | 6.72 | 3.39
31 | Ibiza wedding | Various shots of a couple's wedding party held at a beach resort | 310 | 7.38 | 3.38
32 | Malaekahana sunrise | Viewer sees the sun rising over the horizon at a beach | 120 | 6.57 | 1.57
33 | Pacific sunset half moon bay | Timelapse clip showing the sunset close to the sea | 134 | 6.19 | 1.81
34 | Raising ducklings | Viewer gets a close-up experience with ducklings | 203 | 6.00 | 2.63
35 | Redwoods: Walk Among Giants | Various landscape shots of tall trees in a forest | 120 | 5.79 | 2.00
36 | Rocky beach | Scenic shots of a beach | 93 | 6.00 | 2.31
37 | Sunset of oia-santorini | Timelapse clip of sunset over a Grecian town | 89 | 6.55 | 3.09
38 | Mountain stillness | Atmospheric shots of Canadian snowy mountains | 128 | 6.13 | 1.80
39 | Zip-lining in chattanooga | Viewer takes the perspective of a zip-liner in action | 127 | 4.79 | 4.57
40 | VR kittens | Various up-close shots of kittens | 101 | 6.07 | 4.00
41 | Fighter jet patrouile suisse | Viewer takes the perspective of a fighter jet pilot in command of an airplane | 120 | 6.55 | 4.73
42 | Cute kittens battle | Video clip showing four kittens playing with one another | 65 | 6.94 | 4.13
43 | Alice the first Swedish baby goes VR | Various shots of an infant in outdoor environments | 126 | 7.33 | 3.44
44 | Conquer the mega ramp | Viewer takes the perspective of an extreme sports participant and goes down a huge slope before leaping across a gap | 86 | 5.29 | 6.43
45 | Joy elicitation | Virtual environment of a field with flowers and butterflies | 119 | 5.63 | 2.00
46 | Explore the world with IM360 | Various shots of popular tourist destinations | 197 | 6.59 | 4.29
47 | Puppy Bowl XII | Viewers watch puppies compete in a mock football match | 192 | 7.44 | 4.75
48 | Holi festival of colors | Journalistic clip on a popular festival in India | 173 | 6.60 | 4.00
49 | India's first ever 360 Wedding Video | Various shots at an Indian wedding | 201 | 7.07 | 4.00
50 | Puppies host SourceFed for a day | Viewers get up close with some puppies | 80 | 7.47 | 5.35
51 | Resonance: a jump VR Video | An experimental film that follows the journeys of a violin player | 275 | 6.39 | 3.15
52 | Speed flying | Viewer follows a speed wing pilot as he glides past mountains | 154 | 6.75 | 7.42
53 | Tomorrowland 2014 | A highlights reel of the events at a popular music festival | 265 | 5.80 | 5.40
54 | As It Is | A trailer for a documentary on the history of the Grand Canyon | 154 | 7.00 | 4.67
55 | New York City Jump | Journalistic clip on the popular spots in New York City | 144 | 5.86 | 4.21
56 | Solar impulse assembles the mobile hangar | A timelapse clip on the setting up of a temporary airplane hangar | 129 | 5.80 | 3.80
57 | Les berges du center à Wasquehal | A promotional clip for a condominium, where viewers get to explore the features and interior | 87 | 5.75 | 3.25
58 | Spangler Lawn | A view of people spending an afternoon relaxing in a courtyard | 58 | 5.09 | 3.27
59 | Seeking Pluto's Frigid Heart | A journalistic clip on the features of the planet Pluto | 463 | 6.00 | 4.31
60 | Russian knights acrobatic rehearsals | Viewer takes the perspective of a fighter jet pilot involved in airshow rehearsals | 120 | 5.73 | 4.20
61 | Kodak SP360 Yacht | A view of a yacht out at sea | 29 | 6.10 | 5.10
62 | Mega Coaster | Viewer takes the perspective of an extreme sports participant leaping off a ramp | 117 | 6.17 | 7.17
63 | NASA: Encapsulation & Launch of OSIRIS Rex | Documentary film on the planning and execution of rocket launches | 285 | 6.36 | 5.93
64 | Surrounded by elephants | Viewer has an up-close experience with elephants in a field | 156 | 5.94 | 5.56
65 | Kidnapped | A short comedy where the viewer takes the perspective of a kidnap victim in a case of mistaken identity | 406 | 4.83 | 5.25
66 | Great Hammerhead Shark Encounter | Viewer gets an up-close experience with sharks in the sea | 134 | 6.17 | 6.67
67 | Canyon Swing | Viewer experiences swinging over an open canyon | 104 | 5.38 | 6.88
68 | Jailbreak 360 | Short action film depicting a jailbreak from various closed-circuit cameras and how the culprit was captured | 339 | 4.40 | 6.70
69 | Walk the tight rope | Viewer experiences walking a tightrope over a canyon | 151 | 6.46 | 6.91
70 | Tahiti Surf | Viewer experiences snorkeling and surfing on a Tahitian beach | 205 | 7.10 | 4.80
71 | Lion's Last Stand | Viewer gets an up-close experience with a tiger on a savanna | 40 | 5.88 | 5.25
72 | Relive Undertaker's Entrance | Viewer experiences a sports entertainment event at a packed stadium | 122 | 5.36 | 5.57
73 | Through Mowgli's Eyes | A short film where the viewer observes a conversation between an ape and a boy | 93 | 6.27 | 6.18

The immersive VR clips varied on arousal ratings (M = 4.20, SD = 1.39), ranging from a low of 1.57 to a high of 7.42. This compares favorably with arousal ratings on the IAPS, which range from 1.72 to 7.35 (Lang et al., 2008). The video clips also varied on valence ratings (M = 5.59, SD = 1.40), with a low of 2.18 and a high of 7.77. This compares reasonably well with valence ratings on the IAPS, which range from 1.31 to 8.34.
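Readers who download the database can reproduce these summary figures directly from Table 1. A minimal sketch, assuming a hypothetical CSV export of the table with columns ID, Title, Description, Length_s, Valence, and Arousal:

```python
import pandas as pd

db = pd.read_csv("vr_clip_database.csv")  # hypothetical export of Table 1

for dim in ("Valence", "Arousal"):
    print(dim, db[dim].agg(["mean", "std", "min", "max"]).round(2))
```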

Head movement data

Pearson's product-moment correlations between observers' head movement data and their affective ratings are presented in Table 2. Most scores appear to be normally distributed, as assessed by visual inspection of normal Q-Q plots (see Figure 5). Analyses showed that the average standard deviation of head yaw significantly predicted valence [F(1, 71) = 5.06, p = 0.03, r = 0.26, adjusted R² = 0.05], although the direction was in contrast to our hypothesis. There was no significant relationship between the standard deviation of head yaw and arousal [F(1, 71) = 2.02, p = 0.16, r = 0.17, adjusted R² = 0.01]. However, there was a significant relationship between average head pitch movement and arousal [F(1, 71) = 4.63, p = 0.04, r = 0.25, adjusted R² = 0.05]. Assumptions of the F-test for the significant relationships were met, with analyses showing homoscedasticity and normality of the residuals. Plots of the significant relationships are presented in Figures 6, 7.
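Each of these tests is a simple linear regression across the 73 clips. A sketch of the equivalent computation with statsmodels, assuming yaw_sd, pitch, valence, and arousal are length-73 arrays of clip-level means (these names are illustrative, not from the authors' code):

```python
import numpy as np
import statsmodels.api as sm

def simple_regression(x, y):
    """Regress y on x with an intercept; return F, p, Pearson r, adjusted R^2."""
    model = sm.OLS(y, sm.add_constant(x)).fit()
    r = np.corrcoef(x, y)[0, 1]
    return model.fvalue, model.f_pvalue, r, model.rsquared_adj

# Clip-level tests reported above, e.g. F(1, 71) for yaw variability -> valence:
print(simple_regression(yaw_sd, valence))  # expected ~ (5.06, 0.03, 0.26, 0.05)
print(simple_regression(pitch, arousal))   # expected ~ (4.63, 0.04, 0.25, 0.05)
```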

Table 2

Correlation matrix of head movement data and affective ratings (N = 73).

Measure | Valence | Arousal
Pitch (M = 4.23, SD = 5.86) | 0.02 | 0.25*
Pitch Standard Deviation (M = 13.90, SD = 3.64) | 0.16 | 0.10
Yaw (M = 3.67, SD = 20.25) | 0.00 | 0.08
Yaw Standard Deviation (M = 57.14, SD = 16.96) | 0.27* | −0.17
Roll (M = −1.11, SD = 1.65) | 0.20 | −0.19
Roll Standard Deviation (M = 7.32, SD = 3.47) | 0.08 | −0.02
*p < 0.05.
Figure 5. Normal Q-Q plots of all observed variables.

Figure 6. Plot illustrating the relationship between standard deviation of head yaw and valence ratings.

Figure 7. Plot illustrating the relationship between head pitch and arousal ratings.

Discussion

The first objective of the study was to establish and introduce a database of immersive video clips that can serve as a resource for emotion induction research through VR. We sourced and tested a total of 73 video clips. Results showed that the stimuli varied reasonably well along the dimensions of valence and arousal. However, there appears to be a lack of representation for videos that are of negative valence yet highly arousing. In the IAPS and IADS, stimuli that belong to this quadrant tend to represent themes that are gory or violent, such as an attack victim with a mutilated face, or a woman being held hostage with a knife to her throat. The majority of our videos are in the public domain and readily viewable on popular websites such as YouTube, which have strict policies on the types of content that can be uploaded. Hence, it is not surprising that stimuli of negative valence and high arousal were not captured in our selection of immersive videos. Regardless, the collection of video clips (available online as Supplementary Material) should serve as a good launching pad for researchers interested in examining the links between VR and emotion.

Although not a key factor of interest for this paper, we observed variance in the length of the video clips that was confounded with video content. Long video clips in our database tend to feature serious journalistic content (e.g., nuclear fallout, homeless veterans, dictatorship regimes) and naturally evoke negative valence. Length is a factor that distinguishes videos from photographs, the standard emotional stimuli. Hence, while we experienced difficulty sourcing long video clips of positive valence, future studies should examine the influence of video clip length on affective ratings.

The second objective sought to explore the relationship between observers' head movements and their emotions. We demonstrated a significant relationship between the amount of head yaw and valence ratings, which suggests that individuals who displayed greater side-to-side head movement gave higher ratings of pleasure. However, the positive relationship shown here contrasts with that presented by Won et al. (2016), who showed a significant relationship between the amount of head yaw and reported anxiety. It appears that content and context are important differentiating factors when it comes to the effects of head movements. Participants in the former study explored their virtual environment and may have felt anxious in the presence of other virtual people. In our study, participants simply viewed the content presented to them without the need for navigation. Although no significant relationship was present between the standard deviation of head yaw and arousal ratings, we found a correlation between head pitch and arousal, suggesting that people who tend to tilt their heads upward while watching immersive videos reported being more excited. This parallels research by Lhommet and Marsella (2015), who compiled data from various studies on head positions and emotional states and showed that tilting the head up corresponds to feelings of excitement such as surprise and fear. The links between head movement and emotion are important findings that deserve further investigation.

One thing of note is the small effect sizes shown in our study (adjusted R² = 0.05). While we tried our best to balance efficient data collection against participant fatigue, some participants may not have been used to watching VR clips at length and may have felt uncomfortable or distressed without overtly expressing it. This may have influenced their ratings for VR clips toward the end of the study session, which may explain the small effect sizes. Future studies can explore when participant fatigue is likely to take place and adjust the viewing duration accordingly to minimize the impact on participant ratings.

Self-perception theory posits that people infer their attitudes from their behavior (Bem, 1972). Future research can explore whether directing participants' heads in certain directions or movements leads to changes in their affect or attitudes. For example, imagine placing a participant in a virtual garden filled with colorful flowers and lush greenery. Since our study shows a positive link between the amount of head yaw and valence ratings, would participants tasked with keeping their gaze on a butterfly fluttering around them (thereby increasing the amount of head movement) report higher valence than those who see a stationary butterfly resting on a flower? Results from this and similar studies could aid in the development of virtual environments that assist patients undergoing technology-assisted therapy.

Our study examined the rotational head movements enacted by participants as they watched the video clips. Participants sat on a swivel chair, which allowed them to swing around for a full surround view of the immersive video. Future studies can incorporate translational head movements, that is, movements along the horizontal, lateral, and vertical (x-, y-, and z-) axes. This could be achieved by allowing participants to sit, stand, or walk freely, or by programming depth elements into the immersive videos, and then examining how participants' rotational and translational head movements correlate with their affect. Exploring the effects of these added degrees of freedom will contribute to a deeper understanding of the connection between head movements and emotions.

Ethics statement

This study was carried out in accordance with the recommendations of the Human Research Protection Program, Stanford University Administrative Panel on Human Subjects in Non-Medical Research with written informed consent from all subjects. All subjects gave written informed consent in accordance with the Declaration of Helsinki. The protocol was approved by the Stanford University Administrative Panel on Human Subjects in Non-Medical Research.

Author contributions

The authors worked as a team and made contributions throughout. BL and JB conceptualized and conducted the study. AP contributed to the sourcing and shortlisting of immersive VR clips and to revising the manuscript. WG and LW acted as domain consultants for the subject and contributed to the writing and revisions.

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Footnotes

Funding. This work is part of the RAINBOW-ENGAGE study supported by NIH Grant 1UH2AG052163-01.

Supplementary material

The Supplementary videos are available online at: http://vhil.stanford.edu/360-video-database/

References

  • Bem D. (1972). Self-perception theory, in Advances in Experimental Social Psychology, ed. Berkowitz L. (New York, NY: Academic Press), 1–62.
  • Blascovich J., Loomis J., Beall A. C., Swinth K. R., Hoyt C. L., Bailenson J. N. (2002). Immersive virtual environment technology as a methodological tool for social psychology. Psychol. Inq. 13, 103–124. doi: 10.1207/S15327965PLI1302_01
  • Bradley M. M., Lang P. J. (1999). International Affective Digitized Sounds (IADS): Stimuli, Instruction Manual and Affective Ratings. Technical Report B-2, The Center for Research in Psychophysiology, University of Florida, Gainesville, FL.
  • Darwin C. (1965). The Expression of the Emotions in Man and Animals. Chicago, IL: University of Chicago Press.
  • Difede J., Hoffman H. G. (2002). Virtual reality exposure therapy for World Trade Center post-traumatic stress disorder: a case report. Cyberpsychol. Behav. 5, 529–535. doi: 10.1089/109493102321018169
  • Dyck M., Winbeck M., Leiberg S., Chen Y., Gur R. C., Mathiak K. (2008). Recognition profile of emotions in natural and virtual faces. PLoS ONE 3:e3628. doi: 10.1371/journal.pone.0003628
  • Fox J., Arena D., Bailenson J. N. (2009). Virtual reality: a survival guide for the social scientist. J. Media Psychol. 21, 95–113. doi: 10.1027/1864-1105.21.3.95
  • Gross J. J., Levenson R. W. (1995). Emotion elicitation using films. Cogn. Emot. 9, 87–108. doi: 10.1080/02699939508408966
  • Ito T. A., Cacioppo J. T., Lang P. J. (1998). Eliciting affect using the International Affective Picture System: trajectories through evaluative space. Pers. Soc. Psychol. Bull. 24, 855–879. doi: 10.1177/0146167298248006
  • Klinger E., Bouchard S., Légeron P., Roy S., Lauer F., Chemin I., et al. (2005). Virtual reality therapy versus cognitive behavior therapy for social phobia: a preliminary controlled study. Cyberpsychol. Behav. 8, 76–88. doi: 10.1089/cpb.2005.8.76
  • Lang P. J. (1980). Behavioral treatment and bio-behavioral assessment: computer applications, in Technology in Mental Health Care Delivery Systems, eds Sidowski J. B., Johnson J. H., Williams T. A. (Norwood, NJ: Ablex), 119–137.
  • Lang P. J., Bradley M. M., Cuthbert B. N. (2008). International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual. Technical Report A-6, University of Florida, Gainesville, FL.
  • Lhommet M., Marsella S. C. (2015). Expressing emotion through posture and gesture, in The Oxford Handbook of Affective Computing, eds Calvo R., D'Mello S., Gratch J., Kappas A. (Oxford; New York, NY: Oxford University Press), 273–285.
  • Livingstone S. R., Palmer C. (2016). Head movements encode emotions during speech and song. Emotion 16:365. doi: 10.1037/emo0000106
  • Multisilta J. (2014). Mobile panoramic video applications for learning. Educ. Inform. Technol. 19, 655–666. doi: 10.1007/s10639-013-9282-8
  • Parsons T. D., Bowerly T., Buckwalter J. G., Rizzo A. A. (2007). A controlled clinical comparison of attention performance in children with ADHD in a virtual reality classroom compared to standard neuropsychological methods. Child Neuropsychol. 13, 363–381. doi: 10.1080/13825580600943473
  • Parsons T. D., Rizzo A. A. (2008). Affective outcomes of virtual reality exposure therapy for anxiety and specific phobias: a meta-analysis. J. Behav. Ther. Exp. Psychiatry 39, 250–261. doi: 10.1016/j.jbtep.2007.07.007
  • Peck T. C., Seinfeld S., Aglioti S. M., Slater M. (2013). Putting yourself in the skin of a black avatar reduces implicit racial bias. Conscious. Cogn. 22, 779–787. doi: 10.1016/j.concog.2013.04.016
  • Qu C., Brinkman W.-P., Ling Y., Wiggers P., Heynderickx I. (2014). Conversations with a virtual human: synthetic emotions and human responses. Comput. Hum. Behav. 34, 58–68. doi: 10.1016/j.chb.2014.01.033
  • Riva G., Mantovani F., Capideville C. S., Preziosa A., Morganti F., Villani D., et al. (2007). Affective interactions using virtual reality: the link between presence and emotions. Cyberpsychol. Behav. 10, 45–56. doi: 10.1089/cpb.2006.9993
  • Schouwstra S. J., Hoogstraten J. (1995). Head position and spinal position as determinants of perceived emotional state. Percept. Mot. Skills 81, 673–674. doi: 10.1177/003151259508100262
  • Schroeder R. (2012). The Social Life of Avatars: Presence and Interaction in Shared Virtual Environments. New York, NY: Springer Science & Business Media.
  • Schultheis M. T., Rizzo A. A. (2001). The application of virtual reality technology in rehabilitation. Rehabil. Psychol. 46:296. doi: 10.1037/0090-5550.46.3.296
  • Slater M., Antley A., Davison A., Swapp D., Guger C., Barker C., et al. (2006). A virtual reprise of the Stanley Milgram obedience experiments. PLoS ONE 1:e39. doi: 10.1371/journal.pone.0000039
  • Slater M., Steed A., McCarthy J., Maringelli F. (1998). The influence of body movement on subjective presence in virtual environments. Hum. Factors 40, 469–477. doi: 10.1518/001872098779591368
  • Slater M., Sanchez-Vives M. V. (2016). Enhancing our lives with immersive virtual reality. Front. Robot. AI 3:74. doi: 10.3389/frobt.2016.00074
  • Tracy J. L., Matsumoto D. (2008). The spontaneous expression of pride and shame: evidence for biologically innate nonverbal displays. Proc. Natl. Acad. Sci. U.S.A. 105, 11655–11660. doi: 10.1073/pnas.0802686105
  • Wallbott H. G. (1998). Bodily expression of emotion. Eur. J. Soc. Psychol. 28, 879–896. doi: 10.1002/(SICI)1099-0992(1998110)28:6<879::AID-EJSP901>3.0.CO;2-W
  • Wiederhold B. K., Wiederhold M. D. (2005). Virtual Reality Therapy for Anxiety Disorders: Advances in Evaluation and Treatment. Washington, DC: American Psychological Association.
  • Won A. S., Perone B., Friend M., Bailenson J. N. (2016). Identifying anxiety through tracked head movements in a virtual classroom. Cyberpsychol. Behav. Soc. Netw. 19, 380–387. doi: 10.1089/cyber.2015.0326
