Event Abstract

EEG & Eye-Tracking Changes With Expertise In A Multi-Vehicle Control Task

  • 1 Wright State University, Department of Psychology, United States
  • 2 Lockheed Martin (United States), Advanced Technology Laboratories (ATL), United States

Recent years have seen rapid growth in work domains requiring a high degree of interaction with new technologies, placing unprecedented demands on attentional and perceptual resources. One domain where this challenge has become particularly apparent is multi-vehicle command and control, in which a single operator must direct multiple vehicles while integrating numerous sources of information in a dynamic environment. Successful performance in this domain requires the operator to maintain a fine balance between two disparate types of tasks: persistent vigilance tasks, such as monitoring a live video feed, and immediate, time-sensitive response tasks, such as approving target authorization requests or responding to threats. Maintaining this balance requires successful allocation and control of multiple types of attention, including overt and covert attention, central-executive attention, and visual selective attention. Thus, to monitor and enhance operational performance, it is imperative to understand the time course and dynamics of multiple attentional processes and bottlenecks in the real world. Moreover, since operator experience and expertise play an essential role in determining how the various attentional systems are recruited and employed, it is essential to resolve how learning impacts these dynamic attentional interactions and how expertise emerges over the course of training and practice. We propose here a new experimental approach to gauge the dynamics of operator attentional states by combining covert psychophysiological measures with overt behavioral and eye-tracking measures in a multi-vehicle control task. The combination of overt and covert measurements allows us (1) to disentangle the contributions of various components of attention to performance and (2) to determine the exact time windows during which the observer's attentional states converged or diverged.
To achieve these goals we use the Air Force Research Laboratory's FUSION multi-vehicle control simulation environment, which requires the participant to operate a number of vehicles with flexible levels of automation. This environment allows researchers to study human interactions with adaptive tools during goal-oriented, complex scenarios. Participants are engaged in a computer-based command and control task implemented in the simulated environment and displayed across two monitors. The overall task is distributed across these two monitors: on one monitor, participants are instructed to direct the flight paths of six air vehicles within a map space to locate an undisclosed number of hidden targets, while on the other monitor they must simultaneously respond to a text-based chat communications window and manage the fuel levels of the air vehicles. Participants acknowledge any located target via the chat window, with the response time between locating and acknowledging the target serving as an overt performance measure. Notably, each of the two monitors flickers at a different rate so as to generate a distinct steady-state visual evoked potential (SSVEP), which can then be utilized to index which display the participants are covertly attending to. In parallel with the recording of the ongoing electroencephalogram (EEG), we record participants' eye movements throughout the task to provide a continuous, position-specific measure of overt attention. These two continuous measures allow us to quantify the attentional dynamics involved in multi-vehicle control in a complementary fashion: EEG with its high temporal resolution, and eye tracking with its high-resolution spatial information.
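The frequency-tagging logic behind the SSVEP index can be sketched as follows. The abstract does not specify the two flicker rates, so the 12 Hz and 15 Hz tags, the sampling rate, and the display labels below are hypothetical choices for illustration only:

```python
import numpy as np

# Minimal sketch of SSVEP frequency tagging. The flicker rates (12 Hz for
# display "A", 15 Hz for display "B") and the 256 Hz sampling rate are
# assumed values, not taken from the abstract.
FS = 256            # EEG sampling rate (Hz), assumed
F_A, F_B = 12.0, 15.0  # hypothetical tag frequencies for the two monitors

def ssvep_power(eeg, fs, freq):
    """Spectral power of `eeg` at the bin nearest `freq`, via the FFT."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return spectrum[np.argmin(np.abs(freqs - freq))]

def attended_display(eeg, fs=FS):
    """Index covert attention by comparing power at the two tag rates."""
    return "A" if ssvep_power(eeg, fs, F_A) > ssvep_power(eeg, fs, F_B) else "B"

# Simulated 4-s epoch in which the 12 Hz response dominates, mimicking
# covert attention to display "A"; noise stands in for background EEG.
t = np.arange(0, 4, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = (2.0 * np.sin(2 * np.pi * F_A * t)
       + 0.5 * np.sin(2 * np.pi * F_B * t)
       + rng.normal(0, 0.3, t.size))
print(attended_display(eeg))  # prints "A"
```

Because attention enhances the SSVEP driven by the attended stimulus, comparing power at the two tag frequencies over successive windows yields a time-resolved covert-attention index without requiring gaze shifts.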
Specifically, the pattern of the SSVEP response over a given time window can be used to index whether attention is directed to a given display during that period, while the eye-tracking data provide spatially precise information about overt gaze direction across the two monitors, giving us a measure of attentionally driven viewing behavior. Lastly, we also record a number of additional psychophysiological measures, such as heart rate and body posture, to assess overall vigilance and alertness throughout the task. To track the development of expertise in task performance and evaluate how learning impacts the current paradigm, participants perform the task over five separate training sessions, with the exact same procedure repeated in each session. As participants gain experience and develop expertise across sessions, we can assess the resulting changes in the neural and eye-tracking measures in order to approximate changes in performance over time.
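The overt side of the analysis, scoring gaze samples by display, can be sketched in the same spirit. The side-by-side geometry, the 1920 px split point, and the display labels below are illustrative assumptions; the abstract does not specify the monitor layout or tracker output format:

```python
import numpy as np

# Minimal sketch of scoring overt attention from eye-tracking samples,
# assuming a hypothetical side-by-side layout: gaze x-coordinates below
# 1920 px fall on the map monitor, at or above it on the chat/fuel
# monitor. All coordinates and labels are illustrative.
MONITOR_SPLIT_X = 1920  # assumed pixel boundary between the two displays

def dwell_fractions(gaze_x):
    """Fraction of valid gaze samples on each display (NaNs = blinks/dropouts)."""
    x = np.asarray(gaze_x, dtype=float)
    x = x[~np.isnan(x)]                      # discard blink/dropout samples
    on_map = float(np.mean(x < MONITOR_SPLIT_X))
    return {"map": on_map, "chat": 1.0 - on_map}

# Toy gaze trace: mostly on the map display, with one blink (NaN).
samples = [500, 800, 2500, 600, np.nan, 3000, 700, 900]
print(dwell_fractions(samples))
```

Computed per time window, such dwell fractions can be aligned with the SSVEP index to flag periods where overt gaze and covert attention converge or diverge.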

Acknowledgements

The present work is supported by the Air Force Research Laboratory as part of the Human Performance Sensing Call 002 BAA.

Keywords: attention, EEG, eye tracking, psychophysiology, multi-vehicle control, multi-vehicle tracking

Conference: 2nd International Neuroergonomics Conference, Philadelphia, PA, United States, 27 Jun - 29 Jun, 2018.

Presentation Type: Poster Presentation

Topic: Neuroergonomics

Citation: Harel A, Fox OM, Hansen N, Galego B, Pava M and Russell B (2019). EEG & Eye-Tracking Changes With Expertise In A Multi-Vehicle Control Task. Conference Abstract: 2nd International Neuroergonomics Conference. doi: 10.3389/conf.fnhum.2018.227.00055

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 02 Apr 2018; Published Online: 27 Sep 2019.

* Correspondence: Ms. Olivia M Fox, Wright State University, Department of Psychology, Dayton, Ohio, 45435, United States, fox.128@wright.edu