Investigating ocular and EEG parameters for pilot’s cognitive load estimation using flight simulators

Material type: Text. Publication details: Bangalore: Indian Institute of Science, 2023. Description: iii, 185p. : col. ill. ; e-thesis, 2.869 Mb. Subject(s): DDC classification:
  • 623.746 HEB
Dissertation note: PhD; 2023; Centre for Product Design and Manufacturing.
Holdings
Item type: Thesis
Current library: JRD Tata Memorial Library
Call number: 623.746 HEB
Status: Not for loan
Barcode: ET00349

Includes bibliographical references.


Summary: Present-day aircraft systems are becoming increasingly complex with the advent of new technologies. According to the International Civil Aviation Organization's safety report, failures of mechanical or other physical hardware have declined over the years thanks to highly reliable systems. The main causes of incidents or accidents are pilot related: pilot error arising from high workload, loss of situational awareness, delays in task execution, and lack of crew coordination, to name a few. Further, proposed next-generation cockpit solutions place greater emphasis on human autonomy. In general, they focus on concepts such as larger Multi-Function Displays (MFDs), integrated and interchangeable displays, and Pilot Vehicle Interface (PVI) solutions such as touch-based displays, 3-dimensional audio, gesture recognition, speech recognition, haptic flight controls, and seamless cockpit displays. It is therefore important that pilots' capabilities and limitations are considered while designing these display interfaces. We cannot assume that pilots will capture all the available information at all times for decision making. In other words, human sensory modalities such as vision, hearing, speech, and touch should be utilized optimally. Designers should account for the limitation that human errors are induced when mental tasks demand more resources than the operator can supply. Hence, while such advanced concepts are being designed, designers should in parallel develop technologies for validating these designs. Such evaluations should keep the pilot in the loop.

The scope of my research is to understand human factors in the aircraft cockpit and to identify methodologies for evaluating pilot-aircraft interfaces. My dissertation begins by identifying areas in which eye gaze tracking can be used to evaluate cockpit interfaces. In the first user study, I set up a non-intrusive SmartEye-based eye tracking system in CSIR-NAL's flight training simulator, and pilot-in-the-loop simulations were conducted for predefined test scenarios. The study concentrates on specific applications of eye gaze tracking, such as evaluating the usefulness of displays or symbology, monitoring the pilot's scan behavior, understanding pilot-display interactions, and estimating variations in the pilot's cognitive load. Results from this study show a correlation between the statistical inferences obtained from the ocular parameters and those obtained from the flight path deviations. Based on this understanding of the various applications of eye tracking, I develop a support system that can assist designers or evaluators in real time. Real-time displays of gaze fixations, the scan path of gaze movement, and a scatterplot of gaze locations are provided as feedback on the pilot's attention allocation.
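
The summary above mentions detecting gaze fixations and summarizing the pilot's attention allocation across cockpit displays in real time. As a rough, illustrative sketch only (not the thesis's actual SmartEye pipeline), the Python snippet below implements a standard dispersion-threshold (I-DT) fixation-detection pass over raw gaze samples and a simple dwell-time summary per area of interest (AOI). The sample rate, thresholds, AOI rectangles, and the synthetic gaze trace are all assumed values chosen for demonstration.

# Sketch: I-DT fixation detection plus per-AOI dwell summary.
# All numeric parameters below are illustrative assumptions, not thesis values.

from dataclasses import dataclass

SAMPLE_RATE_HZ = 60          # assumed gaze sample rate
DISPERSION_THRESH = 0.05     # max spread (normalized screen units) within one fixation
MIN_FIXATION_SAMPLES = 6     # about 100 ms at 60 Hz

@dataclass
class Fixation:
    start: int    # index of first sample in the fixation
    end: int      # index of last sample (inclusive)
    x: float      # centroid x of the fixation
    y: float      # centroid y of the fixation

def dispersion(xs, ys):
    """Dispersion of a window: (max x - min x) + (max y - min y)."""
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(gaze):
    """I-DT: grow a sample window while its dispersion stays under the threshold."""
    fixations, i, n = [], 0, len(gaze)
    while i + MIN_FIXATION_SAMPLES <= n:
        j = i + MIN_FIXATION_SAMPLES
        xs = [p[0] for p in gaze[i:j]]
        ys = [p[1] for p in gaze[i:j]]
        if dispersion(xs, ys) > DISPERSION_THRESH:
            i += 1                       # window too spread out: not a fixation start
            continue
        while j < n:                     # extend the window while it stays compact
            if dispersion(xs + [gaze[j][0]], ys + [gaze[j][1]]) > DISPERSION_THRESH:
                break
            xs.append(gaze[j][0]); ys.append(gaze[j][1]); j += 1
        fixations.append(Fixation(i, j - 1, sum(xs) / len(xs), sum(ys) / len(ys)))
        i = j
    return fixations

def dwell_by_aoi(fixations, aois):
    """Fraction of fixation time spent in each rectangular area of interest."""
    totals = {name: 0 for name in aois}
    for f in fixations:
        duration = f.end - f.start + 1
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= f.x <= x1 and y0 <= f.y <= y1:
                totals[name] += duration
    total = sum(totals.values()) or 1
    return {name: t / total for name, t in totals.items()}

if __name__ == "__main__":
    # Synthetic gaze trace: dwell near one display, a saccade, then dwell near another.
    gaze = [(0.30 + 0.001 * k, 0.40 + 0.001 * k) for k in range(30)]
    gaze += [(0.30 + 0.02 * k, 0.40 + 0.01 * k) for k in range(10)]   # saccade
    gaze += [(0.70 + 0.001 * k, 0.60 - 0.001 * k) for k in range(30)]
    # Hypothetical AOI rectangles in normalized screen coordinates.
    aois = {"PFD": (0.2, 0.3, 0.45, 0.55), "ND": (0.6, 0.5, 0.85, 0.75)}
    fixes = detect_fixations(gaze)
    print(f"{len(fixes)} fixations detected")
    print("dwell share by AOI:", dwell_by_aoi(fixes, aois))

In a real-time tool, the same fixation list could drive the scan-path and scatterplot overlays described in the summary, recomputed over a sliding window of recent gaze samples.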


