Gaze estimation in the wild : Models, datasets and usability

Material type: Book
Language: en
Publication details: Bangalore: Indian Institute of Science, 2022.
Description: xix, 229p. : col. ill. ; 29.1 cm x 20.5 cm ; e-Thesis (34.98 Mb)
Dissertation: PhD; 2022; Centre for Product Design and Manufacturing
DDC classification:
  • 621 MUR
Summary: Human eye gaze estimation research has numerous applications in diverse fields, from Human-Computer Interaction (HCI) to aviation. Non-intrusive video-oculography-based methods are classified into two categories based on their input image type. Infra-red illuminator-based approaches report higher accuracy and are available commercially. Appearance-based gaze estimation systems, on the other hand, do not require any dedicated hardware and rely on RGB camera images, and can hence be deployed at a larger scale. Existing appearance-based gaze estimation approaches are either less accurate or too bulky to be deployed for interaction purposes. I proposed two gaze estimation models based on attention and difference mechanisms. The proposed AGE-Net model achieved state-of-the-art accuracy on all publicly available datasets. Further, the proposed models have a smaller memory footprint and can be deployed for real-time interaction. Next, analysis of existing appearance-based gaze estimation datasets revealed that their head pose angles and illumination variations are limited. Further, no existing dataset or evaluation reports the precision of the gaze estimates produced by the models proposed so far. In this direction, I proposed PARKS-Gaze, a precision-focused gaze estimation dataset recorded in daily-life conditions that captures wider head pose and illumination variation than existing datasets such as MPIIGaze and GazeCapture. Even though IR-based systems are commercially available, their performance limitations have not been well studied, which hinders their deployment in a wide range of scenarios. To this end, a feasibility assessment of a commercially available IR-based wearable eye tracker was conducted in actual flying conditions. Experiments carried out in two fighter aircraft revealed that the eye tracker failed to provide consistent gaze estimates. Similar findings regarding detection rate were observed in studies conducted under laboratory conditions as well. I proposed two hypotheses about the failure modes of the commercial eye tracker in the flying environment and corroborated them with in-flight eye-tracking data. Finally, an indigenous helmet-mounted eye tracker was designed to further extend the gaze estimation capabilities. The proposed helmet-mounted eye tracker, together with a deep-learning-based gaze estimation method, achieved on-par accuracy, lower precision error and more consistent gaze estimates compared to a commercial eye tracker. This thesis concludes by presenting the applications developed using the proposed systems and their usability in the automotive, aviation and assistive technology domains.
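
The summary repeatedly distinguishes the accuracy of a gaze estimator from the precision of its estimates. As a minimal sketch of how these two quantities are commonly computed in eye-tracking work (not taken from the thesis; gaze is assumed here to be a 3D unit vector, and all function names are illustrative), accuracy can be taken as the mean angular error against ground truth, and precision as the RMS of sample-to-sample angular deviation while a single target is fixated:

    import numpy as np

    def angular_error_deg(pred, gt):
        # Angle in degrees between predicted and reference gaze vectors.
        pred = pred / np.linalg.norm(pred, axis=-1, keepdims=True)
        gt = gt / np.linalg.norm(gt, axis=-1, keepdims=True)
        cos_sim = np.clip(np.sum(pred * gt, axis=-1), -1.0, 1.0)
        return np.degrees(np.arccos(cos_sim))

    def accuracy_deg(pred, gt):
        # Accuracy: mean angular error over all samples (lower is better).
        return float(np.mean(angular_error_deg(pred, gt)))

    def precision_rms_s2s_deg(pred):
        # Precision: RMS of sample-to-sample angular deviation across
        # consecutive estimates of a single fixation (lower is better).
        s2s = angular_error_deg(pred[1:], pred[:-1])
        return float(np.sqrt(np.mean(s2s ** 2)))

Under these illustrative definitions, a model can score well on mean angular error yet still jitter between consecutive frames; that gap is what a precision-focused dataset and evaluation are meant to expose.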
Holdings:
  Item type: E-BOOKS
  Current library: JRD Tata Memorial Library
  Call number: 621 MUR
  Status: Available
  Barcode: ET00111

Includes bibliographical references and index.




