Defense: It's all in your eyes: Gaze tracking, synthesis, and redirection

Harsimran Kaur
Computer Science and Engineering Ph.D. Candidate
Location: Virtual Event
Advisor: Roberto Manduchi

Join us on Zoom: https://ucsc.zoom.us/j/95972493316?pwd=T20wOUlUUXQzVzNMUUVYN096cjVSQT09 / Passcode: 541662

Description: The human eye exhibits remarkable optical and mechanical characteristics that can be exploited to determine where a person is looking. While IR-based devices can closely model these attributes, webcam-based geometric methods for determining gaze often suffer from low accuracy due to their sensitivity to errors in the estimated physical parameters.

Over the past several years, a number of data-driven gaze tracking algorithms have been proposed and shown to outperform classic model-based methods in terms of gaze direction accuracy. These algorithms leverage recent developments in sophisticated CNN architectures, as well as the availability of large gaze datasets captured under various conditions. One shortcoming of black-box, end-to-end methods, though, is that unexpected behaviors are difficult to explain. In addition, there is always the risk that a system trained on one dataset may not perform well when tested on data from a different source (the "domain gap" problem).

In this work, we propose a novel method to embed eye geometry information in an end-to-end gaze estimation network by means of an "analytic layer". Our experimental results show that our system outperforms other state-of-the-art methods in cross-dataset evaluation, while producing competitive performance in within-dataset tests. In addition, the proposed system is able to extrapolate gaze angles beyond the range represented in the training data.
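To give a flavor of what an analytic, geometry-based mapping might look like (this is a minimal illustrative sketch, not the defended system's actual layer), one can compute a gaze direction analytically from estimated 3D eyeball and pupil positions and convert it to yaw/pitch angles. The coordinate convention (camera looking along -z) and the function name are assumptions for illustration only:

```python
import numpy as np

def analytic_gaze(eyeball_center, pupil_center):
    """Illustrative analytic mapping from eye geometry to gaze angles.

    The gaze ray is taken as the unit vector from the estimated 3D
    eyeball center through the pupil center, then converted to
    (yaw, pitch) in radians. Assumes the camera looks along -z.
    """
    g = np.asarray(pupil_center, dtype=float) - np.asarray(eyeball_center, dtype=float)
    g /= np.linalg.norm(g)          # normalize to a unit gaze direction
    yaw = np.arctan2(g[0], -g[2])   # horizontal angle (left/right)
    pitch = np.arcsin(g[1])         # vertical angle (up/down)
    return yaw, pitch

# An eye looking straight at the camera: pupil directly in front of
# the eyeball center along -z gives zero yaw and zero pitch.
yaw, pitch = analytic_gaze([0.0, 0.0, 0.0], [0.0, 0.0, -1.0])
```

Because such a mapping is differentiable, it can in principle be embedded as a layer inside a network trained end-to-end, which is the general idea the talk's "analytic layer" refers to.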