Research

pervasive gaze estimation

Gaze estimation is an active topic of research in several fields, most notably mobile and ubiquitous computing, computer vision, and robotics. Advances in head-mounted eye tracking and egocentric vision promise continuous sensing of visual behaviour in mobile everyday settings over days or even weeks. We have been working on advancing the state of the art in both remote and head-mounted gaze estimation for several years. For example, we have developed computer vision methods for appearance-based gaze estimation in the wild using large-scale datasets and learning-by-synthesis. We have further presented computational methods for head-mounted eye tracker self-calibration, for seamless gaze estimation across multiple hand-held and ambient displays, and for robust pupil detection and tracking under challenging real-world occlusion conditions.
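At its core, appearance-based gaze estimation treats gaze prediction as a regression problem from eye-image appearance to gaze direction. The sketch below illustrates this idea with a simple closed-form ridge regressor on synthetic data; it is a minimal illustration only, not our actual method, which uses convolutional networks trained on large-scale in-the-wild and synthesised data.

```python
import numpy as np

# Minimal sketch of appearance-based gaze estimation as regression:
# flattened grayscale eye patches -> (yaw, pitch) gaze angles.
# The data here is synthetic; patch size and noise levels are assumptions.

rng = np.random.default_rng(0)

def fit_ridge(X, Y, lam=1e-2):
    """Closed-form ridge regression: W = (X^T X + lam I)^-1 X^T Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

# Synthetic training set: 200 fake 15x9 eye patches with known gaze angles.
X_train = rng.normal(size=(200, 15 * 9))
W_true = rng.normal(size=(15 * 9, 2)) * 0.05
Y_train = X_train @ W_true + rng.normal(scale=0.01, size=(200, 2))

W = fit_ridge(X_train, Y_train)

# Predict gaze (yaw, pitch) for a new, unseen eye patch.
x_new = rng.normal(size=(1, 15 * 9))
gaze = x_new @ W
print(gaze.shape)  # (1, 2)
```

A linear model like this only works when appearance varies smoothly with gaze; handling unconstrained head pose and illumination is precisely what motivates learned, non-linear estimators.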

selected publications

Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency

Yusuke Sugano; Andreas Bulling

Proc. of the 28th ACM Symposium on User Interface Software and Technology (UIST 2015), pp. 363-372, 2015.

GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays

Christian Lander; Sven Gehring; Antonio Krüger; Sebastian Boring; Andreas Bulling

Proc. of the 28th ACM Symposium on User Interface Software and Technology (UIST 2015), pp. 395-404, 2015.

Appearance-Based Gaze Estimation in the Wild

Xucong Zhang; Yusuke Sugano; Mario Fritz; Andreas Bulling

Proc. of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2015), pp. 4511-4520, 2015.

Rendering of Eyes for Eye-Shape Registration and Gaze Estimation

Erroll Wood; Tadas Baltrusaitis; Xucong Zhang; Yusuke Sugano; Peter Robinson; Andreas Bulling

Proc. of the IEEE International Conference on Computer Vision (ICCV 2015), 2015.

Robust, real-time pupil tracking in highly off-axis images

Lech Świrski; Andreas Bulling; Neil Dodgson

Proc. of the 7th International Symposium on Eye Tracking Research and Applications (ETRA 2012), pp. 173-176, 2012.

visual behaviour modelling and analysis

User modelling is among the most fundamental problems in human-computer interaction and ubiquitous computing. We have shown that everyday activities, such as reading or common office activities, can be predicted in both stationary and mobile settings from eye movements alone. Eye movements are closely linked to human visual information processing and cognition, including perceptual learning, experience, and visual search. We have therefore further explored eye movement analysis as a promising approach towards the vision of cognition-aware computing: computing systems that sense and adapt to covert aspects of user state. The vast majority of previous work focused on short-term visual behaviour lasting only a few minutes. We have contributed methods for recognising high-level contextual cues, such as social interactions, as well as for discovering everyday activities from long-term visual behaviour.
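A typical first step in recognising activities from eye movements is segmenting the gaze signal into fixations and saccades, from which window-level features (saccade rate, fixation durations, and so on) are computed. The snippet below sketches a basic velocity-threshold (I-VT) detector; the sampling rate and threshold are illustrative assumptions, and our actual pipelines use considerably richer saccade, fixation, and blink features.

```python
import numpy as np

# Hedged sketch: velocity-threshold (I-VT) saccade detection, a common
# preprocessing step before eye-based activity recognition. Sampling
# rate (fs) and the 30 deg/s threshold are assumed example values.

def detect_saccades(gaze_xy, fs=100.0, vel_thresh=30.0):
    """Label each gaze sample as saccade (True) or fixation (False).

    gaze_xy: (N, 2) gaze positions in degrees of visual angle;
    fs: sampling rate in Hz; vel_thresh: velocity threshold in deg/s.
    """
    # Sample-to-sample angular velocity in deg/s.
    vel = np.linalg.norm(np.diff(gaze_xy, axis=0), axis=1) * fs
    # First sample has no preceding velocity; treat it as fixation.
    return np.concatenate([[False], vel > vel_thresh])

# Example: a fixation, a fast 5-degree jump (saccade), then a fixation.
gaze = np.array([[0.0, 0.0], [0.05, 0.0], [5.0, 0.0], [5.05, 0.0]])
labels = detect_saccades(gaze)
print(labels.tolist())  # [False, False, True, False]
```

Statistics over such labels in sliding windows (e.g. saccade rate, mean fixation duration) then serve as input to a classifier or, for unsupervised discovery, a topic model.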

selected publications

Discovery of Everyday Human Activities From Long-Term Visual Behaviour Using Topic Models

Julian Steil; Andreas Bulling

Proc. of the 2015 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015), pp. 75-85, 2015.

Cognition-Aware Computing

Andreas Bulling; Thorsten O. Zander

IEEE Pervasive Computing, 13 (3), pp. 80-83, 2014.

Multimodal Recognition of Reading Activity in Transit Using Body-Worn Sensors

Andreas Bulling; Jamie A. Ward; Hans Gellersen

ACM Transactions on Applied Perception, 9 (1), pp. 2:1-2:21, 2012.

Eye Movement Analysis for Activity Recognition Using Electrooculography

Andreas Bulling; Jamie A. Ward; Hans Gellersen; Gerhard Tröster

IEEE Transactions on Pattern Analysis and Machine Intelligence, 33 (4), pp. 741-753, 2011.

Recognition of Visual Memory Recall Processes Using Eye Movement Analysis

Andreas Bulling; Daniel Roggen

Proc. of the 13th International Conference on Ubiquitous Computing (UbiComp 2011), pp. 455-464, 2011.

everyday gaze-based human-computer interfaces

Despite considerable advances in eye tracking, previous work on eye-based human-computer interfaces has mainly explored use of the eyes in settings involving a single user, a single device, and WIMP-style interactions. This is despite the fact that the eyes are involved in nearly everything we do and thus potentially hold a wealth of valuable information for interactive systems. In this spirit, we have introduced smooth pursuit eye movements (the movements we perform when latching onto a moving object) as a novel gaze interaction technique for dynamic interfaces. We have demonstrated the use of pursuits for eye tracker calibration as well as for interaction with smart watches. Inspired by how visual attention mediates interactions between humans, we have further proposed social gaze as a new paradigm for designing user interfaces that react to visual attention. Another important research direction is the use of gaze for interaction in unconstrained everyday settings, in particular with the growing number of personal devices and ambient displays.
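The pursuit-based techniques above share a simple underlying mechanism: the system shows several moving targets and selects the one whose trajectory correlates best with the user's gaze over a short window, which works without prior calibration. The sketch below illustrates this with per-axis Pearson correlation; the window length, threshold, and matching rule are illustrative assumptions rather than the exact parameters of any of our systems.

```python
import numpy as np

# Sketch of pursuit-based target selection: correlate the gaze
# trajectory with each candidate target trajectory and pick the best
# match above a threshold. Threshold and min-over-axes rule are
# illustrative assumptions.

def pursuit_match(gaze, targets, thresh=0.8):
    """Return the index of the matched target, or None if no match.

    gaze: (N, 2) gaze samples; targets: list of (N, 2) trajectories.
    """
    best_i, best_r = None, thresh
    for i, traj in enumerate(targets):
        rx = np.corrcoef(gaze[:, 0], traj[:, 0])[0, 1]
        ry = np.corrcoef(gaze[:, 1], traj[:, 1])[0, 1]
        r = min(rx, ry)  # require both axes to correlate
        if r > best_r:
            best_i, best_r = i, r
    return best_i

# Two targets moving on a circle in opposite directions; noisy gaze
# follows target 0.
t = np.linspace(0, 2 * np.pi, 60)
target0 = np.column_stack([np.cos(t), np.sin(t)])
target1 = np.column_stack([np.cos(-t), np.sin(-t)])
rng = np.random.default_rng(1)
gaze = target0 + rng.normal(scale=0.05, size=target0.shape)
print(pursuit_match(gaze, [target0, target1]))  # 0
```

Because the match is made on relative motion rather than absolute gaze position, the same correlation principle also yields calibration-free interaction and can bootstrap eye tracker calibration itself.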

selected publications

Orbits: Gaze Interaction in Smart Watches using Moving Targets

Augusto Esteves; Eduardo Velloso; Andreas Bulling; Hans Gellersen

Proc. of the 28th ACM Symposium on User Interface Software and Technology (UIST 2015), pp. 457-466, 2015, (best paper award).

Eye Tracking for Public Displays in the Wild

Yanxia Zhang; Ming Ki Chong; Jörg Müller; Andreas Bulling; Hans Gellersen

Personal and Ubiquitous Computing, 19 (5), pp. 967-981, 2015.

The Royal Corgi: Exploring Social Gaze Interaction for Immersive Gameplay

Mélodie Vidal; Remi Bismuth; Andreas Bulling; Hans Gellersen

Proc. of the 33rd ACM SIGCHI Conference on Human Factors in Computing Systems (CHI 2015), pp. 115-124, 2015.

Pursuit Calibration: Making Gaze Calibration Less Tedious and More Flexible

Ken Pfeuffer; Mélodie Vidal; Jayson Turner; Andreas Bulling; Hans Gellersen

Proc. of the 26th ACM Symposium on User Interface Software and Technology (UIST 2013), pp. 261-270, 2013.

Pursuits: Spontaneous Interaction with Displays based on Smooth Pursuit Eye Movement and Moving Targets

Mélodie Vidal; Andreas Bulling; Hans Gellersen

Proc. of the 2013 ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2013), pp. 439-448, 2013.