research mission

Developing human-machine systems that fully exploit the information content of non-verbal human behaviour is challenging, particularly in unconstrained daily-life settings. Key challenges are 1) to develop sensing systems that robustly and accurately capture non-verbal human behaviour in these settings, 2) to develop computational methods for automatic analysis and modelling that cope with the variability and subtlety of human behaviour, and 3) to develop novel human-machine systems that exploit non-verbal behavioural cues to enable collaborative or even symbiotic interactions. To this end, our group works at the interface of ubiquitous computing, human-computer interaction, computer vision, and machine learning. We develop novel computational methods as well as ambient and on-body systems to sense non-verbal human behaviour. We focus specifically on visual and physical behaviour and on brain activity, as these modalities are the most promising for developing interfaces that offer human-like perceptual and interactive capabilities.

selected recent publications

ECCV’16: A 3D Morphable Eye Region Model for Gaze Estimation
UIST’16 (best paper honourable mention award): AggreGaze: Collective Estimation of Audience Attention on Public Displays
UbiComp’16: TextPursuits: Using Text for Pursuits-Based Interaction and Calibration on Public Displays
ETRA’16 (emerging investigator award): Learning an Appearance-Based Gaze Estimator from One Million Synthesised Images
ETRA’16: Labelled Pupils in the Wild: A Dataset for Studying Pupil Detection in Unconstrained Environments
ETRA’16: Gaussian Processes as an Alternative to Polynomial Gaze Estimation Functions
ETRA’16: 3D Gaze Estimation from 2D Pupil Positions on Monocular Head-Mounted Eye Trackers
ETRA’16: Prediction of Gaze Estimation Error for Error-Aware Gaze-Based Interfaces
IEEE Computer’16: Pervasive Attentive User Interfaces
CHI’16 (best paper honourable mention award): Spatio-Temporal Modeling and Prediction of Visual Attention in Graphical User Interfaces
CHI’16: SkullConduct: Biometric User Identification on Eyewear Computers Using Bone Conduction Through the Skull
UIST’15 (best paper award): Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets
UbiComp’15: Discovery of Everyday Human Activities From Long-Term Visual Behaviour Using Topic Models
ICCV’15: Rendering of Eyes for Eye-Shape Registration and Gaze Estimation
CVPR’15: Appearance-Based Gaze Estimation in the Wild