research mission

Developing human-computer interfaces that fully exploit the information content of non-verbal human behaviour is challenging, particularly in unconstrained daily-life settings. Key challenges are 1) to build sensing systems that robustly and accurately capture non-verbal human behaviour under ever-changing conditions, 2) to develop computational methods for automatic analysis and modelling that can cope with the large variability in human behaviour, and 3) to use the information extracted from such behaviour to create novel human-computer interfaces that are highly interactive, multimodal, and modelled after natural human-to-human interaction.

Our group works at the intersection of human-computer interaction, computer vision, wearable computing, and eye tracking. We develop novel sensing systems and computational methods to automatically analyse non-verbal human behaviour using ambient and on-body sensors. We focus specifically on visual and physical behaviour, as we believe these modalities are the most promising for developing interfaces that offer human-like perceptual and interaction capabilities. We study the systems and methods we develop in the context of specific application domains, most notably pervasive eye-based human-computer interfaces and computational behaviour analysis.

recent publications

CVPR’15: Prediction of Search Targets From Fixations in Open-world Settings
CVPR’15: Appearance-based Gaze Estimation in the Wild
CHI’15: The Royal Corgi: Exploring Social Gaze Interaction for Immersive Gameplay
CHI’15: Gaze+RST: Integrating Gaze and Multitouch for Remote Rotate-Scale-Translate Tasks
UbiComp’14: GazeHorizon: Enabling Passers-by to Interact with Public Displays by Gaze
UbiComp’14: SmudgeSafe: Geometric Image Transformations for Smudge-resistant User Authentication
ETRA’14: EyeTab: Model-based Gaze Estimation on Unmodified Tablet Computers
ETRA’14: Cross-Device Gaze-Supported Point-to-Point Content Transfer
ACM Computing Surveys’14: A Tutorial on Human Activity Recognition Using Body-worn Inertial Sensors
IEEE Pervasive Computing’14: Cognition-Aware Computing
AVI’14: Pupil-canthi-ratio: A Calibration-free Method for Tracking Horizontal Gaze Direction