Developing human-computer interfaces that fully exploit the information content available in non-verbal human behaviour is challenging, particularly in unconstrained daily life settings. Key challenges are 1) to develop sensing systems that robustly and accurately capture non-verbal human behaviour in ever-changing conditions, 2) to develop computational methods for automatic analysis and modelling that are able to cope with the large variability in human behaviour, and 3) to use the information extracted from such behaviour to develop novel human-computer interfaces that are highly interactive, multimodal and modelled after natural human-to-human interactions.
Our group works at the intersection of human-computer interaction, computer vision, ubiquitous computing, and machine learning. We develop novel sensing systems and computational methods to analyse non-verbal human behaviour automatically using ambient and on-body sensors. We focus on visual and physical behaviour because we believe these modalities hold the most promise for developing interfaces that offer human-like perceptual and interaction capabilities. We study the systems and methods that we develop in the context of specific application domains, most notably pervasive eye-based human-computer interfaces and computational behaviour analysis.
Selected recent publications
UIST’15: Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency
UIST’15: GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays
UIST’15: GravitySpot: Guiding Users in Front of Public Displays Using On-Screen Visual Cues
UIST’15: Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets
ACII’15: Emotion Recognition from Embedded Bodily Expressions and Speech During Dyadic Interactions
Personal and Ubiquitous Computing’15: Eye Tracking for Public Displays in the Wild
UbiComp’15: Discovery of Everyday Human Activities From Long-term Visual Behaviour Using Topic Models
UbiComp’15: Analyzing Visual Attention During Whole Body Interaction with Public Displays
UbiComp’15: Recognition of Curiosity Using Eye Movement Analysis
CVPR’15: Prediction of Search Targets From Fixations in Open-world Settings
CVPR’15: Appearance-Based Gaze Estimation in the Wild
CHI’15: The Royal Corgi: Exploring Social Gaze Interaction for Immersive Gameplay
CHI’15: Gaze+RST: Integrating Gaze and Multitouch for Remote Rotate-Scale-Translate Tasks