Members of the Perceptual User Interfaces Group

research mission

Developing human-machine systems that fully exploit the information content available in non-verbal human behaviour is challenging, particularly in unconstrained daily-life settings. Key challenges are 1) to develop sensing systems that robustly and accurately capture non-verbal human behaviour in these settings, 2) to develop computational methods for automatic analysis and modelling that cope with the variability and subtlety of human behaviour, and 3) to develop novel human-machine systems that exploit non-verbal behavioural cues to enable collaborative or even symbiotic interactions. To this end, our group works at the interface of ubiquitous computing, human-computer interaction, computer vision, and machine learning. We develop novel computational methods as well as ambient and on-body systems to sense non-verbal human behaviour. We focus specifically on visual and physical behaviour and on brain activity, as these modalities are the most promising for developing interfaces that offer human-like perceptual and interactive capabilities.

selected recent publications

IEEE Computer’16: Pervasive Attentive User Interfaces
CHI’16: Spatio-Temporal Modeling and Prediction of Visual Attention in Graphical User Interfaces
CHI’16: SkullConduct: Biometric User Identification on Eyewear Computers Using Bone Conduction Through the Skull
ICCV’15: Rendering of Eyes for Eye-Shape Registration and Gaze Estimation
UIST’15: Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency
UIST’15: GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays
UIST’15 (best paper award): Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets
ACII’15: Emotion Recognition from Embedded Bodily Expressions and Speech During Dyadic Interactions
Personal and Ubiquitous Computing’15: Eye Tracking for Public Displays in the Wild
UbiComp’15: Discovery of Everyday Human Activities From Long-term Visual Behaviour Using Topic Models
UbiComp’15: Analyzing Visual Attention During Whole Body Interaction with Public Displays
UbiComp’15: Recognition of Curiosity Using Eye Movement Analysis
CVPR’15: Prediction of Search Targets From Fixations in Open-world Settings
CVPR’15: Appearance-Based Gaze Estimation in the Wild