Home

research mission

Welcome to the Perceptual User Interfaces group! We work at the intersection of Ubiquitous Computing, Human-Computer Interaction, Computer Vision, and Machine Learning. We develop novel computational methods as well as ambient and on-body systems to sense, model, and analyse everyday non-verbal human behaviour. We focus specifically on visual and physical behaviour, as these modalities are among the most promising for developing next-generation human-computer interfaces that offer natural interactive capabilities.

Please see our research and publications pages as well as our YouTube channel for more information.

spotlight

UIST’17: Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery
UIST’17: EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays
CVPRW’17: It’s Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation
CVPR’17 (spotlight presentation): Gaze Embeddings for Zero-Shot Image Classification
ECCV’16: A 3D Morphable Eye Region Model for Gaze Estimation
UIST’16 (best paper honourable mention award): AggreGaze: Collective Estimation of Audience Attention on Public Displays
UbiComp’16: TextPursuits: Using Text for Pursuits-Based Interaction and Calibration on Public Displays
ETRA’16 (emerging investigator award): Learning an Appearance-Based Gaze Estimator from One Million Synthesised Images
CHI’16 (best paper honourable mention award): Spatio-Temporal Modeling and Prediction of Visual Attention in Graphical User Interfaces
CHI’16: SkullConduct: Biometric User Identification on Eyewear Computers Using Bone Conduction Through the Skull
UIST’15 (best paper award): Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets
UbiComp’15: Discovery of Everyday Human Activities From Long-Term Visual Behaviour Using Topic Models
ICCV’15: Rendering of Eyes for Eye-Shape Registration and Gaze Estimation
CVPR’15: Appearance-Based Gaze Estimation in the Wild