Home

research mission

Welcome to the Perceptual User Interfaces group! We work at the intersection of Ubiquitous Computing, Human-Computer Interaction, Computer Vision, and Machine Learning. We develop novel computational methods as well as ambient and on-body systems to sense, model, and analyse everyday non-verbal human behaviour. We focus specifically on gaze and physical behaviour, as these modalities are the most promising for developing next-generation human-computer interfaces with natural interactive capabilities.

Please see our research and publications pages, as well as our YouTube channel, for more information.

spotlight

CHI’18: Training Person-Specific Gaze Estimators from Interactions with Multiple Devices
CHI’18: Which one is me? Identifying Oneself on Public Displays
CHI’18: Understanding Face and Eye Visibility in Front-Facing Cameras of Smartphones used in the Wild
IUI’18: Detecting Low Rapport During Natural Interactions in Small Groups from Non-Verbal Behavior
IEEE TPAMI’17: MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation
MUM’17 (best paper honourable mention award): They are all after you: Investigating the Viability of a Threat Model that involves Multiple Shoulder Surfers
PACM IMWUT’17: InvisibleEye: Mobile Eye Tracking Using Multiple Low-Resolution Cameras and Learning-Based Gaze Estimation
PACM IMWUT’17: EyePACT: Eye-Based Parallax Correction on Touch-Enabled Interactive Displays
UIST’17 (best paper honourable mention award): Everyday Eye Contact Detection Using Unsupervised Gaze Target Discovery
UIST’17: EyeScout: Active Eye Tracking for Position and Movement Independent Gaze Interaction with Large Public Displays
CVPRW’17: It’s Written All Over Your Face: Full-Face Appearance-Based Gaze Estimation
CVPR’17 (spotlight presentation): Gaze Embeddings for Zero-Shot Image Classification
ECCV’16: A 3D Morphable Eye Region Model for Gaze Estimation
UIST’16 (best paper honourable mention award): AggreGaze: Collective Estimation of Audience Attention on Public Displays
UbiComp’16: TextPursuits: Using Text for Pursuits-Based Interaction and Calibration on Public Displays
ETRA’16 (emerging investigator award): Learning an appearance-based gaze estimator from one million synthesised images
CHI’16 (best paper honourable mention award): Spatio-Temporal Modeling and Prediction of Visual Attention in Graphical User Interfaces
CHI’16: SkullConduct: Biometric User Identification on Eyewear Computers Using Bone Conduction Through the Skull
UIST’15 (best paper award): Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets
UbiComp’15: Discovery of Everyday Human Activities From Long-Term Visual Behaviour Using Topic Models
ICCV’15: Rendering of Eyes for Eye-Shape Registration and Gaze Estimation
CVPR’15: Appearance-Based Gaze Estimation in the Wild