Members of the Perceptual User Interfaces Group

research mission

With an increasing number of computing systems competing for it simultaneously, attention has become a scarce and valuable resource that human-computer interfaces need to manage carefully. Attentive user interfaces aim to sense, analyse, and adapt to users' current attentional focus and capacity. The first generation of these interfaces was limited to short-term measurements of individuals in controlled settings. Pervasive attentive user interfaces promise unobtrusive, accurate, and robust assessment as well as continuous monitoring of the attention allocation of large numbers of people over long periods of time in daily life.

Our group works at the intersection of human-computer interaction, computer vision, ubiquitous computing, and machine learning. Our first line of research is pervasive gaze estimation, i.e. the development of computational methods to unobtrusively, robustly, and accurately estimate visual attention in daily-life settings. This includes computer vision methods for ambient sensing in smart environments as well as methods for user-centric sensing with body-worn devices, such as head-mounted eye trackers or egocentric cameras. A second line of work focuses on visual behaviour modelling and analysis, i.e. the development of signal processing and machine learning methods to process, model, and analyse everyday visual behaviour. A third line of work addresses the question of how information about visual behaviour and attention can be used in everyday gaze-based interfaces.

selected recent publications

ACM Computing Surveys’15: The Feet in HCI: A Survey of Foot-Based Interaction
ICCV’15: Rendering of Eyes for Eye-Shape Registration and Gaze Estimation
UIST’15: Self-Calibrating Head-Mounted Eye Trackers Using Egocentric Visual Saliency
UIST’15: GazeProjector: Accurate Gaze Estimation and Seamless Gaze Interaction Across Multiple Displays
UIST’15: GravitySpot: Guiding Users in Front of Public Displays Using On-Screen Visual Cues
UIST’15 (best paper award): Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets
ACII’15: Emotion Recognition from Embedded Bodily Expressions and Speech During Dyadic Interactions
Personal and Ubiquitous Computing’15: Eye Tracking for Public Displays in the Wild
UbiComp’15: Discovery of Everyday Human Activities From Long-term Visual Behaviour Using Topic Models
UbiComp’15: Analyzing Visual Attention During Whole Body Interaction with Public Displays
UbiComp’15: Recognition of Curiosity Using Eye Movement Analysis
CVPR’15: Prediction of Search Targets From Fixations in Open-world Settings
CVPR’15: Appearance-Based Gaze Estimation in the Wild
CHI’15: The Royal Corgi: Exploring Social Gaze Interaction for Immersive Gameplay
CHI’15: Gaze+RST: Integrating Gaze and Multitouch for Remote Rotate-Scale-Translate Tasks