1 poster, 1 demo, and 3 workshop papers at UbiComp 2015
We will present the following poster, demo, and workshop papers at the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015):
Sabrina Hoppe, Tobias Loetscher, Stephanie Morey, Andreas Bulling. **Recognition of Curiosity Using Eye Movement Analysis.** In Adj. Proc. of the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015), pp. 185-188, 2015. doi: [10.1145/2800835.2800910](https://doi.org/10.1145/2800835.2800910). [PDF](https://perceptual.mpi-inf.mpg.de/files/2015/07/Hoppe_Ubicomp15.pdf) | [press coverage (in-mind.org, in German)](http://de.in-mind.org/blog/post/das-fenster-zum-gehirn-was-computer-in-unseren-blicken-lesen)

*Abstract:* Among the different personality traits that guide our behaviour, curiosity is particularly interesting for context-aware assistive systems as it is closely linked to our well-being and the way we learn. This work proposes eye movement analysis for automatic recognition of different levels of curiosity. We present a 26-participant gaze dataset recorded during a real-world shopping task with empirically validated curiosity questionnaires as ground truth. Using a support vector machine classifier and a leave-one-person-out evaluation scheme we can discriminate between two to four classes of standard curiosity scales well above chance. These results are promising and point towards a new class of context-aware systems that take the user's curiosity into account, thereby enabling new types of interaction and user adaptation.
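For readers who want to try this kind of analysis themselves, the sketch below illustrates the evaluation scheme named in the abstract: an SVM classifier scored with leave-one-person-out cross-validation. The feature matrix, labels, window counts, and SVM parameters are placeholder assumptions for illustration, not the authors' actual pipeline or data.

```python
# Minimal sketch of leave-one-person-out SVM evaluation.
# All data below is random placeholder data, not the study's dataset.
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_participants, windows_per_person, n_features = 26, 10, 12

X = rng.normal(size=(n_participants * windows_per_person, n_features))  # gaze features per window (placeholder)
y = rng.integers(0, 2, size=len(X))                                     # binarised curiosity label (placeholder)
groups = np.repeat(np.arange(n_participants), windows_per_person)       # participant ID per window

# Each fold holds out every window of one participant, so the classifier
# is always tested on a person it never saw during training.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"mean accuracy across held-out participants: {scores.mean():.2f}")
```

The key point of the scheme is that every fold tests on an unseen participant, which is what makes above-chance accuracy a person-independent result.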
Mohamed Khamis, Andreas Bulling, Florian Alt. **Tackling Challenges of Interactive Public Displays using Gaze.** In Adj. Proc. of the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015), pp. 763-766, 2015. Presented at the 2nd Workshop on Challenges and Opportunities in Creating Applications for Pervasive Public Display Networks (PD-Apps 2015). doi: [10.1145/2800835.2807951](https://doi.org/10.1145/2800835.2807951). [PDF](https://perceptual.mpi-inf.mpg.de/files/2015/07/Khamis15_pdapps.pdf)

*Abstract:* Falling hardware prices led to a widespread use of public displays. Common interaction techniques for such displays currently include touch, mid-air, or smartphone-based interaction. While these techniques are well understood from a technical perspective, several remaining challenges hinder the uptake of interactive displays among passersby. In this paper we propose addressing major public display challenges through gaze as a novel interaction modality. We discuss why gaze-based interaction can tackle these challenges effectively and discuss how solutions can be technically realized. Furthermore, we summarize state-of-the-art eye tracking techniques that show particular promise in the area of public displays.
Mohamed Khamis, Florian Alt, Andreas Bulling. **A Field Study on Spontaneous Gaze-based Interaction with a Public Display using Pursuits.** In Adj. Proc. of the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015), pp. 865-874, 2015. doi: [10.1145/2800835.2804335](https://doi.org/10.1145/2800835.2804335). [PDF](https://perceptual.mpi-inf.mpg.de/files/2015/07/Khamis15_Ubicomp.pdf)

*Abstract:* Smooth pursuit eye movements were recently introduced as a promising technique for calibration-free and thus spontaneous and natural gaze interaction. While pursuits have been evaluated in controlled laboratory studies, the technique has not yet been evaluated with respect to usability in the wild. We report on a field study in which we deployed a game on a public display where participants used pursuits to select fish moving in linear and circular trajectories at different speeds. The study ran for two days in a busy computer lab, resulting in a total of 56 interactions. Results from our study show that linear trajectories are statistically faster to select via pursuits than circular trajectories. We also found that Pursuits is well perceived by users, who find it fast and responsive.
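Pursuit-based selection of this kind is commonly implemented by correlating the recorded gaze trajectory with each on-screen target's trajectory over a sliding window and selecting the target whose correlation exceeds a threshold. The sketch below illustrates that idea; the window length, threshold, trajectories, and function names are illustrative assumptions, not the values used in the deployed game.

```python
# Minimal sketch of correlation-based pursuit selection.
# Window length, threshold, and trajectories are illustrative assumptions.
import numpy as np

def pursuit_score(gaze_xy, target_xy):
    """Mean Pearson correlation of the x and y components of the gaze
    trajectory against one target's trajectory over the same window."""
    rx = np.corrcoef(gaze_xy[:, 0], target_xy[:, 0])[0, 1]
    ry = np.corrcoef(gaze_xy[:, 1], target_xy[:, 1])[0, 1]
    return (rx + ry) / 2.0

def select_target(gaze_xy, targets_xy, threshold=0.8):
    """Return the index of the followed target, or None if no target's
    trajectory correlates with the gaze above the threshold."""
    scores = [pursuit_score(gaze_xy, t) for t in targets_xy]
    best = int(np.argmax(scores))
    return best if scores[best] >= threshold else None

# Example: one fish moves linearly, one circularly; the gaze follows the
# linear fish with some tracking noise over a 30-sample window.
t = np.linspace(0.0, 1.0, 30)
linear = np.stack([t, 0.5 * t], axis=1)
circular = np.stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)], axis=1)
gaze = linear + np.random.default_rng(1).normal(scale=0.02, size=linear.shape)
print(select_target(gaze, [linear, circular]))  # -> 0 (the linear fish)
```

Because selection depends only on how well the gaze trajectory matches a target's relative motion, no per-user calibration is required, which is what enables the spontaneous interaction the field study examines.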
Andreas Bulling. **Human Visual Behaviour for Collaborative Human-Machine Interaction.** In Adj. Proc. of the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015), pp. 903-907, 2015. doi: [10.1145/2800835.2815378](https://doi.org/10.1145/2800835.2815378). [PDF](https://perceptual.mpi-inf.mpg.de/files/2015/11/p901-bulling.pdf)

*Abstract:* Non-verbal behavioural cues are fundamental to human communication and interaction. Despite significant advances in recent years, state-of-the-art human-machine systems still fall short in sensing, analysing, and fully "understanding" cues naturally expressed in everyday settings. Two of the most important non-verbal cues, as evidenced by a large body of work in experimental psychology and behavioural sciences, are visual (gaze) behaviour and body language. We envision a new class of collaborative human-machine systems that fully exploit the information content available in non-verbal human behaviour in everyday settings through joint analysis of human gaze and physical behaviour.
Augusto Esteves, Eduardo Velloso, Andreas Bulling, Hans Gellersen. **Orbits: Enabling Gaze Interaction in Smart Watches using Moving Targets.** In Adj. Proc. of the ACM International Joint Conference on Pervasive and Ubiquitous Computing (UbiComp 2015), pp. 419-422, 2015. doi: [10.1145/2800835.2800942](https://doi.org/10.1145/2800835.2800942). [PDF](https://perceptual.mpi-inf.mpg.de/files/2015/08/Esteves15_UbiComp.pdf) | [press coverage (Wired UK)](http://www.wired.co.uk/news/archive/2016-01/22/eye-tracking-smartwatch)