Article in Personal and Ubiquitous Computing
The following article was accepted to Personal and Ubiquitous Computing:
Yanxia Zhang; Ming Ki Chong; Jörg Müller; Andreas Bulling; Hans Gellersen: Eye Tracking for Public Displays in the Wild. Journal Article. Personal and Ubiquitous Computing, 19 (5), pp. 967-981, 2015.

@article{Zhang15_PUC,
  title     = {Eye Tracking for Public Displays in the Wild},
  author    = {Yanxia Zhang and Ming Ki Chong and Jörg Müller and Andreas Bulling and Hans Gellersen},
  url       = {https://perceptual.mpi-inf.mpg.de/files/2015/07/Zhang15_UC.pdf},
  doi       = {10.1007/s00779-015-0866-8},
  year      = {2015},
  date      = {2015-07-03},
  journal   = {Personal and Ubiquitous Computing},
  volume    = {19},
  number    = {5},
  pages     = {967-981},
  abstract  = {In public display contexts, interactions are spontaneous and have to work without preparation. We propose gaze as a modality for such contexts, as gaze is always at the ready and a natural indicator of the user's interest. We present GazeHorizon, a system that demonstrates spontaneous gaze interaction, enabling users to walk up to a display and navigate content using their eyes only. GazeHorizon is extemporaneous and optimised for instantaneous usability by any user without prior configuration, calibration or training. The system provides interactive assistance to bootstrap gaze interaction with unaware users, employs a single off-the-shelf web camera and computer vision for person-independent tracking of the horizontal gaze direction, and maps this input to rate-controlled navigation of horizontally arranged content. We evaluated GazeHorizon through a series of field studies, culminating in a four-day deployment in a public environment during which over a hundred passers-by interacted with it, unprompted and unassisted. We found that since eye movements are subtle, users cannot learn gaze interaction merely by observing others, and as a result guidance is required.},
  keywords  = {},
  pubstate  = {published},
  tppubtype = {article}
}
Tags: Journal, Publication