@inproceedings{Esteves_UIST15,
title = {Orbits: Gaze Interaction in Smart Watches using Moving Targets},
author = {Augusto Esteves and Eduardo Velloso and Andreas Bulling and Hans Gellersen},
url = {https://perceptual.mpi-inf.mpg.de/files/2015/09/Esteves_UIST15.pdf
https://www.youtube.com/watch?v=KEIgw5A0yfI
http://www.wired.co.uk/news/archive/2016-01/22/eye-tracking-smartwatch},
doi = {10.1145/2807442.2807499},
year = {2015},
date = {2015-11-01},
booktitle = {Proc. of the 28th ACM Symposium on User Interface Software and Technology (UIST 2015)},
pages = {457--466},
abstract = {We introduce Orbits, a novel gaze interaction technique that enables hands-free input on smart watches. The technique relies on moving controls to leverage the smooth pursuit movements of the eyes and detect whether, and at which control, the user is looking. In Orbits, controls include targets that move in a circular trajectory on the face of the watch, and can be selected by following the desired one for a small amount of time. We conducted two user studies to assess the technique’s recognition and robustness, which demonstrated that Orbits is robust against false positives triggered by natural eye movements and that it presents a hands-free, high-accuracy way of interacting with smart watches using off-the-shelf devices. Finally, we developed three example interfaces built with Orbits: a music player, a notifications face plate and a missed call menu. Despite relying on moving controls – very unusual in current HCI interfaces – these were generally well received by participants in a third and final study.},
note = {best paper award},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}