2 papers at MobileHCI 2018
The following papers were accepted at the 20th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI 2018):
Julian Steil, Philipp Müller, Yusuke Sugano, Andreas Bulling. **Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors.** In Proc. International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), pp. 1:1–1:13, 2018. Best paper award. [PDF](https://wp.mpi-inf.mpg.de/perceptual/files/2018/07/steil18_mobilehci.pdf) | DOI: [10.1145/3229434.3229439](https://doi.org/10.1145/3229434.3229439)

**Abstract:** Visual attention is highly fragmented during mobile interactions, but the erratic nature of attention shifts currently limits attentive user interfaces to adapting after the fact, i.e., after shifts have already happened. We instead study attention forecasting – the challenging task of predicting users' gaze behavior (overt visual attention) in the near future. We present a novel long-term dataset of everyday mobile phone interactions, continuously recorded from 20 participants engaged in common activities on a university campus over 4.5 hours each (more than 90 hours in total). We propose a proof-of-concept method that uses device-integrated sensors and body-worn cameras to encode rich information on device usage and users' visual scene. We demonstrate that our method can forecast bidirectional attention shifts and whether the primary attentional focus is on the handheld mobile device. We study the impact of different feature sets on performance and discuss the significant potential, but also the remaining challenges, of forecasting user attention during mobile interactions.

```bibtex
@inproceedings{steil18_mobilehci,
  title     = {Forecasting User Attention During Everyday Mobile Interactions Using Device-Integrated and Wearable Sensors},
  author    = {Julian Steil and Philipp Müller and Yusuke Sugano and Andreas Bulling},
  booktitle = {Proc. International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI)},
  pages     = {1:1--1:13},
  year      = {2018},
  doi       = {10.1145/3229434.3229439},
  url       = {https://wp.mpi-inf.mpg.de/perceptual/files/2018/07/steil18_mobilehci.pdf},
  note      = {best paper award},
}
```
Mohamed Khamis, Florian Alt, Andreas Bulling. **The Past, Present, and Future of Gaze-enabled Handheld Mobile Devices: Survey and Lessons Learned.** In Proc. International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI), pp. 38:1–38:17, 2018. Best paper honourable mention award. [PDF](https://perceptual.mpi-inf.mpg.de/files/2018/05/khamis18_mobilehci.pdf) | DOI: [10.1145/3229434.3229452](https://doi.org/10.1145/3229434.3229452)

**Abstract:** While first-generation mobile gaze interfaces required special-purpose hardware, recent advances in computational gaze estimation and the availability of sensor-rich and powerful devices are finally fulfilling the promise of pervasive eye tracking and eye-based interaction on off-the-shelf mobile devices. This work provides the first holistic view on the past, present, and future of eye tracking on handheld mobile devices. To this end, we discuss how research developed from building hardware prototypes to accurate gaze estimation on unmodified smartphones and tablets. We then discuss implications by laying out (1) novel opportunities, including pervasive advertising and conducting in-the-wild eye tracking studies on handhelds, and (2) new challenges that require further research, such as visibility of the user's eyes, lighting conditions, and privacy implications. We discuss how these developments may shape MobileHCI research in the future, possibly over the next 20 years.

```bibtex
@inproceedings{khamis18_mobilehci,
  title     = {The Past, Present, and Future of Gaze-enabled Handheld Mobile Devices: Survey and Lessons Learned},
  author    = {Mohamed Khamis and Florian Alt and Andreas Bulling},
  booktitle = {Proc. International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI)},
  pages     = {38:1--38:17},
  year      = {2018},
  doi       = {10.1145/3229434.3229452},
  url       = {https://perceptual.mpi-inf.mpg.de/files/2018/05/khamis18_mobilehci.pdf},
  note      = {best paper honourable mention award},
}
```