Paper at ICMI 2016
We will present the following paper at the 18th ACM International Conference on Multimodal Interaction (ICMI 2016):
Murtaza Dhuliawala, Juyoung Lee, Junichi Shimizu, Andreas Bulling, Kai Kunze, Thad Starner, Woontack Woo. "Smooth Eye Movement Interaction Using EOG Glasses." In Proc. of the International Conference on Multimodal Interaction (ICMI), pp. 307–311, 2016. DOI: [10.1145/2993148.2993181](https://doi.org/10.1145/2993148.2993181). [PDF](https://perceptual.mpi-inf.mpg.de/wp-content/blogs.dir/12/files/2016/09/dhuliawala16_icmi.pdf)

Abstract: Orbits combines a visual display and an eye motion sensor to allow a user to select between options by tracking a cursor with the eyes as the cursor travels in a circular path around each option. Using an off-the-shelf J!NS MEME pair of eyeglasses, we present a pilot study suggesting that the eye movement required for Orbits can be sensed using three electrodes: one in the nose bridge and one in each nose pad. For forced-choice binary selection, we achieve a 2.6 bits per second (bps) input rate at 250 ms per input. We also introduce Head Orbits, where the user fixates the eyes on a target and moves the head in synchrony with the orbiting target. Measuring only the relative movement of the eyes in relation to the head, this method achieves a maximum rate of 2.0 bps at 500 ms per input. Finally, we combine the two techniques with a gyro to create an interface with a maximum input rate of 5.0 bps.
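The selection principle behind Orbits-style interfaces is smooth-pursuit matching: the eye movement recorded over a short window is compared against the trajectory of each orbiting cursor, and the target whose orbit the eyes follow most closely is selected. The sketch below is not the authors' implementation; it assumes a simple per-window Pearson correlation between the horizontal/vertical eye-movement signals (e.g. derived from EOG) and each orbit's x/y path, and the function names, window length, and threshold are illustrative.

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation between two equal-length 1-D signals."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.mean(a * b))

def select_orbit(eog_h, eog_v, orbits, threshold=0.8):
    """Return the index of the orbit the eyes follow best, or None.

    eog_h, eog_v : 1-D arrays of horizontal/vertical eye movement
                   over the analysis window
    orbits       : list of (x_traj, y_traj) arrays, one per target,
                   sampled over the same window
    A target is selected only if both axes correlate above threshold.
    """
    best, best_score = None, threshold
    for i, (x_traj, y_traj) in enumerate(orbits):
        r_h = pearson(eog_h, x_traj)
        r_v = pearson(eog_v, y_traj)
        score = min(r_h, r_v)  # both axes must track the orbit
        if score > best_score:
            best, best_score = i, score
    return best

# Toy example: two targets orbiting in opposite directions at 1 Hz,
# observed over a 500 ms window at 100 Hz; the "eyes" follow target 0.
fs, dur = 100, 0.5
t = np.arange(0, dur, 1 / fs)
orbit_cw = (np.cos(2 * np.pi * t), np.sin(2 * np.pi * t))
orbit_ccw = (np.cos(2 * np.pi * t), -np.sin(2 * np.pi * t))
eog_h = orbit_cw[0] + 0.1 * np.random.randn(t.size)
eog_v = orbit_cw[1] + 0.1 * np.random.randn(t.size)
print(select_orbit(eog_h, eog_v, [orbit_cw, orbit_ccw]))  # -> 0
```

Because the comparison uses only the relative shape of the motion, the same matching idea carries over to Head Orbits, where the eyes stay fixed on the target and the compensatory eye-in-head movement (or a gyro signal) supplies the trajectory to correlate.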
Tags: Conference, Paper