Paper at Eurographics 2018
We will present the following paper at Eurographics (EG 2018):
Erroll Wood, Tadas Baltrusaitis, Louis-Philippe Morency, Peter Robinson, Andreas Bulling. **GazeDirector: Fully Articulated Eye Gaze Redirection in Video.** Computer Graphics Forum, 37(2), pp. 217–225, 2018. Best paper honourable mention award.

[PDF](https://perceptual.mpi-inf.mpg.de/files/2018/03/wood18_eg.pdf) | [Video](https://youtu.be/rSNUGciJH6A) | DOI: [10.1111/cgf.13355](https://doi.org/10.1111/cgf.13355)

**Abstract.** We present GazeDirector, a new approach for eye gaze redirection that uses model-fitting. Our method first tracks the eyes by fitting a multi-part eye region model to video frames using analysis-by-synthesis, thereby recovering eye region shape, texture, pose, and gaze simultaneously. It then redirects gaze by 1) warping the eyelids from the original image using a model-derived flow field, and 2) rendering and compositing synthesized 3D eyeballs onto the output image in a photorealistic manner. GazeDirector allows us to change where people are looking without person-specific training data, and with full articulation, i.e. we can precisely specify new gaze directions in 3D. Quantitatively, we evaluate both model-fitting and gaze synthesis, with experiments for gaze estimation and redirection on the Columbia gaze dataset. Qualitatively, we compare GazeDirector against recent work on gaze redirection, showing better results especially for large redirection angles. Finally, we demonstrate gaze redirection on YouTube videos by introducing new 3D gaze targets and by manipulating visual behavior.

```bibtex
@article{wood18_eg,
  title   = {GazeDirector: Fully Articulated Eye Gaze Redirection in Video},
  author  = {Erroll Wood and Tadas Baltrusaitis and Louis-Philippe Morency and Peter Robinson and Andreas Bulling},
  journal = {Computer Graphics Forum},
  volume  = {37},
  number  = {2},
  pages   = {217--225},
  year    = {2018},
  doi     = {10.1111/cgf.13355},
  url     = {https://perceptual.mpi-inf.mpg.de/files/2018/03/wood18_eg.pdf},
  note    = {Best paper honourable mention award}
}
```
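The redirection step described in the abstract combines two basic image operations: warping pixels by a model-derived flow field, and alpha-compositing a rendered eyeball layer onto the frame. A minimal NumPy sketch of those two primitives follows; the function names, the nearest-neighbour lookup, and the flow-field convention are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def warp_with_flow(image, flow):
    """Warp an image by a dense flow field (nearest-neighbour lookup).

    flow[y, x] holds the (dy, dx) offset pointing at the source pixel
    for output location (y, x); offsets are clipped to the image bounds.
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys + flow[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs + flow[..., 1]).astype(int), 0, w - 1)
    return image[src_y, src_x]

def composite(background, foreground, alpha):
    """Alpha-composite a rendered layer (e.g. a synthetic eyeball)
    onto a frame. background/foreground are HxWx3, alpha is HxW in [0, 1]."""
    a = alpha[..., None]
    return a * foreground + (1.0 - a) * background
```

In this toy form, a zero flow field leaves the eyelid region untouched, and an alpha mask of 1 inside the eyeball silhouette replaces those pixels entirely with the rendered layer; the paper's photorealistic pipeline additionally derives both the flow field and the eyeball render from the fitted 3D eye region model.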