Paper at IUI 2018
We will present the following paper at the ACM International Conference on Intelligent User Interfaces (IUI 2018):
Philipp Müller, Michael Xuelin Huang, Andreas Bulling. Detecting Low Rapport During Natural Interactions in Small Groups from Non-Verbal Behaviour. In Proc. ACM International Conference on Intelligent User Interfaces (IUI), pp. 153-164, 2018. doi: 10.1145/3172944.3172969. PDF: https://perceptual.mpi-inf.mpg.de/files/2018/01/mueller2018_iui.pdf

Abstract: Rapport, the close and harmonious relationship in which interaction partners are "in sync" with each other, has been shown to result in smoother social interactions, improved collaboration, and improved interpersonal outcomes. In this work, we are the first to investigate automatic prediction of low rapport during natural interactions within small groups. This task is challenging given that rapport only manifests in subtle non-verbal signals that are, in addition, subject to influences of group dynamics as well as inter-personal idiosyncrasies. We record videos of unscripted discussions of three to four people using a multi-view camera system and microphones. We analyse a rich set of non-verbal signals for rapport detection, namely facial expressions, hand motion, gaze, speaker turns, and speech prosody. Using facial features, we can detect low rapport with an average precision of 0.7 (chance level at 0.25), while incorporating prior knowledge of participants' personalities can even achieve early prediction without a drop in performance. We further provide a detailed analysis of different feature sets and the amount of information contained in different temporal segments of the interactions.
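The abstract quotes an average precision of 0.7 against a chance level of 0.25. As a rough illustration (not the paper's code), the chance level corresponds to the prevalence of the "low rapport" class: a scorer that ranks samples at random achieves an average precision close to the fraction of positive samples. The sketch below assumes scikit-learn and a hypothetical evaluation set in which one in four samples is labelled low rapport.

```python
# Minimal sketch (illustrative only, not the paper's implementation):
# average precision of a random scorer converges to the positive-class
# prevalence, which is why a 25% "low rapport" rate gives a 0.25 chance level.
import numpy as np
from sklearn.metrics import average_precision_score

rng = np.random.default_rng(0)

n = 10_000                                   # hypothetical number of evaluation samples
y_true = (rng.random(n) < 0.25).astype(int)  # 25% positives ("low rapport")

y_random = rng.random(n)                     # uninformative scores -> AP near 0.25
print(f"random-score AP:   {average_precision_score(y_true, y_random):.2f}")

# Scores that correlate with the label push AP well above the prevalence.
y_informed = 0.6 * y_true + 0.4 * rng.random(n)
print(f"informed-score AP: {average_precision_score(y_true, y_informed):.2f}")
```

Under these assumptions the first score prints roughly 0.25 and the second well above it, which is the sense in which the reported 0.7 should be read against its 0.25 chance level.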