Intrapersonal dependencies in multimodal behavior
Blomsma, Pieter A.; Linders, Guido M.; Vaitonyte, Julija; Louwerse, Max M.
Abstract
Human interlocutors automatically adapt verbal and non-verbal signals so that different behaviors become synchronized over time. Multimodal communication comes naturally to humans, while this is not the case for Embodied Conversational Agents (ECAs). Knowing which behavioral channels synchronize within and across speakers and how they align seems critical in the development of ECAs. Yet, there exists little data-driven research that provides guidelines for the synchronization of different channels within an interlocutor. This study focuses on intrapersonal dependencies of multimodal behavior by using cross-recurrence analysis on a multimodal communication dataset to better understand the temporal relationships between language and gestural behavior channels. By shedding light on the intrapersonal synchronization of communicative channels in humans, we provide an initial manual for modality synchronization in ECAs.
CCS CONCEPTS
· Human-centered computing → Empirical studies in HCI; · Computing methodologies → Discourse, dialogue and pragmatics.
Date
2020-10-20
Publisher
Association for Computing Machinery
Citation
Blomsma, P A, Linders, G M, Vaitonyte, J & Louwerse, M M 2020, 'Intrapersonal dependencies in multimodal behavior', in IVA '20, Association for Computing Machinery, pp. 1-8, Glasgow, United Kingdom, 19/10/20. https://doi.org/10.1145/3383652.3423872
