Predicting conversational turns: Signers' and nonsigners' sensitivity to language-specific and globally accessible cues
de Vos, C.; Casillas, Marisa; Uittenbogert, Tom; Crasborn, Onno; Levinson, Stephen C.
Abstract
Precision turn-taking may constitute a crucial part of the human endowment for communication. If so, it should be implemented similarly across language modalities, as in signed vs. spoken language. Here, in the first experimental study of turn-end prediction in sign language, we find support for the idea that signed language, like spoken language, involves turn-type prediction and turn-end anticipation. In both cases, turns like questions that elicit specific responses accelerate anticipation. We also show remarkable cross-modality predictive capacity: nonsigners anticipate signed turn ends surprisingly well. Finally, we show that despite nonsigners' ability to intuitively predict signed turn ends, early native signers do it much better by using their access to linguistic signals (here, question markers). As shown in prior work, question formation facilitates prediction, and age of sign language acquisition affects accuracy. The study thus sheds light on the kinds of features that may facilitate turn-taking universally, and those that are language-specific.
Description
Funding Information: With a project this size, we inevitably have many people to thank for their contributions to this article. First and foremost, we would like to thank all of our participants, deaf and hearing, for their time and attention in participating in these perception experiments. Our experiments were carried out in a mobile lab at various locations, and we are extremely grateful for the hospitality and generosity we received in each of these locations (Stichting Welzijn Doven Rotterdam, Stichting Welzijn Doven Amsterdam, Utrecht University of Applied Sciences, De Gebarenkorf, Stichting Welzijn Doven Drenthe, Koninklijke Kentalis Zoetermeer) and are highly indebted to the deaf and hearing families who allowed us to park on their sidewalks (Fam. Hartzema, Fam. Uittenbogert, Iris Wijnen & Nico Borst, and Mirjam-Iris Crox & Thomas op de Coul). We also thank all four deaf signers, who preferred not to be mentioned here by name, for allowing us to record their spontaneous conversations as a basis for the stimulus materials, and Nick Wood† for editing them. We also thank Merel van Zuilen for modeling the images in Figure 1, Ellen Nauta and Marjolein Ankone for their assistance in data collection, and Frouke van Winsum for her help in data transcription. Anonymized experiment data and scripts to replicate the reported analyses are available at https://github.com/marisacasillas/NGT-Turn_end_prediction. All analyses were completed in R, an open-source software environment (R Core Team 2017). In addition to the anonymized experiment data and scripts available at the link above, the stimuli and experiment scripts are available on request. A summary of this article in Sign Language of the Netherlands is available at https://www.gebareninzicht.nl/onderzoeksresultaten. We gratefully acknowledge funding support from the ERC Starting grant 852352-ELISA (CV), the NWO VICI grant 277-70-014 (OC), the NWO Veni grant 275-89-033 (MC), the NWO Veni grant 016-164-155 (CV), and the ERC Advanced grant 269484-INTERACT (SCL). Printed with the permission of Connie de Vos, Marisa Casillas, Tom Uittenbogert, Onno Crasborn, & Stephen C. Levinson. © 2022. Publisher Copyright: © 2022, Linguistic Society of America. All rights reserved.
Date
2022-03-01
Keywords
Turn-taking, Turn-end anticipation, Interactional Linguistics, Conversation Analysis, Discourse processing, Sign Language of the Netherlands, Gesture
Citation
de Vos, C., Casillas, M., Uittenbogert, T., Crasborn, O. & Levinson, S. C. 2022, 'Predicting conversational turns: Signers' and nonsigners' sensitivity to language-specific and globally accessible cues', Language, vol. 98, no. 1, pp. 35-62. https://doi.org/10.1353/lan.2021.0085
