@article{e2e4d311a982451c803d8e169873fe4a,
title = "Catchments, prosody and discourse",
abstract = "We present our work on the temporal integration of hierarchies of communicative actions: kinesic, prosodic and discursive. We use the device of the {\textquoteleft}catchment{\textquoteright} as the locus around which this integration proceeds. We present a detailed case study of a gesture and speech elicitation experiment in which a subject describes her living space to an interlocutor. First, we process the video data to obtain the motion traces of both of the subject{\textquoteright}s hands using the vector coherence mapping algorithm. We code the gestures to identify the catchments. We recover discourse purposes utilizing a system of guided questions. Finally, we define prosody in terms of the ToBI system. The results of these analyses are compared against the computed motion traces to identify the cues accessible in the gestural and audio data that correlate well with the psycholinguistic analyses. The results show that motion, prosody and discourse structure are integrated at each moment of speaking.",
keywords = "Catchments, Discourse, Discourse purposes, Gesture groups, Temporal integration, ToBI",
author = "{McNeill}, David and Francis Quek and {McCullough}, {Karl-Erik} and Susan Duncan and Nobuhiro Furuyama and Robert Bryll and Ma, {Xin-Feng} and Rashid Ansari",
note = "Funding Information: * This research and the preparation of this paper have been supported by the Spencer Foundation, the U.S. National Science Foundation STIMULATE program, Grant No. IRI-9618887, “Gesture, Speech, and Gaze in Discourse Segmentation”, and the National Science Foundation KDI program, Grant No. BCS-9980054, “Cross-Modal Analysis of Signal and Sense: Multimedia Corpora and Tools for Gesture, Speech, and Gaze Research”. 1. Surprisingly little published work following up on Kendon{\textquoteright}s original insights currently exists. 2. Notation: Boxes above each line are numbered F0 segments recovered by ESPS/waves+ (breaks in F0 shorter than 0.25 seconds disregarded). The same numbering scheme is followed in Figs. 3–5. Catchments are numbered below each line of text. Gestures also are numbered below each line when a given line has more than one gesture. Square brackets show the hand{\textquoteright}s motion relative to speech — {\textquoteleft}[{\textquoteright} is the onset of a gesture phrase, {\textquoteleft}]{\textquoteright} is its end. Brackets are doubled or tripled when successive gesture phrases are not separated by local rests. Boldface shows the stroke phase of the gesture — the phase with semantic content and the quality of {\textquoteleft}effort{\textquoteright}. Beats are not considered to have strokes. Unbolded speech within square brackets and before the stroke is the preparation phase; after it is the retraction phase. Underlining shows holds — the hand(s) held in midair (both prestroke and poststroke holds; cf. Kita, 1990). An asterisk is a self-interruption of speech. Comments in angled brackets are gesture meanings (<move up back staircase> for example). One or more slashes show silent pauses of increasing duration (bolded when the stroke starts and/or continues during the pause). <uh> is a filled pause (schwa, in this example). Double letters show vowel elongation, and # an audible breath pause.
% smack is an audible lip smack. PTD=palms toward down; PTB=palms toward body; PAB=palms away from body; AB=away from body; RC=right center space; BH/mirror=hands move in similar ways but in opposite directions; TC=toward center space; B (flat palm), G (index finger extended), 5 (all fingers extended) hand shapes from ASL finger spelling. Diacritics show repeated gestures. Left and right hands coded separately when form and/or timing differs. Motion notation as in McNeill (1992).",
year = "2001",
doi = "10.1075/gest.1.1.03mcn",
language = "English",
volume = "1",
pages = "9--33",
journal = "Gesture",
issn = "1568-1475",
publisher = "John Benjamins Publishing Company",
number = "1",
}