Evidence for children’s online integration of simultaneous information from speech and iconic gestures: an ERP study

Kazuki Sekine*, Christina Schoechl, Kimberley Mulder, Judith Holler, Spencer Kelly, Reyhan Furman, Asli Özyürek

*Corresponding author for this work

Research output: Article › peer-review

5 Citations (Scopus)

Abstract

Children perceive iconic gestures along with the speech they hear. Previous studies have shown that children integrate information from both modalities. Yet it is not known whether children can integrate both types of information simultaneously as soon as they are available (as adults do) or whether they initially process them separately and integrate them later. Using electrophysiological measures, we examined the online neurocognitive processing of gesture-speech integration in 6- to 7-year-old children. We focused on the N400 event-related potential component, which is modulated by semantic integration load. Children watched video clips of matching or mismatching gesture-speech combinations, which varied the semantic integration load. The ERPs showed that the amplitude of the N400 was larger in the mismatching condition than in the matching condition. This finding provides the first neural evidence that by the age of 6 or 7, children integrate multimodal semantic information in an online fashion comparable to that of adults.

Original language: English
Pages (from-to): 1283-1294
Number of pages: 12
Journal: Language, Cognition and Neuroscience
Volume: 35
Issue number: 10
DOI
Publication status: Published - Dec 2020

ASJC Scopus subject areas

  • Language and Linguistics
  • Experimental and Cognitive Psychology
  • Linguistics and Language
  • Cognitive Neuroscience
