Watching non-corresponding gestures helps learners with high visuospatial ability to learn about movements with dynamic visualizations: An fNIRS study

Birgit Brucker, Björn de Koning, Ann Christine Ehlis, David Rosenbaum, Peter Gerjets

Research output: Chapter/Conference proceeding › Conference proceeding › Academic › peer-review

1 Citation (Scopus)

Abstract

This study investigates whether making and observing (human) gestures facilitates learning about non-human biological movements and whether correspondence between the gesture and the to-be-learned movement is superior to non-correspondence. Functional near-infrared spectroscopy (fNIRS) was used to address whether gestures activate the human mirror-neuron system (hMNS) and whether this activation mediates the facilitation of learning. During learning, participants viewed animations of the to-be-learned movements twice. Depending on the condition, the second viewing was supplemented with a self-gesturing instruction (yes/no) and/or a gesture video (corresponding/non-corresponding/none). Results showed that learners with high visuospatial ability achieved better learning outcomes with non-corresponding gestures, whereas these gestures were detrimental for learners with low visuospatial ability. Furthermore, activation of the inferior-parietal cortex (part of the hMNS) tended to predict better learning outcomes. Unexpectedly, making gestures did not influence learning, but cortical activation differed for learners who self-gestured depending on which gesture they observed. Results and implications are discussed.

Original language: English
Title of host publication: CogSci 2017 - Proceedings of the 39th Annual Meeting of the Cognitive Science Society
Subtitle of host publication: Computational Foundations of Cognition
Pages: 168-173
Number of pages: 6
ISBN (Electronic): 9780991196760
Publication status: Published - 2017
Event: 39th Annual Meeting of the Cognitive Science Society: Computational Foundations of Cognition, CogSci 2017 - London, United Kingdom
Duration: 26 Jul 2017 - 29 Jul 2017

Publication series

Series: CogSci 2017 - Proceedings of the 39th Annual Meeting of the Cognitive Science Society: Computational Foundations of Cognition

Conference

Conference: 39th Annual Meeting of the Cognitive Science Society: Computational Foundations of Cognition, CogSci 2017
Country/Territory: United Kingdom
City: London
Period: 26/07/17 - 29/07/17

Bibliographical note

Publisher Copyright:
© CogSci 2017.

Research programs

  • ESSB PSY
