There is increasing evidence that learning manual tasks from dynamic visualizations (e.g., origami folding) is facilitated when human hands or gestures can be observed in the visualizations. This study examined whether observing and making gestures improves learning about non-human biological movements (i.e., fish locomotion) and whether gestures that correspond to the to-be-learned movement are superior to non-corresponding gestures. Moreover, learners’ visuospatial ability was assessed as a possible moderator. Regarding underlying neurophysiological processes, functional near-infrared spectroscopy (fNIRS) was used to investigate whether gestures activate the human mirror-neuron system (hMNS) and whether this activation mediates the facilitation of learning. During learning, participants viewed animations supplemented with a self-gesturing instruction (yes/no) crossed with a gesture video (corresponding/non-corresponding/no gesture), resulting in six conditions (2 × 3 between-subjects design). Results showed that higher-visuospatial-ability learners benefitted from learning with non-corresponding gestures, whereas those gestures were detrimental for lower-visuospatial-ability learners. Furthermore, activation of the inferior frontal cortex (part of the hMNS) tended to predict better learning outcomes. Making gestures did not influence learning, but participants observing corresponding gestures showed higher inferior frontal cortex activation when they self-gestured than when they did not. Implications of the results for the design of instructional materials are discussed.