- Hearing Impairment and Communication
- Action Observation and Synchronization
- Hand Gesture Recognition Systems
- Speech and dialogue systems
- Animal Vocal Communication and Behavior
- Language, Metaphor, and Cognition
- Tactile and Sensory Interactions
- Child and Animal Learning Development
- Neuroscience and Music Perception
- Multisensory perception and integration
- Motor Control and Adaptation
- Language, Discourse, Communication Strategies
- Neurobiology of Language and Bilingualism
- Speech and Audio Processing
- Music Technology and Sound Studies
- Music and Audio Processing
- Language and cultural evolution
- Visual perception and processing mechanisms
- Embodied and Extended Cognition
- Robotics and Automated Systems
- Visual and Cognitive Learning Processes
- Hemispheric Asymmetry in Neuroscience
- Phonetics and Phonology Research
- Intelligent Tutoring Systems and Adaptive Learning
- Subtitles and Audiovisual Media
Radboud University Nijmegen
2019-2024
Google (United States)
2024
Max Planck Institute for Psycholinguistics
2019-2022
University of Connecticut
2018-2020
Universitat Oberta de Catalunya
2019-2020
Erasmus University Rotterdam
2013-2019
University of Wollongong
2015-2019
Vrije Universiteit Amsterdam
2012
Gestures are often considered to be demonstrative of the embodied nature of the mind (Hostetter & Alibali, 2008). In this article we review current theories and research targeted at the intra-cognitive role of gestures. We ask the question: how can gestures support internal cognitive processes of the gesturer? We suggest that extant theories are in a sense disembodied, because they focus solely on embodiment in terms of the sensorimotor neural precursors of gestures. As a result, they lack the explanatory scope to address how gestures-as-bodily-acts fulfill a cognitive function. On...
Abstract There is increasing evidence that hand gestures and speech synchronize their activity on multiple dimensions and timescales. For example, gesture’s kinematic peaks (e.g., maximum speed) are coupled with prosodic markers in speech. Such coupling operates on very short timescales at the level of syllables (200 ms), and therefore requires high-resolution measurement of gesture kinematics and speech acoustics. High-resolution analysis is common for acoustic studies, given the field’s classic ties to (psycho)linguistics. However,...
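The peak-coupling analysis described above can be sketched in a few lines: locate the kinematic peak (maximum speed) and the prosodic peak (maximum F0) in time-aligned traces, then compute their lag. The traces and sampling rate below are illustrative toy values, not data from the study.

```python
def local_peaks(xs):
    """Indices of samples strictly greater than both neighbors."""
    return [i for i in range(1, len(xs) - 1) if xs[i - 1] < xs[i] > xs[i + 1]]

# Toy time-aligned traces sampled at 100 Hz (10 ms per sample).
speed = [0, 1, 3, 7, 9, 6, 2, 1, 0, 0]   # wrist speed; kinematic peak at index 4
f0    = [0, 0, 1, 2, 4, 6, 8, 5, 2, 0]   # fundamental frequency; prosodic peak at index 6

# Take the highest local peak in each trace.
kin_peak = max(local_peaks(speed), key=lambda i: speed[i])
f0_peak  = max(local_peaks(f0), key=lambda i: f0[i])

# Positive lag: the F0 peak follows the kinematic peak.
lag_ms = (f0_peak - kin_peak) * 10
print(lag_ms)  # 20
```

With real data the same logic would run over smoothed motion-tracking and pitch-tracked audio, where the 10 ms sampling step is what makes syllable-scale (200 ms) coupling measurable at all.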
We show that the human voice has complex acoustic qualities that are directly coupled to peripheral musculoskeletal tensioning of the body, such as subtle wrist movements. In this study, vocalizers produced a steady-state vocalization while rhythmically moving the wrist or the arm at different tempos. Although listeners could only hear and not see the vocalizer, they were able to completely synchronize their own rhythmic movement with that of the vocalizer, which they perceived in the acoustics. This study corroborates recent evidence...
The phenomenon of gesture-speech synchrony involves tight coupling of prosodic contrasts in gesture movement (e.g., peak velocity) and in speech (e.g., peaks in fundamental frequency; F0). Gesture-speech synchrony has been understood as completely governed by sophisticated neural-cognitive mechanisms. However, it may have its original basis in the resonating forces that travel through the body. In the current preregistered study, movements with high physical impact affected phonation in line with what is observed in natural contexts. Rhythmic...
Abstract Gesture–speech synchrony re‐stabilizes when hand movement or speech is disrupted by a delayed feedback manipulation, suggesting strong bidirectional coupling between gesture and speech. Yet it has also been argued from case studies in perceptual–motor pathology that gestures are a special kind of action that does not require closed‐loop re‐afferent feedback to maintain synchrony with speech. In the current pre‐registered within‐subject study, we used motion tracking to conceptually replicate McNeill's ( ) classic study...
Expressive moments in communicative hand gestures often align with emphatic stress in speech. It has recently been found that acoustic markers of emphasis arise naturally during steady-state phonation when upper-limb movements impart physical impulses on the body, most likely affecting acoustics via respiratory activity. In this confirmatory study, participants (N = 29) repeatedly uttered consonant-vowel (/pa/) mono-syllables while moving the upper limbs in particular phase relations with speech, or while not moving the upper limbs. This study...
Abstract How does communicative efficiency shape language use? We approach this question by studying it at the level of the dyad, and in terms of multimodal utterances. We investigate whether and how people minimize their joint speech and gesture efforts in face-to-face interactions, using linguistic and kinematic analyses. We zoom in on other-initiated repair—a conversational microcosm where interlocutors coordinate their utterances to solve problems with perceiving or understanding. We find that the spoken and gestural modalities are wielded in parallel...
Humans typically move and vocalize in a time-synchronized fashion, aligning prominence-lending hand movements with acoustically emphasized syllables. This requires complex coordination. When speaking a foreign language, learners often place prominence on the wrong syllable of a word, which contributes to a noticeable accent. In this pre-registered kinematic-acoustic study, we test whether accent is also present in the timing of co-speech manual movements. Results demonstrate a ‘kinematic accent’ in Dutch speakers of Spanish producing...
We introduce the EnvisionHGdetector toolkit (v1.0.0.1), which allows for kinematic analysis of automatically detected co-speech gestures from single-person videos. The convolutional neural network model for detecting gestures is trained using TensorFlow on five open datasets – ZHUBO, SaGA, MULTISIMO, ECOLANG, and TEDM3D – which, combined, represent over 8,000 instances of gestures with varying recording conditions. Inspired by, and further building on, the action detection tool Nodding Pigeon [57], it utilizes MediaPipe...
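A typical downstream step in a pipeline like this is turning a tracked keypoint (e.g., a wrist landmark from a pose estimator such as MediaPipe) into a kinematic feature like peak speed. This is a minimal sketch with a hypothetical wrist track in normalized image coordinates; the `wrist_speed` helper and its values are illustrative, not part of the toolkit's API.

```python
import math

def wrist_speed(xy, fps=25.0):
    """Frame-to-frame speed (units/s) from a list of (x, y) wrist positions."""
    return [math.hypot(x1 - x0, y1 - y0) * fps
            for (x0, y0), (x1, y1) in zip(xy, xy[1:])]

# Hypothetical per-frame wrist positions for one detected gesture (25 fps).
track = [(0.50, 0.50), (0.50, 0.52), (0.50, 0.58),
         (0.50, 0.70), (0.50, 0.74), (0.50, 0.75)]

speeds = wrist_speed(track)
peak_speed = max(speeds)  # peak speed of the gesture stroke
```

In practice one would first smooth the landmark trajectories before differentiating, since frame-to-frame tracking jitter otherwise dominates the speed signal.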
During silent problem solving, hand gestures arise that have no communicative intent. The role of such co-thought gestures in cognition has been understudied in cognitive research as compared to co-speech gestures. We investigated whether gesticulation during silent problem solving supported subsequent performance on a Tower of Hanoi problem-solving task, in relation to visual working-memory capacity and task complexity. Seventy-six participants were assigned to either an instructed gesture condition or a condition that allowed them to gesture, but...
The split-attention effect entails that learning from spatially separated, but mutually referring information sources (e.g., text and picture) is less effective than learning from the equivalent integrated sources. According to cognitive load theory, the impaired learning is caused by the working memory load imposed by the need to distribute attention between the sources and mentally integrate them. In this study, we directly tested whether the effect is caused by spatial separation per se. Spatial distance was varied in basic tasks involving pictures (Experiment 1)...
Abstract In many musical styles, vocalists manually gesture while they sing. Coupling between gesture kinematics and vocalization has been examined in speech contexts, but it is an open question how these couple in music making. We examine this in a corpus of South Indian, Karnatak vocal music that includes motion‐capture data. Through peak magnitude analysis (linear mixed regression) and continuous time‐series analyses (generalized additive modeling), we assessed whether trajectories around peaks in vertical...
We introduce applications of established methods in time-series and network analysis that we jointly apply here for the kinematic study of gesture ensembles. We define a gesture ensemble as the set of gestures produced during a discourse by a single person or a group of persons. Here we are interested in how gestures kinematically relate to one another. We use a bivariate method called dynamic time warping to assess how similar each gesture is to every other gesture in terms of their velocity profiles (as well as studying multivariate cases with speech amplitude envelope profiles). By...
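The dynamic time warping comparison described above can be sketched as follows: a standard DTW distance over two 1-D velocity profiles, where similarly shaped gestures align cheaply even when they differ in duration. The profiles below are toy values, and this is a textbook DTW implementation, not the authors' code.

```python
import math

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D velocity profiles."""
    n, m = len(a), len(b)
    # cost[i][j]: minimal cumulative cost of aligning a[:i] with b[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a sample in b
                                 cost[i][j - 1],      # skip a sample in a
                                 cost[i - 1][j - 1])  # match both samples
    return cost[n][m]

g1 = [0.0, 0.5, 1.0, 0.5, 0.0]                                 # short stroke
g2 = [0.0, 0.25, 0.5, 0.75, 1.0, 0.75, 0.5, 0.25, 0.0]        # same shape, slower
g3 = [1.0, 0.0, 1.0, 0.0, 1.0]                                 # dissimilar profile

# The stretched-but-similar gesture is closer to g1 than the dissimilar one.
print(dtw_distance(g1, g2) < dtw_distance(g1, g3))  # True
```

Running all pairwise DTW distances over an ensemble yields a distance matrix, which is what the network-analysis step can then treat as a weighted graph of gesture similarity.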
Abstract It is commonly understood that hand gesture and speech coordination in humans is culturally and cognitively acquired, rather than having a biological basis. Recently, however, the biomechanical and physical coupling of arm movements to vocalization has been studied in steady‐state monosyllabic utterances, where forces produced during gesturing are transferred onto the tensioned body, leading to changes in respiratory‐related activity and thereby affecting F0 and intensity. In the current experiment ( n = 37), we...
Purpose: This study investigated whether temporal coupling was present between lower limb motion rate and different speech tempi during different exercise intensities. We hypothesized that increased physical workload would increase cycling rate and that this could account for previous findings of increased speech tempo during exercise. We also investigated whether the choice of speech task (read vs. spontaneous speech) affected the results. Method: Forty-eight women who were ages 18–35 years participated. A within-participant design was used with fixed-order, counterbalanced...
Summary Previous research has established that gesture observation aids learning in children. The current study examined whether observation of different gestures (i.e. depictive and tracing gestures) differentially affected verbal and visual–spatial retention when learning a route and its street names. Specifically, we explored whether children ( n = 97) with lower visual working‐memory capacity benefited more from observing gestures as compared to children who score higher on these traits. To this end, 11‐ to 13‐year‐old children were presented with an instructional...
The gesture-speech physics theory suggests that there are biomechanical interactions of the voice with the whole body, driving speech to align fluctuations in loudness and F0 with upper-limb movement. This exploratory study offers a possible falsification of the theory, which predicts effects of movement on the voice as well as on respiration. We therefore investigate the co-movement of expiration. Seventeen participants were asked to produce a continuous exhalation for several seconds. After 3 s, they executed one of five within-subject...
Biological structures are defined by rigid elements, such as bones, and elastic elements, like muscles and membranes. Computer vision advances have enabled automatic tracking of moving animals' skeletal poses. Such developments provide insights into the complex time-varying dynamics of biological motion. Conversely, the soft tissues of organisms, such as the nose of elephant seals or the buccal sac of frogs, are poorly studied, and no computer vision methods have been proposed for them. This leaves major gaps in different areas of biology. In primatology, most...