SRPP
Beyond reaction time: Articulatory evidence of a perception-production link in speech using the Stimulus-Response Compatibility paradigm

Takayuki Nagamine (Department of Speech Hearing and Phonetic Sciences, University College London)
20 February 2026, 14h00-15h30

This talk will introduce our ongoing project investigating the link between perception and production in speech. In the Stimulus-Response Compatibility (SRC) paradigm, participants are typically prompted to produce a target syllable while being presented with either a congruent or an incongruent distractor. Responses tend to be slower in incongruent trials (the covert imitation effect), reflecting competition between perception-driven and goal-driven motor plans. The short response-distractor time lag in the SRC task design makes it well suited to studying how the motor system is engaged by speech perception during speech planning. Our aim is to obtain finer-grained insights into the nature of the perception-production link using electromagnetic articulography (EMA).
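As a rough illustration of how the congruency (covert imitation) effect is typically quantified in such designs, the sketch below compares mean reaction times between congruent and incongruent trials. The file and column names are hypothetical and are not taken from the project described in the talk.

```python
# Minimal sketch of quantifying the covert imitation (congruency) effect on RT.
# Assumes a hypothetical trial-level table trials.csv with columns
# "participant", "prompt", "condition" ("congruent"/"incongruent"), and "rt_ms".
import pandas as pd

trials = pd.read_csv("trials.csv")

# Mean RT per participant, prompt syllable, and congruency condition.
cell_means = (
    trials.groupby(["participant", "prompt", "condition"])["rt_ms"]
    .mean()
    .unstack("condition")
)

# Covert imitation effect: RT slowdown in incongruent relative to congruent trials.
cell_means["congruency_effect_ms"] = (
    cell_means["incongruent"] - cell_means["congruent"]
)
print(cell_means.groupby("prompt")["congruency_effect_ms"].describe())
```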

The discussion will draw on preliminary analyses of a subset of data from ten L1 British English speakers, using /ɹa/ and /va/ as prompt and distractor syllables. Reaction time (RT) analysis based on acoustic data shows a clear covert imitation effect for /ɹa/ but not for /va/. The timing of maximal displacement of the tongue tip (TT) for /ɹa/ and the lower lip (LL) for /va/ followed a similar pattern. Time-varying position trajectories and tangential velocity profiles, however, show evidence of TT gestural intrusion in /va/ productions in the incongruent trials (i.e., with the distractor /ɹa/). Such a between-condition difference in TT activity, despite the lack of a clear congruency effect in the RT measurement, might result from a greater degree of TT activation during speech planning driven by perception of the distractor stimulus, suggesting that motor patterns activated purely by observation may in part be executed. The implications of the behavioural results will be discussed in the light of articulatory complexity, multimodal speech perception, and cognitive sensorimotor theories.
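For readers unfamiliar with these articulatory measures, the following sketch shows one way a tangential velocity profile and the time of maximal displacement could be derived from an EMA sensor trajectory (e.g., the tongue tip). The sampling rate, file name, and array layout are assumptions for illustration, not a description of the actual analysis pipeline.

```python
# Sketch: tangential velocity and timing of maximal displacement from an
# EMA sensor trajectory. Sampling rate and file layout are assumed values.
import numpy as np

fs = 250.0                              # assumed EMA sampling rate in Hz
# pos: (n_samples, 2) array of sensor positions in mm (e.g., x and z coordinates)
pos = np.loadtxt("tt_trajectory.txt")   # hypothetical file name

# Component velocities by numerical differentiation, then the tangential
# (speed) profile as the Euclidean norm of the velocity vector.
vel = np.gradient(pos, 1.0 / fs, axis=0)
tangential_velocity = np.linalg.norm(vel, axis=1)

# Displacement from the trial-initial position; its peak gives the timing of
# maximal displacement used as an articulatory counterpart to acoustic RT.
displacement = np.linalg.norm(pos - pos[0], axis=1)
t_max_displacement = np.argmax(displacement) / fs
print(f"Maximal displacement at {t_max_displacement * 1000:.1f} ms")
```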
