AUTHOR=Zhang Yixin , Chen Xi , Chen Si , Meng Yuzhe , Lee Albert TITLE=Visual-auditory perception of prosodic focus in Japanese by native and non-native speakers JOURNAL=Frontiers in Human Neuroscience VOLUME=Volume 17 - 2023 YEAR=2023 URL=https://www.frontiersin.org/journals/human-neuroscience/articles/10.3389/fnhum.2023.1237395 DOI=10.3389/fnhum.2023.1237395 ISSN=1662-5161 ABSTRACT=Speech communication is multi-sensory in nature. Seeing a speaker's head and face movements may significantly influence the listeners' speech processing, but visual-auditory integration in the perception of prosodic focus is less well investigated, especially from a cross-linguistic perspective. In the present study, thirty native Tokyo Japanese speakers and thirty Cantonese-speaking Japanese learners who had passed the Japanese-Language Proficiency Test at level N2 or N3 were asked to judge the naturalness of 28 question-answer pairs made up of broad-focus-eliciting questions and three-word answers carrying broad focus, or contrastive or non-contrastive narrow focus on the middle object words. Question-answer pairs were presented in two sensory modalities, auditory-only and visual-auditory, in two separate experimental sessions. Both the Japanese and Cantonese groups showed weak integration of visual cues in the judgement of naturalness. The visual-auditory modality significantly influenced Japanese participants' perception only when the questions and answers were mismatched, and when the answers carried non-contrastive narrow focus, the visual cues impeded rather than facilitated their judgement. Likewise, the influences of specific visual cues, such as eyebrow displacement or head movements, on both Japanese and Cantonese participants' responses were significant only when the questions and answers were mismatched.
While Japanese participants consistently relied on the left eyebrow for focus perception, the Cantonese participants referred to head movements more often. Overall, the present findings indicate that the integration of visual cues in the perception of focus may be language-specific rather than universal, adding to our understanding of multisensory speech perception.