
Multimodal Language in Aphasia

It is clear that there is a relationship between spoken language and some co-speech gestures, namely those that imagistically evoke characteristics of their referents. Understanding this relationship may be key to answering basic questions about the nature of the meaning-based representations and processes engaged in communication.

It is also clear that there is a relationship between speech and mouth movements. Mouth movements necessarily accompany speech and provide sensory information (visemes) that supports speech processing. It has long been known that seeing a speaker aids auditory speech perception and lexical recognition.

We study how speakers, including aphasic and apraxic patients, integrate information from speech, gesture and mouth movements, in order to understand whether and how these cues can be used in the neurorehabilitation of aphasia. Work with brain-damaged individuals is carried out in collaboration with Drs Laurel Buxbaum and Myrna Schwartz at the Moss Rehabilitation Research Institute in Philadelphia.


Key references

Vigliocco, G., Krason, A., Stoll, H., Monti, A., & Buxbaum, L. (2020). Multimodal comprehension in left hemisphere stroke patients. Cortex.