Perception of co-speech gestures in aphasic patients: A visual exploration study during the observation of dyadic conversations.

Preisig, Basil; Eggenberger, Noëmi; Zito, Giuseppe Angelo; Vanbellingen, Tim; Schumacher, Rahel; Hopfner, Simone; Nyffeler, Thomas; Gutbrod, Klemens; Annoni, Jean-Marie; Bohlhalter, Stephan; Müri, René Martin (2015). Perception of co-speech gestures in aphasic patients: A visual exploration study during the observation of dyadic conversations. Cortex, 64, pp. 157-168. Elsevier. doi:10.1016/j.cortex.2014.10.013

BACKGROUND

Co-speech gestures are part of nonverbal communication during conversations. They either support the verbal message or provide the interlocutor with additional information. As nonverbal cues, they also support the cooperative process of turn-taking. In the present study, we investigated the influence of co-speech gestures on the perception of dyadic dialogue in aphasic patients. In particular, we analysed the impact of co-speech gestures on gaze direction (towards the speaker or the listener) and on the fixation of body parts. We hypothesized that aphasic patients, whose verbal comprehension is impaired, adapt their visual exploration strategies accordingly.

METHODS

Sixteen aphasic patients and 23 healthy control subjects participated in the study. Visual exploration behaviour was measured with a contact-free infrared eye-tracker while subjects watched videos of spontaneous dialogues between two individuals. Cumulative fixation duration and mean fixation duration were calculated for the factors co-speech gesture (present vs. absent), gaze direction (towards the speaker or the listener), and region of interest (ROI: hands, face, or body). The sketch below illustrates how these two dependent measures relate to the raw fixation data.
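As an illustration of the two dependent measures, the following is a minimal sketch in Python, assuming a hypothetical tabular fixation log with one row per fixation; the column names (subject, gesture, gaze_direction, roi, duration_ms) are illustrative and not taken from the study's actual data format.

```python
# Minimal sketch of the dependent measures described above, assuming a
# hypothetical fixation log with one row per fixation. Column names are
# illustrative only, not the study's actual variables.
import pandas as pd

fixations = pd.DataFrame({
    "subject":        ["P01", "P01", "P01", "C01", "C01"],
    "gesture":        ["present", "present", "absent", "present", "absent"],
    "gaze_direction": ["speaker", "speaker", "listener", "speaker", "speaker"],
    "roi":            ["face", "hands", "body", "face", "face"],
    "duration_ms":    [320, 180, 250, 410, 290],
})

cells = ["subject", "gesture", "gaze_direction", "roi"]

# Cumulative fixation duration: total viewing time per subject and condition cell.
cumulative = fixations.groupby(cells)["duration_ms"].sum().rename("cumulative_fixation_ms")

# Mean fixation duration: average length of individual fixations in the same cells.
mean_duration = fixations.groupby(cells)["duration_ms"].mean().rename("mean_fixation_ms")

print(pd.concat([cumulative, mean_duration], axis=1))
```

Cumulative fixation duration thus indexes how much attention a region receives overall, whereas mean fixation duration reflects how long individual fixations dwell on it.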

RESULTS

Both aphasic patients and healthy controls mainly fixated the speaker's face. We found a significant co-speech gesture × ROI interaction, indicating that the presence of a co-speech gesture drew subjects' gaze towards the speaker. Furthermore, a significant gaze direction × ROI × group interaction revealed that aphasic patients showed a shorter cumulative fixation duration on the speaker's face than healthy controls.

CONCLUSION

Co-speech gestures guide the observer's attention towards the speaker, the source of semantic input. We discuss whether an underlying semantic processing deficit or a deficit in integrating audio-visual information may cause aphasic patients to explore the speaker's face less.

Item Type:

Journal Article (Original Article)

Division/Institute:

04 Faculty of Medicine > University Psychiatric Services > University Hospital of Geriatric Psychiatry and Psychotherapy
04 Faculty of Medicine > Department of Head Organs and Neurology (DKNS) > Clinic of Neurology
10 Strategic Research Centers > ARTORG Center for Biomedical Engineering Research > ARTORG Center - Gerontechnology and Rehabilitation
10 Strategic Research Centers > Center for Cognition, Learning and Memory (CCLM)

Graduate School:

Graduate School for Health Sciences (GHS)

UniBE Contributor:

Preisig, Basil, Eggenberger, Noëmi, Zito, Giuseppe Angelo, Vanbellingen, Tim, Schumacher, Rahel, Hopfner, Simone, Nyffeler, Thomas, Gutbrod, Klemens, Bohlhalter, Stephan, Müri, René Martin

Subjects:

600 Technology > 610 Medicine & health

ISSN:

0010-9452

Publisher:

Elsevier

Language:

English

Submitter:

Valentina Rossetti

Date Deposited:

18 Feb 2015 10:30

Last Modified:

05 Dec 2022 14:40

Publisher DOI:

10.1016/j.cortex.2014.10.013

PubMed ID:

25461716

Uncontrolled Keywords:

Aphasia, Apraxia, Dialogue, Eye movements, Gestures, Visual exploration

BORIS DOI:

10.7892/boris.63304

URI:

https://boris.unibe.ch/id/eprint/63304
