Statistical pattern recognition reveals shared neural signatures for displaying and recognizing specific facial expressions

dc.contributor.author: Volynets Sofia
dc.contributor.author: Smirnov Dmitry
dc.contributor.author: Saarimäki Heini
dc.contributor.author: Nummenmaa Lauri
dc.contributor.organization: Turku PET Centre
dc.contributor.organization: Psychology
dc.contributor.organization: tyks, varha
dc.contributor.organization-code: 1.2.246.10.2458963.20.14646305228
dc.contributor.organization-code: 1.2.246.10.2458963.20.15586825505
dc.converis.publication-id: 51055376
dc.converis.url: https://research.utu.fi/converis/portal/Publication/51055376
dc.date.accessioned: 2022-10-27T11:56:14Z
dc.date.available: 2022-10-27T11:56:14Z
dc.description.abstract: Human neuroimaging and behavioural studies suggest that somatomotor 'mirroring' of seen facial expressions may support their recognition. Here we show that viewing a specific facial expression triggers the neural representation of that same expression in the observer's brain. Twelve healthy female volunteers underwent two separate fMRI sessions: one in which they observed and another in which they displayed three types of facial expressions (joy, anger and disgust). A pattern classifier based on Bayesian logistic regression was trained to classify the facial expressions (i) within modality (trained and tested on data recorded while observing, or while displaying, the expressions) and (ii) between modalities (trained on data recorded while displaying the expressions and tested on data recorded while observing them). Cross-modal classification was performed in two ways: with and without functional realignment of the data across the observing and displaying conditions. All expressions could be classified accurately both within and across modalities. The brain regions contributing most to cross-modal classification accuracy included the primary motor and somatosensory cortices. Functional realignment yielded only minor gains in cross-modal classification accuracy for most of the examined ROIs, but substantial improvement in the occipito-ventral components of the core system for facial expression recognition. Altogether, these results support the embodied model of emotion recognition and show that expression-specific somatomotor neural signatures could support facial expression recognition.
dc.format.pagerange: 803-813
dc.identifier.eissn: 1749-5024
dc.identifier.jour-issn: 1749-5016
dc.identifier.olddbid: 172929
dc.identifier.oldhandle: 10024/156023
dc.identifier.uri: https://www.utupub.fi/handle/11111/30726
dc.identifier.urn: URN:NBN:fi-fe2021042822020
dc.language.iso: en
dc.okm.affiliatedauthor: Nummenmaa, Lauri
dc.okm.discipline: 3126 Surgery, anesthesiology, intensive care, radiology
dc.okm.discipline: 515 Psychology
dc.okm.internationalcopublication: not an international co-publication
dc.okm.internationality: International publication
dc.okm.type: A1 Scientific Article
dc.publisher: OXFORD UNIV PRESS
dc.publisher.country: United Kingdom
dc.publisher.country-code: GB
dc.relation.doi: 10.1093/scan/nsaa110
dc.relation.ispartofjournal: Social Cognitive and Affective Neuroscience
dc.relation.issue: 8
dc.relation.volume: 15
dc.source.identifier: https://www.utupub.fi/handle/10024/156023
dc.title: Statistical pattern recognition reveals shared neural signatures for displaying and recognizing specific facial expressions
dc.year.issued: 2020
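The cross-modal classification scheme described in the abstract — training a classifier on activity patterns recorded while participants displayed expressions, then testing it on patterns recorded while they observed expressions — can be sketched roughly as below. This is a minimal illustration, not the authors' pipeline: the data are synthetic, all variable names are invented, and scikit-learn's L2-regularised `LogisticRegression` stands in for the Bayesian logistic-regression classifier used in the study.

```python
# Illustrative sketch of cross-modal pattern classification.
# Synthetic "voxel" patterns replace real fMRI data; a shared
# expression-specific signal is embedded in both modalities.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_trials, n_voxels = 60, 200
labels = rng.integers(0, 3, size=n_trials)      # 0 = joy, 1 = anger, 2 = disgust

# One prototype pattern per expression, plus modality-specific noise
prototypes = rng.normal(size=(3, n_voxels))
display_data = prototypes[labels] + rng.normal(size=(n_trials, n_voxels))
observe_data = prototypes[labels] + rng.normal(size=(n_trials, n_voxels))

# Cross-modal scheme: train on "displaying" runs, test on "observing" runs
clf = LogisticRegression(max_iter=1000)
clf.fit(display_data, labels)
acc = clf.score(observe_data, labels)
print(f"cross-modal accuracy: {acc:.2f} (chance = 0.33)")
```

Because the two modalities here share the same voxel space by construction, no functional realignment step is needed; in the study, realignment across the observing/displaying conditions was an additional, optional stage before cross-modal testing.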

Files

Name: nsaa110.pdf
Size: 4.82 MB
Format: Adobe Portable Document Format
Description: Publisher's PDF