Browsing by Subject "FACIAL EXPRESSIONS"

Now showing items 1-4 of 4
  • Quarto, Tiziana; Paparella, Isabella; De Tullio, Davide; Viscanti, Giovanna; Fazio, Leonardo; Taurisano, Paolo; Romano, Raffaella; Rampino, Antonio; Masellis, Rita; Popolizio, Teresa; Selvaggi, Pierluigi; Pergola, Giulio; Bertolino, Alessandro; Blasi, Giuseppe (2018)
    The brain functional mechanisms translating genetic risk into emotional symptoms in schizophrenia (SCZ) may include abnormal functional integration between areas key for emotion processing, such as the amygdala and the lateral prefrontal cortex (LPFC). Investigation of these mechanisms is complicated by the fact that emotion processing comprises different subcomponents, as well as by disease-associated state variables. Here, our aim was to investigate the relationship between risk for SCZ and effective connectivity between the amygdala and the LPFC during different subcomponents of emotion processing. We first characterized with dynamic causal modeling (DCM) physiological patterns of LPFC-amygdala effective connectivity in healthy controls (HC) during implicit and explicit emotion processing. We then compared DCM patterns in a subsample of HC, in patients with SCZ, and in healthy siblings of patients (SIB), matched for demographics. Finally, we investigated in HC the association of LPFC-amygdala effective connectivity with a genome-wide supported variant that increases genetic risk for SCZ and is possibly relevant to emotion processing (DRD2 rs2514218). In HC, we found that a "bottom-up" amygdala-to-LPFC pattern during implicit processing and a "top-down" LPFC-to-amygdala pattern during explicit processing were the most likely directional models of effective connectivity. In contrast, implicit emotion processing in SIB, SCZ, and HC homozygous for the SCZ risk rs2514218 C allele was associated with decreased probability for the "bottom-up" model and increased probability for the "top-down" model. These findings suggest that a task-specific anomaly in the directional flow of information, or disconnection, between the amygdala and the LPFC is a good candidate endophenotype of SCZ.
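In DCM, "most likely directional model" refers to Bayesian model selection. As a rough illustration of the final comparison step only, the minimal sketch below turns hypothetical log model evidences for the two directional models into posterior model probabilities under a uniform model prior (fixed-effects selection). The model names and numbers are invented for illustration and are not from the study.

```python
import numpy as np

# Hypothetical log model evidences for two directional DCMs
# (invented values, not results from the study)
log_evidence = {
    "bottom_up_amygdala_to_LPFC": -1180.0,
    "top_down_LPFC_to_amygdala": -1195.0,
}

logs = np.array(list(log_evidence.values()))
# Fixed-effects Bayesian model selection: the posterior probability of each
# model is the softmax of its log evidence (uniform prior over models)
post = np.exp(logs - logs.max())
post /= post.sum()
```

With these invented evidences, nearly all posterior mass falls on the first model; a log-evidence difference of a few units is already conventionally treated as strong evidence.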
  • Quarto, Tiziana; Blasi, Giuseppe; Pallesen, Karen Johanne; Bertolino, Alessandro; Brattico, Elvira (2014)
  • Lindeman, Marjaana; Koirikivi, Iivo; Lipsanen, Jari (2018)
    Research on empathy has increased rapidly during the last decades, but brief assessment methods are not easily available. Our aim was to develop a test of affective empathic reactions that would be simple to translate into different languages, easy to use in a variety of research settings, and able to capture empathic reactions at the moment they arise. We describe the development and validation of the Pictorial Empathy Test (PET) in three studies (Study 1, N = 91; Study 2, N = 2,789; and Study 3, N = 114). The PET comprises seven photographs of distressed individuals, and participants rate on a 5-point scale how emotionally moving they find each photograph. The results indicated that the PET displayed a unitary factor structure, high internal consistency, and good seven-month test-retest reliability. In addition, the results supported the convergent and discriminant validity of the test. These findings suggest that the PET is a useful addition to the prevailing methods for assessing affective empathy.
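Internal consistency of a short scale like the PET is commonly summarized with Cronbach's alpha. The sketch below is a minimal illustration, not the authors' analysis: the rating matrix is simulated, and the function simply implements the standard alpha formula for an (n_participants, n_items) matrix of 5-point ratings on seven items.

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for an (n_participants, n_items) rating matrix."""
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                          # number of items (7 for the PET)
    item_vars = ratings.var(axis=0, ddof=1)       # variance of each item
    total_var = ratings.sum(axis=1).var(ddof=1)   # variance of participants' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated 5-point ratings from 10 hypothetical participants on 7 photographs:
# a shared per-participant tendency plus small item-level noise
rng = np.random.default_rng(0)
base = rng.integers(1, 6, size=(10, 1))
noise = rng.integers(-1, 2, size=(10, 7))
sim = np.clip(base + noise, 1, 5)
alpha = cronbach_alpha(sim)
```

Because the simulated items share a common per-participant component, alpha comes out high; with perfectly consistent ratings it equals exactly 1.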
  • Muukkonen, Ilkka; Ölander, Kaisu; Numminen, Jussi Kustaa; Salmela, Viljami (2020)
    The temporal and spatial neural processing of faces has been investigated rigorously, but few studies have unified these dimensions to reveal the spatio-temporal dynamics postulated by models of face processing. We used support vector machine decoding and representational similarity analysis to combine information from different locations (fMRI), time windows (EEG), and theoretical models. By correlating representational dissimilarity matrices (RDMs) derived from multiple pairwise classifications of neural responses to different facial expressions (neutral, happy, fearful, angry), we found early EEG time windows (starting around 130 ms) to match fMRI data from primary visual cortex (V1), and later time windows (starting around 190 ms) to match data from lateral occipital cortex, the fusiform face complex, and the temporal-parietal-occipital junction (TPOJ). According to model comparisons, the EEG classification results were based more on low-level visual features than on expression intensities or categories. In fMRI, the model comparisons revealed a change along the processing hierarchy, from low-level visual feature coding in V1 to coding of expression intensity in the right TPOJ. The results highlight the importance of a multimodal approach for understanding the functional roles of different brain regions in face processing.
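The core RSA step described above, correlating RDMs from two modalities, reduces to a rank correlation of their unique off-diagonal entries. The sketch below uses small hand-made 4x4 RDMs over the four expression conditions; the dissimilarity values are invented for illustration, and Spearman correlation is computed directly as the Pearson correlation of ranks (the values are chosen tie-free, so simple argsort ranking is valid).

```python
import numpy as np

conditions = ["neutral", "happy", "fearful", "angry"]

# Hypothetical 4x4 representational dissimilarity matrices (RDMs):
# entries are invented pairwise dissimilarities between expression conditions.
eeg_rdm = np.array([
    [0.00, 0.60, 0.80, 0.70],
    [0.60, 0.00, 0.50, 0.65],
    [0.80, 0.50, 0.00, 0.40],
    [0.70, 0.65, 0.40, 0.00],
])
fmri_rdm = np.array([
    [0.00, 0.55, 0.90, 0.60],
    [0.55, 0.00, 0.45, 0.70],
    [0.90, 0.45, 0.00, 0.30],
    [0.60, 0.70, 0.30, 0.00],
])

def spearman(a, b):
    """Spearman correlation for tie-free vectors: Pearson correlation of ranks."""
    ranks_a = a.argsort().argsort().astype(float)
    ranks_b = b.argsort().argsort().astype(float)
    return np.corrcoef(ranks_a, ranks_b)[0, 1]

# Compare only the unique pairwise entries (upper triangle, diagonal excluded)
iu = np.triu_indices(len(conditions), k=1)
rho = spearman(eeg_rdm[iu], fmri_rdm[iu])
```

In a full analysis this correlation would be computed between each EEG time window's RDM and each region's fMRI RDM, yielding the time-by-region matching described in the abstract.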