What we do
The API group aims to explore the interplay between mind and body, focusing on interoception processes and bodily representations in the social and affective domains. Through psychophysiological measurements (e.g., skin conductance, heart rate, body temperature) and behavioral techniques, we investigate how emotions influence physiological responses and bodily sensations.
Who we are
Principal Investigator
Assistant Professor (RTD-A), PhD
Scholar, ResearchGate, Twitter
Giada is interested in understanding the interactions between mind and body in shaping affective experiences.
Samuele Pinna
Research Fellow, Psychology
Samuele is interested in understanding interoception through affective and cognitive components according to recent psychological and neuroscientific theories.
What we publish
Dissecting abstract, modality-specific and experience-dependent coding of affect in the human brain
Lettieri, Handjaras, Cappello, Setti, Bottari, Bruno, Diano, Leo, Tinti, Garbarini, Pietrini, Ricciardi, Cecchetti
Science Advances, 2024. DOI: 10.1126/sciadv.adk6840
ABSTRACT: Emotion and perception are tightly intertwined, as affective experiences often arise from the appraisal of sensory information. Nonetheless, whether the brain encodes emotional instances using a sensory-specific code or in a more abstract manner is unclear. Here, we answer this question by measuring the association between emotion ratings collected during a unisensory or multisensory presentation of a full-length movie and brain activity recorded in typically developed, congenitally blind and congenitally deaf participants. Emotional instances are encoded in a vast network encompassing sensory, prefrontal, and temporal cortices. Within this network, the ventromedial prefrontal cortex stores a categorical representation of emotion independent of modality and previous sensory experience, and the posterior superior temporal cortex maps the valence dimension using an abstract code. Sensory experience more than modality affects how the brain organizes emotional information outside supramodal regions, suggesting the existence of a scaffold for the representation of emotional states where sensory inputs during development shape its functioning.
Default and Control networks connectivity dynamics track the stream of affect at multiple timescales
Lettieri, Handjaras, Setti, Cappello, Bruno, Diano, Leo, Ricciardi, Pietrini, Cecchetti
SCAN, 2021. DOI: 10.1093/scan/nsab112
ABSTRACT: In everyday life the stream of affect results from the interaction between past experiences, expectations, and the unfolding of events. How the brain represents the relationship between time and affect has hardly been explored, as it requires modeling the complexity of everyday life in the laboratory setting. Movies condense into hours a multitude of emotional responses, synchronized across subjects and characterized by temporal dynamics alike real-world experiences. Here, we use time-varying intersubject brain synchronization and real-time behavioral reports to test whether connectivity dynamics track changes in affect during movie watching. Results show that polarity and intensity of experiences relate to connectivity of the default mode and control networks and converge in the right temporo-parietal cortex. We validate these results in two experiments including four independent samples, two movies, and alternative analysis workflows. Lastly, we reveal chronotopic connectivity maps within temporo-parietal and prefrontal cortex, where adjacent areas preferentially encode affect at specific timescales.
Emotionotopy in the human right temporo-parietal cortex
Lettieri, Handjaras, Ricciardi, Leo, Papale, Betta, Pietrini, Cecchetti
Nature Communications, 2019. DOI: 10.1038/s41467-019-13599-z
ABSTRACT: Humans use emotions to decipher complex cascades of internal events. However, which mechanisms link descriptions of affective states to brain activity is unclear, with evidence supporting either local or distributed processing. A biologically favorable alternative is provided by the notion of gradient, which postulates the isomorphism between functional representations of stimulus features and cortical distance. Here, we use fMRI activity evoked by an emotionally charged movie and continuous ratings of the perceived emotion intensity to reveal the topographic organization of affective states. Results show that three orthogonal and spatially overlapping gradients encode the polarity, complexity and intensity of emotional experiences in right temporo-parietal territories. The spatial arrangement of these gradients allows the brain to map a variety of affective states within a single patch of cortex. As this organization resembles how sensory regions represent psychophysical properties (e.g., retinotopy), we propose emotionotopy as a principle of emotion coding.