What We Do

During brain development, maturation and experience work jointly to provide optimal neural representations of the environment to cope with future needs. By studying typically developed individuals, sensory deprivation, and sensory restoration, the SEED group explores the mechanisms underlying the functional and structural development, organization, and representations of the senses. Investigations of sensory neural systems deprived of their typical input, as in congenital blindness or deafness, reveal the complexity of neural systems and their adaptations: on the one hand, neural representations in specific cortical areas can be functionally preserved despite the lack of a sense; on the other hand, intra- and cross-modal plasticity, that is, changes in neural responses within spared and deprived sensory modalities, discloses plasticity effects that can lead to compensatory abilities. The model of sensory restoration, as in sight or hearing recovery, has proved effective for testing the existence of sensitive or critical periods, during which specific experience must be available for the typical development of neural circuits, or, in striking contrast, for testing the existence of functions that develop independently of early input. All these approaches rely on understanding the development and functioning of sensory systems within a multisensory framework.

Research is conducted at the interface between cognitive neuroscience, psychology, and biological engineering, applying multiple methods such as computational neuroscience, electrical neuroimaging, functional and structural magnetic resonance imaging, and psychophysics to elucidate complex neural dynamics.

ONGOING PROJECTS


Who We Are

DAVIDE BOTTARI
Principal Investigator, Assistant Professor, PhD
Scholar, ResearchGate
davide.bottari@imtlucca.it
ALESSANDRA FEDERICI
MARTA FANTONI
NICOLÒ CASTELLANI
ROBERTA LASALA
MARTINA BATTISTA
FRANCESCA COLLESEI
PhD Student
francesca.collesei@imtlucca.it

Past Members

Martina Berto, Dila Suay, Alice Martinelli, Evgenia Bednaya

What We Publish

Brain Encoding of Naturalistic, Continuous, and Unpredictable Tactile Events

Nicolò Castellani, Alessandra Federici, Marta Fantoni, Emiliano Ricciardi, Francesca Garbarini and Davide Bottari (2024). 

eNeuro.

https://doi.org/10.1523/ENEURO.0238-24.2024


Studies employing EEG to measure somatosensory responses have typically been optimized to compute event-related potentials in response to discrete events. However, tactile interactions involve continuous processing of nonstationary inputs that change in location, duration, and intensity. To fill this gap, this study aims to demonstrate the possibility of measuring the neural tracking of continuous and unpredictable tactile information. Twenty-seven young adults (15 females) were continuously and passively stimulated with a random series of gentle brushes on single fingers of each hand, which were covered from view. Thus, tactile stimulations were unique for each participant and each stimulated finger. An encoding model measured the degree of synchronization between brain activity and the continuous tactile input, generating a temporal response function (TRF). Brain topographies associated with the encoding of each finger stimulation showed a contralateral response at central sensors starting at 50 ms and peaking at ∼140 ms of lag, followed by a bilateral response at ∼240 ms. A series of analyses highlighted that a reliable tactile TRF emerged after just 3 min of stimulation. Strikingly, topographical patterns of the TRF allowed discriminating digit lateralization across hands and digit representation within each hand. Our results demonstrated for the first time the possibility of using EEG to measure the neural tracking of a naturalistic, continuous, and unpredictable stimulation in the somatosensory domain. Crucially, this approach allows the study of brain activity following individualized, idiosyncratic tactile events to the fingers.
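
The encoding approach referenced in this abstract is commonly implemented along the lines of the mTRF framework: the continuous stimulus feature is regressed onto the EEG at a range of time lags via ridge regression, and the fitted weights form the temporal response function. Below is a minimal numpy sketch of that idea, assuming a single stimulus feature and a single EEG channel (the actual analysis used multichannel data and cross-validated regularization); the variable names and toy data are illustrative only.

```python
import numpy as np

def lagged_design(stim, lags):
    """Design matrix whose columns are time-shifted copies of the stimulus."""
    n = len(stim)
    X = np.zeros((n, len(lags)))
    for j, lag in enumerate(lags):
        if lag >= 0:
            X[lag:, j] = stim[:n - lag]
        else:
            X[:n + lag, j] = stim[-lag:]
    return X

def fit_trf(stim, eeg, fs, tmin=-0.1, tmax=0.4, alpha=1.0):
    """Estimate a temporal response function by ridge regression (forward/encoding model)."""
    lags = np.arange(int(tmin * fs), int(tmax * fs) + 1)
    X = lagged_design(stim, lags)
    # ridge solution: w = (X'X + alpha*I)^-1 X'y
    w = np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ eeg)
    return lags / fs, w

# toy example: the "EEG" is a delayed, noisy copy of a random tactile envelope
fs = 100
stim = np.abs(np.random.randn(fs * 180))               # 3 min of continuous tactile input
eeg = np.roll(stim, int(0.14 * fs)) + np.random.randn(len(stim))
lag_times, trf = fit_trf(stim, eeg, fs)
print("TRF peaks at a lag of %.0f ms" % (1000 * lag_times[np.argmax(trf)]))
```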

The impact of face masks on face-to-face neural tracking of speech: auditory and visual obstacles

M. Fantoni, A. Federici, I. Camponogara, G. Handjaras, A. Martinelli, E. Bednaya, E. Ricciardi, F. Pavani, D. Bottari (2024). 

bioRxiv.

https://doi.org/10.1101/2024.02.12.577414


Face masks provide fundamental protection against the transmission of respiratory viruses but hamper communication. We estimated the auditory and visual obstacles generated by face masks on communication by measuring the neural tracking of face-to-face speech. To this end, we recorded the EEG while participants were exposed to naturalistic audio-visual speech, embedded in multi-talker noise, in three contexts: (i) no mask (audio-visual information was fully available), (ii) virtual mask (occluded lips, but intact audio), and (iii) real mask (occluded lips and degraded audio). The neural tracking of lip movements and of the sound envelope of speech was measured through backward modeling, that is, by reconstructing stimulus properties from neural activity. Behaviorally, face masks increased listening (phonological) errors in speech content retrieval and perceived listening difficulty. At the neural level, we observed that the occlusion of the mouth abolished lip tracking and dampened neural tracking of the speech envelope at the earliest processing stages. Degraded acoustic information due to face mask filtering instead altered neural tracking at later processing stages. Finally, a consistent link emerged between the increase in perceived listening difficulty and the drop in reconstruction performance of the speech envelope when attending to a speaker wearing a face mask. Results clearly dissociated the visual and auditory impacts of face masks on the face-to-face neural tracking of speech. While face masks hampered the ability to predict and integrate audio-visual speech, the auditory filter generated by face masks impacted the neural processing stages typically associated with auditory selective attention. The link between perceived difficulty and the drop in neural tracking provided evidence of a major impact of face masks on the metacognitive levels subtending speech processing.
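
Backward modeling works in the opposite direction from the encoding approach above: a decoder is trained to reconstruct a stimulus feature (for example, the speech envelope or lip aperture) from all EEG channels at multiple lags, and reconstruction accuracy is quantified as the correlation between the reconstructed and actual feature. A rough numpy sketch under simplified assumptions (a single train/test split, one fixed regularization value, toy data); the parameters and helper names are illustrative, not the authors' exact pipeline.

```python
import numpy as np

def reconstruct_stimulus(eeg, stim, fs, tmax=0.25, alpha=1e2):
    """Backward (decoding) model: map multichannel EEG to a stimulus feature,
    then report reconstruction accuracy on held-out data.
    eeg: array (n_times, n_channels); stim: array (n_times,)."""
    n, n_ch = eeg.shape
    lags = np.arange(0, int(tmax * fs) + 1)
    # predictors are time-lagged copies of every EEG channel
    X = np.hstack([np.vstack([np.roll(eeg[:, c], -lag) for lag in lags]).T
                   for c in range(n_ch)])
    half = n // 2                                     # naive split: first half train, second half test
    Xtr, Xte, ytr, yte = X[:half], X[half:], stim[:half], stim[half:]
    w = np.linalg.solve(Xtr.T @ Xtr + alpha * np.eye(X.shape[1]), Xtr.T @ ytr)
    recon = Xte @ w
    return np.corrcoef(recon, yte)[0, 1]              # reconstruction accuracy (Pearson r)

# toy data: the envelope leaks into each "EEG" channel with a channel-specific delay plus noise
fs, dur, n_ch = 64, 120, 8
env = np.abs(np.random.randn(fs * dur))
eeg = np.stack([np.roll(env, 5 + c) + 2 * np.random.randn(len(env)) for c in range(n_ch)], axis=1)
print("reconstruction r = %.2f" % reconstruct_stimulus(eeg, env, fs))
```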

Distinguishing Fine Structure and Summary Representation of Sound Textures from Neural Activity

Martina Berto, Emiliano Ricciardi, Pietro Pietrini, Nathan Weisz & Davide Bottari (2023). 

eNeuro.

https://doi.org/10.1523/ENEURO.0026-23.2023


The auditory system relies on both local and summary representations; acoustic local features exceeding system constraints are compacted into a set of summary statistics. Such compression is pivotal for sound-object recognition. Here, we assessed whether computations subtending local and statistical representations of sounds could be distinguished at the neural level. A computational auditory model was employed to extract auditory statistics from natural sound textures (i.e., fire, rain) and to generate synthetic exemplars where local and statistical properties were controlled. Twenty-four human participants were passively exposed to auditory streams while electroencephalography (EEG) was recorded. Each stream could consist of short, medium, or long sounds to vary the amount of acoustic information. Short and long sounds were expected to engage local or summary statistics representations, respectively. Data revealed a clear dissociation. Compared with summary-based ones, auditory-evoked responses based on local information were selectively greater in magnitude for short sounds. Opposite patterns emerged for longer sounds. Neural oscillations revealed that local features and summary statistics rely on neural activity occurring at different temporal scales, faster (beta) or slower (theta-alpha). These dissociations emerged automatically, without explicit engagement in a discrimination task. Overall, this study demonstrates that the auditory system developed distinct coding mechanisms to discriminate changes in the acoustic environment based on fine structure and summary representations.
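
The computational auditory model referenced here follows the texture-analysis logic in which a sound is passed through a bank of cochlear-like filters and summary statistics (marginal moments and correlations of the band envelopes) are measured over time. The sketch below illustrates only the statistics-extraction step, with a crude band-pass filter bank standing in for a full cochlear model; the band edges and statistics chosen are assumptions for illustration.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert
from scipy.stats import skew

def texture_statistics(sound, fs, bands=((100, 300), (300, 900), (900, 2700), (2700, 7500))):
    """Summary statistics of band envelopes (a toy stand-in for a cochlear model)."""
    stats, envs = {}, []
    for lo, hi in bands:
        sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
        env = np.abs(hilbert(sosfiltfilt(sos, sound)))      # band envelope
        envs.append(env)
        stats[(lo, hi)] = dict(mean=env.mean(), var=env.var(), skew=skew(env))
    # cross-band envelope correlations capture texture-specific comodulation
    stats["band_corr"] = np.corrcoef(np.vstack(envs))
    return stats

fs = 16000
noise = np.random.randn(fs * 2)                              # 2 s of noise as a toy "texture"
print(texture_statistics(noise, fs)[(100, 300)])
```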

Altered neural oscillations underlying visuospatial processing in cerebral visual impairment.

Federici, A., Bennett, C. R., Bauer, C. M., Manley, C. E., Ricciardi, E., Bottari, D., & Merabet, L. B. (2023). 

Brain Communications.

https://doi.org/10.1093/braincomms/fcad232



Visuospatial processing deficits are commonly observed in individuals with cerebral visual impairment, even in cases where visual acuity and visual field functions are intact. Cerebral visual impairment is a brain-based visual disorder associated with the maldevelopment of central visual pathways and structures. However, the neurophysiological basis underlying higher-order perceptual impairments in this condition has not been clearly identified, which in turn poses limits on developing rehabilitative interventions. Using combined eye tracking and EEG recordings, we assessed the profile and performance of visual search on a naturalistic virtual reality-based task. Participants with cerebral visual impairment and controls with neurotypical development were instructed to search for, locate, and fixate on a specific target placed among surrounding distractors at two levels of task difficulty. We analysed evoked (phase-locked) and induced (non-phase-locked) components of broadband (4-55 Hz) neural oscillations to uncover the neurophysiological basis of visuospatial processing. We found that visual search performance in cerebral visual impairment was impaired compared to controls (as indexed by success rate, reaction time and gaze error). Analysis of neural oscillations revealed markedly reduced early-onset evoked theta [4-6 Hz] activity (within 0.5 s) regardless of task difficulty. Moreover, while induced alpha activity increased with task difficulty in controls, this modulation was absent in the cerebral visual impairment group, identifying a potential neural correlate of deficits in visual search and distractor suppression. Finally, cerebral visual impairment participants also showed a sustained induced gamma response [30-45 Hz]. We conclude that impaired visual search performance in cerebral visual impairment is associated with substantial alterations across a wide range of neural oscillation frequencies. This includes both evoked and induced components, suggesting the involvement of feedforward and feedback processing as well as local and distributed levels of neural processing.
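
The evoked versus induced distinction used here is typically computed by subtracting the trial-averaged response from each single trial before time-frequency decomposition: the average captures the phase-locked (evoked) part, and what remains in the single trials is the non-phase-locked (induced) part. A small illustrative sketch of that separation, assuming already-epoched single-channel data and using a plain spectrogram in place of the wavelet decomposition employed in the paper.

```python
import numpy as np
from scipy.signal import spectrogram

def evoked_induced_power(epochs, fs):
    """Split trial power into evoked (phase-locked) and induced (non-phase-locked) parts.
    epochs: array of shape (n_trials, n_times), one channel for simplicity."""
    erp = epochs.mean(axis=0)                        # evoked = trial average
    residual = epochs - erp                          # remove the phase-locked part from each trial
    f, t, evoked_pow = spectrogram(erp, fs, nperseg=64, noverlap=48)
    induced_pow = np.mean(
        [spectrogram(tr, fs, nperseg=64, noverlap=48)[2] for tr in residual], axis=0)
    return f, t, evoked_pow, induced_pow

fs = 256
trials = np.random.randn(40, fs)                     # 40 one-second trials of noise
f, t, ev, ind = evoked_induced_power(trials, fs)
print(ev.shape, ind.shape)                           # (frequencies, time bins) for each component
```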

Crossmodal plasticity following short-term monocular deprivation.

Federici, A., Bernardi, G., Senna, I., Fantoni, M., Ernst, M. O., Ricciardi, E., & Bottari, D. (2023). 

NeuroImage.

https://doi.org/10.1016/j.neuroimage.2023.120141

A brief period of monocular deprivation (MD) induces short-term plasticity of the adult visual system. Whether MD elicits neural changes beyond visual processing is still unclear. Here, we assessed the specific impact of MD on neural correlates of multisensory processes. Neural oscillations associated with visual and audio-visual processing were measured for both the deprived and the non-deprived eye. Results revealed that MD changed neural activities associated with visual and multisensory processes in an eye-specific manner. Selectively for the deprived eye, alpha synchronization was reduced within the first 150 ms of visual processing. Conversely, gamma activity was enhanced in response to audio-visual events only for the non-deprived eye within 100–300 ms after stimulus onset. The analysis of gamma responses to unisensory auditory events revealed that MD elicited a crossmodal upweighting for the non-deprived eye. Distributed source modeling suggested that the right parietal cortex played a major role in the neural effects induced by MD. Finally, visual and audio-visual processing alterations emerged for the induced component of the neural oscillations, indicating a prominent role of feedback connectivity. Results reveal the causal impact of MD on both unisensory (visual and auditory) and multisensory (audio-visual) processes and their frequency-specific profiles. These findings support a model in which MD increases excitability to visual events for the deprived eye and to audio-visual and auditory input for the non-deprived eye.

A modality-independent proto-organization of human multisensory areas.

Setti, F., Handjaras, G., Bottari, D., Leo, A., Diano, M., Bruno, V., Tinti, C., Cecchetti, L., Garbarini, F., Pietrini, P., Ricciardi, E. (2023). 

Nature Human Behaviour.

www.nature.com/articles/s41562-022-01507-3

The processing of multisensory information is based upon the capacity of brain regions, such as the superior temporal cortex, to combine information across modalities. However, it is still unclear whether the representation of coherent auditory and visual events requires any prior audiovisual experience to develop and function. Here we measured brain synchronization during the presentation of an audiovisual, audio-only or video-only version of the same narrative in distinct groups of sensory-deprived (congenitally blind and deaf) and typically developed individuals. Intersubject correlation analysis revealed that the superior temporal cortex was synchronized across auditory and visual conditions, even in sensory-deprived individuals who lack any audiovisual experience. This synchronization was primarily mediated by low-level perceptual features, and relied on a similar modality-independent topographical organization of slow temporal dynamics. The human superior temporal cortex is naturally endowed with a functional scaffolding to yield a common representation across multisensory events.
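
Intersubject correlation analysis, used in this study, quantifies how similarly a brain region responds across individuals exposed to the same stimulus: each subject's regional time course is correlated with the average time course of the remaining subjects (or of another group, e.g., across auditory and visual conditions). A hedged leave-one-out sketch, assuming regional time courses have already been extracted; the toy data are illustrative.

```python
import numpy as np

def intersubject_correlation(timecourses):
    """Leave-one-out ISC: correlate each subject's time course with the mean of the others.
    timecourses: array of shape (n_subjects, n_timepoints) for one brain region."""
    n_sub = timecourses.shape[0]
    isc = np.empty(n_sub)
    for s in range(n_sub):
        others = np.delete(timecourses, s, axis=0).mean(axis=0)
        isc[s] = np.corrcoef(timecourses[s], others)[0, 1]
    return isc

# toy data: a shared stimulus-driven signal plus subject-specific noise
shared = np.sin(np.linspace(0, 20 * np.pi, 1000))
data = shared + 0.8 * np.random.randn(12, 1000)
print("mean ISC = %.2f" % intersubject_correlation(data).mean())
```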

 Neuroplasticity following cochlear implants. 

Pavani, F., & Bottari, D. (2022). 

In Handbook of Clinical Neurology (Vol. 187, pp. 89-108). Elsevier. 


Delayed Auditory Brainstem Responses (ABR) in children after sight recovery

Martinelli, A., Bianchi, B., Fratini, C., Handjaras, G., Fantoni, M., Trabalzini, F., Polizzi, S., Caputo, R., Bottari, D. (2021). 

Neuropsychologia.

https://doi.org/10.1016/j.neuropsychologia.2021.108089

Studies in non-human animal models have revealed that, in early development, the onset of visual input gates the critical period closure of some auditory functions. The study of rare individuals whose sight was restored after a period of congenital blindness offers the rare opportunity to assess whether early visual input is a prerequisite for the full development of auditory functions in humans as well. Here, we investigated whether a few months of delayed visual onset would affect the development of Auditory Brainstem Responses (ABRs). ABRs are widely used in clinical practice to assess both the functionality and the development of the subcortical auditory pathway, and they provide reliable data at the individual level. We collected Auditory Brainstem Responses from two case studies, young children (both under 5 years of age) who experienced transient visual deprivation from birth due to congenital bilateral dense cataracts (BC) and who acquired sight at about two months of age. As controls, we tested 41 children with typical development (sighted controls, SC), as well as two children who were treated (at about two months of age) for congenital monocular cataracts (MC). The SC group data served to predict, at the individual level, the wave latencies of each BC and MC participant. Statistics were performed at both the single-subject and the group level on the latencies of the main ABR waves (I, III, V and SN10). Results revealed delayed response latencies for both BC children compared with the SC group, starting from wave III. Conversely, no difference emerged between MC children and the SC group. These findings suggest that when the onset of patterned visual input is delayed, the functional development of the subcortical auditory pathway lags behind typical developmental trajectories. Ultimately, the results favor the presence of a crossmodal sensitive period in the human subcortical auditory system.
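
Comparing a single case against a normative control group, as described above, is often done with a modified t-test in the style of Crawford and Howell, which treats the case as a sample of one and accounts for the size of the control sample; the abstract does not name the exact test used, so the sketch below is only an illustration of this general single-case approach applied to hypothetical wave latencies.

```python
import numpy as np
from scipy import stats

def case_vs_controls(case_value, control_values):
    """Crawford-Howell style modified t-test: does one case differ from a control sample?"""
    controls = np.asarray(control_values, dtype=float)
    n = controls.size
    t = (case_value - controls.mean()) / (controls.std(ddof=1) * np.sqrt((n + 1) / n))
    p = 2 * stats.t.sf(abs(t), df=n - 1)             # two-tailed p value
    return t, p

# hypothetical wave V latencies (ms): 41 sighted controls vs one sight-recovery case
controls = np.random.normal(5.7, 0.25, size=41)
t, p = case_vs_controls(6.4, controls)
print("t = %.2f, p = %.3f" % (t, p))
```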


Interactions between auditory statistics processing and visual experience emerge only in late development 

Berto, M., Ricciardi, E., Pietrini, P., Bottari, D. (2021). 

iScience.

https://doi.org/10.1016/j.isci.2021.103383

The auditory system relies on local and global representations to discriminate sounds. This study investigated whether vision influences the development and functioning of these fundamental sound computations. We employed a computational approach to control the statistical properties embedded in sounds and tested samples of sighted controls (SC), congenitally blind (CB), and late-onset blind (LB) individuals in two experiments. In experiment 1, performance relied on local feature analysis; in experiment 2, performance benefited from computing global representations. In both experiments, SC and CB performance remarkably overlapped. Conversely, LB performed systematically worse than the other groups when relying on local features, with no alterations in global representations. Results suggest that the auditory computations tested here develop independently from vision. The efficiency of local auditory processing can be hampered if sight becomes unavailable later in life, supporting the existence of an audio-visual interplay for the processing of auditory details, which emerges only in late development.

Oscillatory signatures of Repetition Suppression and Novelty Detection reveal altered induced visual responses in early deafness

Bednaya, E., Pavani, F., Ricciardi, E., Pietrini, P., Bottari, D. (2021). 

Cortex.

https://doi.org/10.1016/j.cortex.2021.05.017

The ability to differentiate between repeated and novel events represents a fundamental property of the visual system. Neural responses are typically reduced upon stimulus repetition, a phenomenon called Repetition Suppression (RS). On the contrary, following a novel visual stimulus, the neural response is generally enhanced, a phenomenon referred to as Novelty Detection (ND). Here, we aimed to investigate the impact of early deafness on the oscillatory signatures of RS and ND brain responses. To this aim, electrophysiological data were acquired in early deaf and hearing control individuals during processing of repeated and novel visual events unattended by participants. By studying evoked and induced oscillatory brain activities, as well as inter-trial phase coherence, we linked response modulations to feedback and/or feedforward processes. Results revealed selective experience-dependent changes on both RS and ND mechanisms. Compared to hearing controls, early deaf individuals displayed: (i) greater attenuation of the response following stimulus repetition, selectively in the induced theta-band (4–7 Hz); (ii) reduced desynchronization following the onset of novel visual stimuli, in the induced alpha and beta bands (8–12 and 13–25 Hz); (iii) comparable modulation of evoked responses and inter-trial phase coherence. The selectivity of the effects in the induced responses parallels findings observed in the auditory cortex of deaf animal models following intracochlear electric stimulation. The present results support the idea that early deafness alters induced oscillatory activity and the functional tuning of basic visual processing.
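
Inter-trial phase coherence, analysed here alongside evoked and induced activity, measures how consistently the oscillatory phase aligns across trials at a given frequency: it is the length of the mean unit phase vector, ranging from 0 (random phase) to 1 (perfect phase locking). A minimal sketch, assuming epoched single-channel data and a single frequency band of interest; band limits and data are illustrative.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def itc(epochs, fs, band=(4, 7)):
    """Inter-trial phase coherence in a frequency band.
    epochs: array (n_trials, n_times); returns ITC per time point (0..1)."""
    sos = butter(4, band, btype="band", fs=fs, output="sos")
    analytic = hilbert(sosfiltfilt(sos, epochs, axis=1), axis=1)
    phases = np.angle(analytic)
    return np.abs(np.mean(np.exp(1j * phases), axis=0))     # length of the mean phase vector

fs = 250
trials = np.random.randn(60, fs)                             # 60 one-second trials of noise
print("max ITC for noise: %.2f" % itc(trials, fs).max())     # stays well below 1 for random phases
```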

Three factors to characterize plastic potential transitions in the visual system.

Bottari, D., & Berto, M. (2021). Neuroscience & Biobehavioral Reviews, 126, 444-446. https://doi.org/10.1016/j.neubiorev.2021.03.035
A comprehensive understanding of brain-environment interactions is elusive even at the sensory level, as neural plasticity waxes and wanes across the lifespan. Temporary and permanent visual deprivations remain pivotal approaches for studying the degree of experience-dependent plasticity of sensory functions. Natural models and experimental manipulations of visual experience have contributed to uncovering some of the guiding principles that characterize transitions of plastic potentials in the human visual system. The existing literature on the neural plasticity associated with visual systems has been extensively discussed in two recent review articles (Röder et al., 2020; Castaldi et al., 2020), which provided an overview of different models of study and methods of investigation, gathering insights on both developing and adult brains. Here, we propose a framework of three main factors to characterize how the driving forces shaping visual circuits mutate, both quantitatively and qualitatively, between early development and adulthood.

EEG frequency-tagging demonstrates increased left hemispheric involvement and crossmodal plasticity for face processing in congenitally deaf signers 

Bottari, D., Bednaya, E., Dormal, G., Villwock, A., Dzhelyova, M., Grin, K., ... & Röder, B. (2020). NeuroImage, 223, 117315. https://doi.org/10.1016/j.neuroimage.2020.117315 
In humans, face processing relies on a network of brain regions predominantly in the right occipito-temporal cortex. We tested congenitally deaf (CD) signers and matched hearing controls (HC) to investigate the experience dependence of the cortical organization of face processing. Specifically, we used EEG frequency-tagging to evaluate: (1) Face-Object Categorization, (2) Emotional Facial-Expression Discrimination and (3) Individual Face Discrimination. The EEG was recorded while visual stimuli were presented at a rate of 6 Hz, with oddball stimuli at a rate of 1.2 Hz. In all three experiments and in both groups, significant face discriminative responses were found. Face-Object Categorization was associated with a relatively increased involvement of the left hemisphere in CD individuals compared to HC individuals. A similar trend was observed for Emotional Facial-Expression Discrimination but not for Individual Face Discrimination. Source reconstruction suggested a greater activation of the auditory cortices in the CD group for Individual Face Discrimination. These findings suggest that the experience dependence of the relative contribution of the two hemispheres, as well as crossmodal plasticity, vary with different aspects of face processing.
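
In the frequency-tagging design described above, responses are read out in the frequency domain: the EEG amplitude spectrum shows peaks at the base presentation rate (6 Hz) and, when the oddball category is discriminated, at the oddball rate (1.2 Hz) and its harmonics; the response is usually expressed relative to the amplitude in neighbouring frequency bins. A rough sketch of that readout, assuming a single channel of continuous EEG; the bin-selection parameters and toy signal are illustrative.

```python
import numpy as np

def tagged_snr(eeg, fs, target_freq, n_neighbors=10, skip=1):
    """Amplitude-spectrum SNR at a tagged frequency: target bin vs surrounding bins."""
    amp = np.abs(np.fft.rfft(eeg)) / len(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - target_freq))
    neighbors = np.r_[idx - skip - n_neighbors: idx - skip,
                      idx + skip + 1: idx + skip + 1 + n_neighbors]
    return amp[idx] / amp[neighbors].mean()

# toy signal: strong 6 Hz base response, weaker 1.2 Hz oddball response, plus noise
fs, dur = 250, 60
t = np.arange(fs * dur) / fs
eeg = np.sin(2 * np.pi * 6 * t) + 0.3 * np.sin(2 * np.pi * 1.2 * t) + np.random.randn(len(t))
print("SNR at 6 Hz: %.1f, at 1.2 Hz: %.1f" % (tagged_snr(eeg, fs, 6.0), tagged_snr(eeg, fs, 1.2)))
```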

The sensory-deprived brain as a unique tool to understand brain development and function

Ricciardi, E., Bottari, D., Ptito, M., Röder, B., & Pietrini, P. (2020). Neuroscience & Biobehavioral Reviews, 108, 78-82. https://doi.org/10.1016/j.neubiorev.2019.10.017 
On October 11th–13th 2018, the second edition of "The Blind Brain Workshop" was held in Lucca (Italy), gathering most of the leading worldwide experts in the study of the sensory-deprived brain. The aim of the workshop was to tackle, from multiple and different perspectives, the current conceptual and methodological challenges on the topic and to understand how perceptual experience sculpts the brain during development, as well as in adulthood. Altogether, the contributions of this three-day workshop emphasized that the current understanding of the structural and functional organization, as well as the development, of the brain has been significantly promoted by studies on the consequences of sensory deprivation in both humans and animals. Nevertheless, by providing a unique opportunity for a direct comparison of different sensory-deprivation models, the workshop uncovered open aspects in blindness, deafness and even somatosensory deprivation research. Suggestions for a substantial rethinking were postulated. The event additionally highlighted the role of early sensory experiences for functional development. In particular, research on sensory restoration has provided first evidence for the role of experience in the typical development of different neural systems.
See the whole Special Issue 'Rethinking the sensory-deprived brain: hints from the Blind Brain Workshop 2018' on Neuroscience and Biobehavioral Reviews

Some of our recent talks

Alessandra Federici - Talk: "Neural plasticity induced by different degrees of perturbation in auditory and visual sensory systems" at Trinity College Institute of Neuroscience (TCIN), Trinity College Dublin, October 2023

Alessandra Federici - Talk: "Neural tracking of continuous speech in cochlear-implanted children", Associazione Italiana Psicologia (AIP), Lucca (IT), September 2023

Alessandra Federici - Talk: "Crossmodal plasticity following short-term monocular deprivation", Associazione Italiana Psicologia (AIP), Lucca (IT), September 2023

Martina Berto, Chiara Battaglini, Nicolò Castellani, Pietro Pietrini, Nathan Weisz, Emiliano Ricciardi, Davide Bottari - Talk: "Interactions between auditory statistics processing and visual experience", Associazione Italiana Psicologia (AIP), Lucca (IT), 2023

Davide Bottari - Talk: "Experience-dependent brain plasticity revealed by temporary and permanent sensory deprivation", Milab Bicocca, March 2023

Martina Berto, Emiliano Ricciardi, Pietro Pietrini, Davide Bottari - Talk: "Interactions between auditory statistics processing and visual experience emerge only in late development", IMRF 2022, Ulm

Our Collaborations




Contacts

Twitter: @SEED_IMTLucca

E-mail:  seedlabimt@gmail.com