
CBSU bibliography search



Spatiotemporal dissociation of perceptual and semantic processes in the ventral object processing stream
Authors:
Naci, L., Pulvermüller, F., Taylor, K., Shtyrov, Y., Hauk, O. & Tyler, L.
Reference:
NeuroImage 31, S178
Year of publication:
2006
CBU number:
6450
Abstract:
Objective: A central issue in cognitive neuroscience is the time-course with which sensory inputs are transformed and integrated into coherent conceptual representations. With respect to object processing, it has been suggested that auditory and visual object features are analyzed within hierarchically structured sensory processing streams, running from sensory-specific cortex to superior/inferior temporal cortex, and are integrated in antero-medial temporal regions [1][2]. In addition, inferior-frontal regions may constrain the hypothesis space for object recognition in a top-down fashion [3]. This EEG study aimed to: (i) determine the time-course of posterior occipital, anterior temporal, and frontal region involvement in cross-modal object processing; (ii) investigate cross-modal effects on early auditory/visual perceptual processes; and (iii) investigate early conceptual-semantic processes in cross-modal object processing.

Methods: High-density (128-channel) ERPs were recorded from nine healthy participants while they performed a congruency task on auditory (Au), visual (Vi) and audio-visual (XM) stimuli. The activation loci from an fMRI study [2] using the same stimuli and task were used to constrain the multiple-dipole model of the grand-averaged ERPs. In addition, BESA's (MEGIS Software GmbH) distributed source localization technique was used to assess the generator locations. To determine cross-modal effects, t-tests and repeated-measures ANOVAs were performed on time intervals centered at the signal power peaks in the grand root-mean-square (RMS) of the ERP difference wave [XM - (Au + Vi)] (a schematic sketch of this computation is given below).

Results & Discussion: Dipole modeling and minimum norm analyses showed significant and recurring cross-modal effects localized to left/right occipital-temporal, left/right superior temporal, left/right anterior temporal, and left inferior frontal sources, 0-200 ms from stimulus presentation. Cross-modal effects on early sensory processing appear as amplitude enhancement of early visual (P1) and auditory (N1, P2) ERP components in the cross-modal condition compared to each unimodal condition and to their sum. Cross-modal effects modulated by the semantic factor of stimulus congruency appear in the anterior temporal source from 160 ms onwards. Between 60-130 ms these effects are significantly stronger in the posterior occipital sources (Fig 1) than in the anterior temporal sources, whereas between 180-200 ms they become significantly stronger in the anterior temporal sources (Fig 2, Fig 3).

Conclusions: These results suggest that a recurrent interplay between bottom-up (sensory-specific cortex) and top-down (frontal, anterior temporal cortex) processes may underpin cross-modal object recognition. We show early cross-modal effects on perceptual (60 ms) and conceptual-semantic (160 ms) processes. Conceptual-semantic processes in anterior temporal regions appear to dominate object processing as early as 180-200 ms, whereas perceptual processes in occipital regions dominate as early as 60-130 ms. We discuss the implications of cross-modal congruency effects for hierarchical models of object processing.
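The difference-wave RMS analysis described in the Methods can be illustrated with a short, self-contained sketch. The NumPy example below uses randomly generated arrays in place of real grand-average ERPs; the array shapes, sampling rate, and ±20 ms window half-width are illustrative assumptions, not parameters reported in the abstract, and the study's actual peak-selection and statistical procedures may differ.

```python
import numpy as np

# Hypothetical grand-average ERPs, shape (n_channels, n_times), sampled at 500 Hz.
# In the study these would be the auditory (Au), visual (Vi) and audio-visual (XM)
# condition averages; random data stands in for real recordings here.
rng = np.random.default_rng(0)
n_channels, n_times, sfreq = 128, 400, 500.0
au = rng.standard_normal((n_channels, n_times))
vi = rng.standard_normal((n_channels, n_times))
xm = rng.standard_normal((n_channels, n_times))

# Cross-modal difference wave: XM - (Au + Vi).
diff = xm - (au + vi)

# Grand root-mean-square across channels at each time point.
rms = np.sqrt(np.mean(diff ** 2, axis=0))

# Center analysis windows (assumed +/- 20 ms here) on local peaks of the RMS curve;
# statistics (t-tests, ANOVAs) would then be run on data within these windows.
half_win = int(0.020 * sfreq)
peaks = [t for t in range(1, n_times - 1)
         if rms[t] > rms[t - 1] and rms[t] > rms[t + 1]]
windows = [(max(0, p - half_win), min(n_times, p + half_win)) for p in peaks]
```

Subtracting the summed unimodal responses from the cross-modal response isolates the non-additive (interaction) component of the ERP, which is why the statistical tests are centered on peaks of its RMS rather than on the raw condition waveforms.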

