
Cortex

Volume 60, November 2014, Pages 34-51

Special issue: Research report
Selective attention modulates high-frequency activity in the face-processing network

https://doi.org/10.1016/j.cortex.2014.06.006

Abstract

Face processing depends on the orchestrated activity of a large-scale neuronal network. Its activity can be modulated by attention as a function of task demands. However, it remains largely unknown whether voluntary, endogenous attention and reflexive, exogenous attention to facial expressions equally affect all regions of the face-processing network, and whether such effects primarily modify the strength of the neuronal response, its latency, its duration, or its spectral characteristics. We exploited the good temporal and spatial resolution of intracranial electroencephalography (iEEG) and recorded from depth electrodes to uncover the fast dynamics of emotional face processing. We investigated frequency-specific responses and event-related potentials (ERP) in the ventral occipito-temporal cortex (VOTC), ventral temporal cortex (VTC), anterior insula, orbitofrontal cortex (OFC), and amygdala when facial expressions were task-relevant or task-irrelevant. All investigated regions of interest (ROI) were clearly modulated by task demands and exhibited stronger changes in stimulus-induced gamma band activity (50–150 Hz) when facial expressions were task-relevant. The observed latencies demonstrate that activation is temporally coordinated across the network rather than proceeding serially along a processing hierarchy. Early and sustained responses to task-relevant faces in the VOTC and VTC corroborate their role in the core system of face processing, but such responses also occurred in the anterior insula. Strong attentional modulation in the OFC and amygdala (300 msec) suggests that the extended system of the face-processing network is only recruited if the task demands active face processing. Contrary to our expectation, we rarely observed differences between fearful and neutral faces. Our results demonstrate that activity in the face-processing network is susceptible to the deployment of selective attention.
Moreover, we show that endogenous attention operates throughout the face-processing network, and that its effects are reflected in frequency-specific changes in the gamma band.

Introduction

Emotionally and socially significant stimuli in our environment receive prioritized perceptual processing. This processing bias has been attributed to the engagement of reflexive, exogenous attention (Vuilleumier, 2005) and entails an adaptive advantage for the organism (Öhman & Mineka, 2001). Facial expressions are among the most emotionally and socially significant stimuli in the human environment because they signify intentions and emotional states of our conspecifics, making them essential for social communication. This has led to the hypothesis that a processing bias for emotional facial expressions is hard-wired into the human brain (Palermo & Rhodes, 2007).

Processing of faces in general, and of emotional facial expressions in particular, depends on the orchestrated activity of large-scale neuronal networks (Gobbini and Haxby, 2007, Haxby et al., 2000, Vuilleumier and Pourtois, 2007). A study using functional magnetic resonance imaging (fMRI) in humans identified a network of face-responsive regions involving the inferior occipital gyrus, the fusiform gyrus, the superior temporal sulcus, the amygdala, the hippocampus, the inferior frontal gyrus, and the orbitofrontal cortex (OFC) (Ishai, Schmidt, & Boesiger, 2005), confirming the importance of these regions for face processing. The visual perception of faces has been attributed to occipital and temporal regions including the inferior occipital, the fusiform, and the inferior temporal gyri (Kanwisher et al., 1997, Parvizi et al., 2012, Pourtois et al., 2010a, Tsuchiya et al., 2008). However, the face-processing network can be dynamically extended with regions recruited for the extraction of specific aspects of a face depending on the task or context at hand. Consequently, the terms “core system” and “extended system” have been coined to describe the networks involved in basic visual perception and in subsequent, context-related analysis of faces, respectively (Gobbini and Haxby, 2007, Haxby et al., 2000). Processing of facial expressions involves the core system and additional parts of the extended system such as the amygdala, the insula, and the OFC. Previous accounts ascribed a dominant role to the amygdala in processing especially fearful faces (Adolphs et al., 1994, Cornwell et al., 2008, Krolak-Salmon et al., 2004, Morris et al., 1996, Pourtois et al., 2010b, Vuilleumier et al., 2004).
More recent work showed that amygdala activity (1) is not limited to fearful facial expressions but can also be found with faces depicting neutral or happy expressions and (2) can be subsumed under processing stimulus relevance or significance (Adolphs, 2010, Canli et al., 2002, Fusar-Poli et al., 2009, Rutishauser et al., 2011, Sander et al., 2003). The OFC is involved in identification of facial expressions and their associated meaning (Adolphs, 2002, Rolls, 2004). The anterior portion of the insula has been associated with the perception of facial disgust (Fusar-Poli et al., 2009, Phillips et al., 1998) and salience detection (Menon & Uddin, 2010). Converging evidence comes from studies investigating non-human primates: face-selective clusters have been found in the temporal lobe, referred to as the anterior and middle face patch (Tsao et al., 2006, Tsao and Livingstone, 2008), in the frontal lobe (Tsao, Schweers, Moeller, & Freiwald, 2008), and in the amygdala (Gothard et al., 2007, Hoffman et al., 2007, Leonard et al., 1985). In summary, the ventral occipito-temporal cortex (VOTC), the amygdala, the OFC, and the anterior insula form a network that mediates both perceptual processing and detailed analysis of facial expressions for further guidance of behavior.

Although some studies show that emotional facial expressions capture attention automatically (Fenker et al., 2010, Vuilleumier, 2002), the activity of the face-processing network can be modulated by voluntary, endogenous attention such as task demands or the specific context at hand. For example, Monroe et al. (2013) reported larger amplitudes of the magnetic counterpart of the N170, a prominent event-related component reflecting face processing (Bentin, Allison, Puce, Perez, & McCarthy, 1996), for fearful than for happy or neutral faces in the fusiform gyrus but only when attention had to be directed to the faces' expression and not to their age. The authors concluded that a valence modulation in the fusiform gyrus is more likely under conditions of directed attention to facial expressions. Likewise, larger event-related potentials (ERP) in the amygdala were observed specifically for fearful faces in an intracranial electroencephalography (iEEG) study, but only when the patients had to pay attention to the facial expression and not to gender (Krolak-Salmon et al., 2004). Results from a meta-analysis of fMRI data further support the notion that directed attention to facial expression boosts activity within the core and extended face-processing network. Specifically, explicit compared to implicit processing of facial expressions was associated with stronger responses in the fusiform gyrus, the amygdala, and inferior frontal regions (Fusar-Poli et al., 2009). Moreover, attentional capture by emotion is not limited to visual stimuli, such as faces, and has been reported for auditory and audiovisual stimuli both of negative and positive valence especially in temporal and frontal brain regions (Grandjean et al., 2008, Kreifelts et al., 2007, Sander et al., 2005). In particular, results of a cross-modal dot-probe paradigm revealed that emotional prosody modulated visual target processing in occipital cortex around 100 msec (Brosch, Grandjean, Sander, & Scherer, 2009).

iEEG recordings allow the investigation of face processing with precise information on the temporal structure (e.g., latency, duration) of neuronal responses. This temporal information is obtained with high spatial precision, as electrodes are placed directly within targeted neuronal populations. Furthermore, the neuronal responses recorded with iEEG can be analyzed with respect to their spectral components, which reflect different processes. It has been suggested that dynamic interactions of cell assemblies, reflected in temporal synchronization of neuronal activity, provide indices of network interactions (Engel et al., 2001, Siegel et al., 2012). In the local field potential, gamma band activity (GBA; >30 Hz) has been related to local oscillatory activity (Donner & Siegel, 2011), population-level spiking activity (Lachaux et al., 2012, Manning et al., 2009, Ray and Maunsell, 2011), and the hemodynamic responses measured with fMRI (Lachaux et al., 2007, Logothetis et al., 2001). Directed attention reliably increases GBA and concomitantly decreases lower frequencies in the alpha (8–12 Hz) and beta (13–30 Hz) band (Fries et al., 2001, Jensen et al., 2007, Ossandón et al., 2012, Siegel et al., 2008). GBA has been related to the perception (Rodriguez et al., 1999) and structural encoding of faces (Gao et al., 2013, Zion-Golumbic and Bentin, 2007). Furthermore, processing emotional compared to neutral facial expressions elicited power changes in the delta (.5–4 Hz), theta (4–8 Hz), and gamma band (Balconi and Lucchiari, 2006, Balconi and Pozzoli, 2007). Studies using either iEEG or electrocorticography (ECoG) with subdural grids or strips have associated (emotional) face processing with specific ERP components (Allison et al., 1994, Allison et al., 1999, Halgren et al., 1994, Krolak-Salmon et al., 2004, McCarthy et al., 1999, Pourtois et al., 2010a, Pourtois et al., 2010b, Puce et al., 1999).
Few intracranial studies have investigated the modulation of frequency-specific neuronal responses in the face-processing network (Engell and McCarthy, 2010, Engell and McCarthy, 2011, Lachaux et al., 2005, Tsuchiya et al., 2008, Vidal et al., 2010), and only one research group has examined synchronization of GBA in response to neutral (Sato et al., 2012) and fearful facial expressions (Sato et al., 2011b). However, these experiments were confined to the amygdala and did not investigate attentional modulation. Here, we investigated whether directing attention towards or away from facial expressions is associated with fast changes of neuronal activity in the face-processing network.
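Band-limited amplitude measures such as GBA are commonly extracted from iEEG by band-pass filtering the signal and taking the envelope of its analytic signal (the Hilbert-transform approach compared with wavelet methods by Le Van Quyen et al., 2001). The following is a minimal sketch of that idea using an FFT-domain band-pass in NumPy; it is an illustration of the general technique, not the authors' actual analysis pipeline.

```python
import numpy as np

def band_envelope(x, fs, f_lo, f_hi):
    """Amplitude envelope of x restricted to [f_lo, f_hi] Hz,
    obtained via an FFT-domain band-pass and the analytic signal."""
    n = len(x)
    X = np.fft.fft(x)
    freqs = np.fft.fftfreq(n, d=1.0 / fs)
    # Keep only positive frequencies inside the band, doubled:
    # this yields the complex analytic signal of the band-limited data.
    H = np.zeros(n, dtype=complex)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    H[band] = 2.0 * X[band]
    analytic = np.fft.ifft(H)
    return np.abs(analytic)

# Toy check: a unit-amplitude 60 Hz sine sampled at 512 Hz lies inside
# the 50-150 Hz gamma band, so its gamma envelope is flat at ~1.0.
fs = 512
t = np.arange(fs) / fs
sig = np.sin(2 * np.pi * 60 * t)
env = band_envelope(sig, fs, 50, 150)
```

In practice the envelope would be computed per trial and per frequency band, then baseline-normalized and averaged to obtain stimulus-induced power changes.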

To address this question, we recorded from depth electrodes implanted in patients undergoing resective neurosurgical treatment for drug-resistant epilepsy in order to uncover the fast dynamics of emotional face processing. We investigated frequency-specific neuronal activity and ERPs in different regions of the face-processing network: the VOTC (including the posterior fusiform gyrus), the ventral temporal cortex (VTC; including the anterior inferior temporal gyrus), the anterior insula, the OFC, and the amygdala. We compared neuronal responses between faces and control stimuli (nonfaces) and between two facial expressions (fearful vs neutral) under two different detection tasks. Spatial attention was always directed toward centrally presented face and nonface stimuli; however, the attentional focus on facial expressions was manipulated by the task, which required focusing either on the facial expressions (explicit task) or on low-level features of the image (implicit task). We predicted that neuronal activity in the face-processing network would be modulated by two factors: (1) reflexive, exogenous attention driven by stimulus salience (face > nonface, fearful > neutral) and (2) voluntary, endogenous attention driven by task demands (explicit > implicit). We expected that task demands and facial expressions would modulate the increase and duration of GBA and the concomitant decreases of alpha-beta-band activity (ABBA). Furthermore, we investigated whether task demands and stimulus salience globally affect the face-processing network, or whether regions of the core and the extended system are recruited specifically.


Participants

We obtained intracranial recordings from ten right-handed patients with drug-resistant epilepsy (5 males; mean age ± standard deviation, M ± SD = 29.6 ± 6.4 years) who were evaluated for possible surgery at the Epilepsy Department of the Grenoble University Hospital (Grenoble, France). Table 1 summarizes the medical history, pathological information, medication at the time of the experiment, and resected tissue for each patient. Recording sites were solely determined according to clinical

Behavioral performance

The mean hit and false alarm rates (M ± SD) were 87.8% ± 13.8 and 1.2% ± 1.3 for the explicit task, and 96.1% ± 7.6 and .7% ± 1.7 for the implicit task. The sensitivity index d′ did not differ between tasks (explicit: M ± SD = 3.7 ± .9; implicit: M ± SD = 4.3 ± .6; t9 = −2.02, p = .074). Patients responded faster to red-tinted targets in the implicit task (M ± SD = 470.9 msec ± 39.7) than to happy face targets in the explicit task (M ± SD = 603.5 msec ± 70.9; t9 = 6.68, p < .001). These results
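The sensitivity index d′ follows from hit and false alarm rates via the standard signal-detection formula, d′ = z(hits) − z(false alarms), where z is the inverse of the standard normal cumulative distribution. A minimal sketch follows; the clipping of extreme rates is an assumption (the paper does not specify its correction), and applying the formula to group-mean rates need not reproduce the mean of the per-patient d′ values reported above.

```python
from statistics import NormalDist

def d_prime(hit_rate, fa_rate, eps=1e-4):
    """d' = z(hit rate) - z(false alarm rate)."""
    z = NormalDist().inv_cdf
    # Clip rates away from 0 and 1 so the z-transform stays finite
    # (the exact correction used in the paper is not specified).
    hit = min(max(hit_rate, eps), 1 - eps)
    fa = min(max(fa_rate, eps), 1 - eps)
    return z(hit) - z(fa)

# Group-mean rates from the explicit task: 87.8% hits, 1.2% false alarms
print(round(d_prime(0.878, 0.012), 2))
```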

Discussion

This study investigated whether directing attention towards or away from facial expressions in a detection task was associated with fast changes of neuronal activity in the face-processing network. Specifically, neuronal responses in the VOTC, VTC, anterior insula, OFC, and amygdala were compared when facial expressions were either task-relevant or task-irrelevant. We observed strong attentional modulation of neuronal activity during explicit face processing in all regions of the

Acknowledgments

We thank all the patients who participated in this study and the staff at the CHU Grenoble. Furthermore, we thank Tomás Ossandón and Juan R. Vidal for help in data analysis and Sarang S. Dalal for support in MRI coregistration. Grants from the German Research Foundation (SFB TRR58/B04, SFB936/A3, A.K.E.), the European Union (HEALTH-F2-2008-200728, A.K.E., J.P.L.; ERC-2010-AdG-269716, A.K.E), the G.A. Lienert Foundation (K.M.), the French National Research Agency (ANR-MLA ANR-10-BLAN-1409,

References (107)

  • K.L. Hoffman et al.

    Facial-expression and gaze-selective responses in the monkey amygdala

    Current Biology

    (2007)
  • N. Hoogenboom et al.

    Localizing human visual gamma-band activity in frequency, time and space

    NeuroImage

    (2006)
  • A. Ishai et al.

    Face perception is mediated by a distributed cortical network

    Brain Research Bulletin

    (2005)
  • O. Jensen et al.

    Human gamma-frequency oscillations associated with attention and memory

    Trends in Neurosciences

    (2007)
  • B. Kreifelts et al.

    Audiovisual integration of emotional signals in voice and face: an event-related fMRI study

    NeuroImage

    (2007)
  • P. Krolak-Salmon et al.

    Early amygdala reaction to fear spreading in occipital, temporal, and frontal cortex: a depth electrode ERP study in human

    Neuron

    (2004)
  • J.P. Lachaux et al.

    High-frequency neural activity and human cognition: past, present and possible future of intracranial EEG research

    Progress in Neurobiology

    (2012)
  • J.P. Lachaux et al.

    The many faces of the gamma band response to complex visual stimuli

    NeuroImage

    (2005)
  • J.P. Lachaux et al.

    Intracranial EEG and human brain mapping

    Journal of Physiology

    (2003)
  • M. Le Van Quyen et al.

    Comparison of Hilbert transform and wavelet methods for the analysis of neuronal synchrony

Journal of Neuroscience Methods

    (2001)
  • C.M. Leonard et al.

    Neurons in the amygdala of the monkey with responses selective for faces

    Behavioural Brain Research

    (1985)
  • H. Liu et al.

    Timing, timing, timing: fast decoding of object information from intracranial field potentials in human visual cortex

    Neuron

    (2009)
  • P.P. Mitra et al.

    Analysis of dynamic brain imaging data

    Biophysical Journal

    (1999)
  • R. Palermo et al.

    Are you always on my mind? A review of how face perception and attention interact

    Neuropsychologia

    (2007)
  • K.L. Phan et al.

    Functional neuroanatomy of emotion: a meta-analysis of emotion activation studies in PET and fMRI

    NeuroImage

    (2002)
  • E.T. Rolls

    The functions of the orbitofrontal cortex

    Brain and Cognition

    (2004)
  • U. Rutishauser et al.

    Single-unit responses selective for whole faces in the human amygdala

Current Biology

    (2011)
  • D. Sander et al.

    Emotion and attention interactions in social cognition: brain regions involved in processing anger prosody

    NeuroImage

    (2005)
  • W. Sato et al.

    Rapid amygdala gamma oscillations in response to fearful facial expressions

    Neuropsychologia

    (2011)
  • M. Siegel et al.

Neuronal synchronization along the dorsal visual pathway reflects the focus of spatial attention

    Neuron

    (2008)
  • C. Tallon-Baudry et al.

    Oscillatory gamma activity in humans and its role in object representation

    Trends in Cognitive Sciences

    (1999)
  • N. Tzourio-Mazoyer et al.

    Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain

    NeuroImage

    (2002)
  • P. Vuilleumier

    How brains beware: neural mechanisms of emotional attention

    Trends in Cognitive Sciences

    (2005)
  • R. Adolphs

    Recognizing emotion from facial expressions: psychological and neurological mechanisms

    Behavioral and Cognitive Neuroscience Reviews

    (2002)
  • R. Adolphs

    What does the amygdala contribute to social cognition?

    Annals of the New York Academy of Sciences

    (2010)
  • R. Adolphs et al.

    Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala

    Nature

    (1994)
  • T. Allison et al.

    Human extrastriate visual cortex and the perception of faces, words, numbers, and colors

    Cerebral Cortex

    (1994)
  • T. Allison et al.

    Electrophysiological studies of human face perception. I: potentials generated in occipitotemporal cortex by face and non-face stimuli

    Cerebral Cortex

    (1999)
  • A.K. Anderson et al.

    Neural correlates of the automatic processing of threat facial signals

    The Journal of Neuroscience

    (2003)
  • M. Balconi et al.

    Event-related oscillations (EROs) and event-related potentials (ERPs) comparison in facial expression recognition

    Journal of Neuropsychology

    (2007)
  • M. Bar et al.

    Top-down facilitation of visual recognition

    Proceedings of the National Academy of Sciences of the United States of America

    (2006)
  • Y. Benjamini et al.

    Controlling the false discovery rate – a practical and powerful approach to multiple testing

    Journal of the Royal Statistical Society – Series B: Methodological

    (1995)
  • S. Bentin et al.

    Electrophysiological studies of face perception in humans

    Journal of Cognitive Neuroscience

    (1996)
  • T. Brosch et al.

    Cross-modal emotional attention: emotional voices modulate early stages of visual processing

    Journal of Cognitive Neuroscience

    (2009)
  • T. Canli et al.

    Amygdala response to happy faces as a function of extraversion

Science

    (2002)
  • W.A. Cunningham et al.

    Motivational salience: amygdala tuning from traits, needs, values, and goals

    Current Directions in Psychological Science

    (2012)
  • A.K. Engel et al.

    Dynamic predictions: oscillations and synchrony in top–down processing

    Nature Reviews Neuroscience

    (2001)
  • A.D. Engell et al.

    Selective attention modulates face-specific induced gamma oscillations recorded from ventral occipitotemporal cortex

    The Journal of Neuroscience

    (2010)
  • A.D. Engell et al.

    The relationship of gamma oscillations and face-specific ERPs recorded subdurally from occipitotemporal cortex

    Cerebral Cortex

    (2011)
  • A.C. Evans et al.

    3D statistical neuroanatomical models from 305 MRI volumes

¹ These authors contributed equally to this work.
