<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2014.01498</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research Article</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Cortical response of the ventral attention network to unattended angry facial expressions: an EEG source analysis study</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name><surname>Inuggi</surname> <given-names>Alberto</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="http://community.frontiersin.org/people/u/131615"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Sassi</surname> <given-names>Federica</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<uri xlink:href="http://community.frontiersin.org/people/u/163325"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Castillo</surname> <given-names>Alejandro</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<uri xlink:href="http://community.frontiersin.org/people/u/156800"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Campoy</surname> <given-names>Guillermo</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Leocani</surname><given-names>Letizia</given-names></name>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
<uri xlink:href="http://community.frontiersin.org/people/u/75117"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Garc&#x000ED;a Santos</surname> <given-names>Jos&#x000E9; M.</given-names></name>
<xref ref-type="aff" rid="aff4"><sup>4</sup></xref>
<uri xlink:href="http://community.frontiersin.org/people/u/132839"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name><surname>Fuentes</surname> <given-names>Luis J.</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="author-notes" rid="fn002"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://community.frontiersin.org/people/u/11380"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Basque Center on Cognition, Brain and Language</institution> <country>San Sebasti&#x000E1;n, Spain</country></aff>
<aff id="aff2"><sup>2</sup><institution>Departamento de Psicolog&#x000ED;a B&#x000E1;sica y Metodolog&#x000ED;a, University of Murcia</institution> <country>Murcia, Spain</country></aff>
<aff id="aff3"><sup>3</sup><institution>Institute of Experimental Neurology, L&#x02019;Istituto di Ricovero e Cura a Carattere Scientifico San Raffaele</institution> <country>Milan, Italy</country></aff>
<aff id="aff4"><sup>4</sup><institution>Servicio de Radiolog&#x000ED;a, Hospital Morales Meseguer</institution> <country>Murcia, Spain</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: <italic>Alan J. Pegna, Geneva University Hospitals, Switzerland</italic></p></fn>
<fn fn-type="edited-by"><p>Reviewed by: <italic>Marzia Del Zotto, University of Geneva, Switzerland; Angela Gosling, Bournemouth University, UK</italic></p></fn>
<fn fn-type="corresp" id="fn002"><p>&#x0002A;Correspondence: <italic>Luis J. Fuentes, Departamento de Psicolog&#x000ED;a B&#x000E1;sica y Metodolog&#x000ED;a, University of Murcia, Campus Espinardo, 30100 Murcia, Spain e-mail: <email>lfuentes@um.es</email></italic></p></fn>
<fn fn-type="other" id="fn001"><p>This article was submitted to Emotion Science, a section of the journal Frontiers in Psychology.</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>19</day>
<month>12</month>
<year>2014</year>
</pub-date>
<pub-date pub-type="collection">
<year>2014</year>
</pub-date>
<volume>5</volume>
<elocation-id>1498</elocation-id>
<history>
<date date-type="received">
<day>22</day>
<month>11</month>
<year>2014</year>
</date>
<date date-type="accepted">
<day>04</day>
<month>12</month>
<year>2014</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2014 Inuggi, Sassi, Castillo, Campoy, Leocani, Garc&#x000ED;a Santos and Fuentes.</copyright-statement>
<copyright-year>2014</copyright-year>
<license license-type="open-access" xlink:href="http://creativecommons.org/licenses/by/4.0/"><p> This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p><bold>Introduction:</bold> We used an affective prime task composed of emotional (happy, angry, and neutral) prime faces and target words with either positive or negative valence. By asking subjects to attend to either the faces&#x02019; emotional expression or to the glasses&#x02019; shape, we assessed whether angry facial expressions were processed when they were unattended and task-irrelevant.</p>
<p><bold>Methods:</bold> We conducted a distributed source analysis on the corresponding event-related potentials focused on the early activity of face processing and attention networks&#x02019; related areas. We also evaluated the magnitude of the affective priming effect.</p>
<p><bold>Results:</bold> We observed a reduction of occipitotemporal areas&#x02019; (BA37) activation to unattended compared to attended faces and a modulation of primary visual areas&#x02019; activity lateralization. The latter was more right lateralized for attended than for unattended faces, and emotional faces were more right lateralized than neutral ones only in the former condition. Affective priming disappeared when emotional expressions of prime faces were ignored. Moreover, an increased activation in the right temporo&#x02013;parietal junction (TPJ), but not in the intraparietal sulcus, was observed only for unattended angry facial expressions at &#x0223C;170 ms after face presentation.</p>
<p><bold>Conclusion:</bold> We suggest that attentional resources affect the early processing in visual and occipito-temporal areas, irrespective of the faces&#x02019; threatening content. The disappearance of the affective priming effect suggests that when subjects were asked to focus on glasses&#x02019; shape, attentional resources were not available to process the facial emotional expression, even though emotion-relevant and emotion-irrelevant features of the face were presented in the same position. On the other hand, unattended angry faces evoked a pre-attentive TPJ activity, which most likely represents a bottom&#x02013;up trigger that signals their high behavioral relevance, although it is unrelated to task demands.</p>
</abstract>
<kwd-group>
<kwd>ventral attentional network</kwd>
<kwd>temporo&#x02013;parietal junction</kwd>
<kwd>EEG source analysis</kwd>
<kwd>threatening facial expressions</kwd>
<kwd>attention modulation</kwd>
</kwd-group>
<counts>
<fig-count count="7"/>
<table-count count="2"/>
<equation-count count="0"/>
<ref-count count="81"/>
<page-count count="13"/>
<word-count count="0"/>
</counts>
</article-meta>
</front>
<body>
<sec>
<title>INTRODUCTION</title>
<p>Emotional events play a crucial role in how humans interact with one another and adapt to changing environments. To foster survival, it is essential that threatening stimuli originating from other people or from the environment be processed rapidly and efficiently. Ample evidence shows that threatening information can be processed automatically and independently of attention or attentional resources (<xref ref-type="bibr" rid="B71">Stenberg et al., 1995</xref>; <xref ref-type="bibr" rid="B78">Vuilleumier et al., 2001</xref>; for reviews, see <xref ref-type="bibr" rid="B13">Compton, 2003</xref>; <xref ref-type="bibr" rid="B77">Vuilleumier, 2005</xref>). Moreover, this processing can occur even without conscious perception (for a recent review, see <xref ref-type="bibr" rid="B73">Tamietto and de Gelder, 2010</xref>).</p>
<p>One common stimulus used to demonstrate how threatening information can be prioritized and processed efficiently is the fearful facial expression. Several studies using different paradigms have shown that even when the emotional content of a stimulus is task-irrelevant, it captures attention and interferes with the relevant task (<xref ref-type="bibr" rid="B50">Okon-Singer et al., 2007</xref>; <xref ref-type="bibr" rid="B29">Hart et al., 2010</xref>), delays disengagement of attention (<xref ref-type="bibr" rid="B24">Georgiou et al., 2005</xref>), is detected more easily than a neutral stimulus (<xref ref-type="bibr" rid="B28">Hansen and Hansen, 1988</xref>; <xref ref-type="bibr" rid="B3">Anderson, 2005</xref>; <xref ref-type="bibr" rid="B11">Calvo et al., 2006</xref>), and is better detected as a T2 in the attentional blink paradigm than a neutral T2 (<xref ref-type="bibr" rid="B3">Anderson, 2005</xref>). Further evidence for the automatic processing of emotional expressions comes from studies that explicitly manipulated the focus of attention by asking subjects to either attend to or ignore facial stimuli [e.g., <xref ref-type="bibr" rid="B78">Vuilleumier et al., 2001</xref>; <xref ref-type="bibr" rid="B4">Anderson et al., 2003</xref>; <xref ref-type="bibr" rid="B17">Eimer et al., 2003</xref>; see <xref ref-type="bibr" rid="B16">Eimer and Holmes, 2007</xref>, for a review of event-related potential (ERP) studies]. For instance, <xref ref-type="bibr" rid="B78">Vuilleumier et al. (2001)</xref> presented two faces and two houses arranged parafoveally along the vertical or horizontal axis. Subjects had to compare either the faces (faces attended, houses unattended) or the houses (houses attended, faces unattended), and fearful faces were compared with neutral faces. Activation in the amygdala, the hallmark of emotional processing, was higher for fearful than for neutral faces. Notably, amygdala activation did not differ according to whether participants attended to the faces or to the houses.</p>
<p>Recent studies, however, have challenged the idea that emotional information can be processed without a sufficient amount of attentional resources (<xref ref-type="bibr" rid="B54">Pessoa et al., 2002</xref>, <xref ref-type="bibr" rid="B55">2005</xref>; <xref ref-type="bibr" rid="B32">Holmes et al., 2003</xref>; <xref ref-type="bibr" rid="B48">Ochsner and Gross, 2005</xref>; <xref ref-type="bibr" rid="B50">Okon-Singer et al., 2007</xref>; <xref ref-type="bibr" rid="B69">Silvert et al., 2007</xref>; <xref ref-type="bibr" rid="B64">Sassi et al., 2014</xref>). For instance, <xref ref-type="bibr" rid="B54">Pessoa et al. (2002</xref>; see also <xref ref-type="bibr" rid="B55">Pessoa et al., 2005</xref>) found emotion-related brain activity only when subjects had to respond to the gender of the faces (easy task), but not when they had to discriminate the orientation of two peripheral bars (difficult task). <xref ref-type="bibr" rid="B32">Holmes et al. (2003)</xref> compared ERPs to fearful and neutral facial expressions when subjects had to compare two faces (faces attended) versus two houses (faces unattended), with faces and houses presented simultaneously at different spatial locations. Differences between the two emotional expressions were observed only when the faces were attended. In a recent behavioral study, <xref ref-type="bibr" rid="B64">Sassi et al. (2014)</xref> used an affective priming task in which a prime face showing either an emotional (positive or negative) or a neutral expression was followed by an emotionally laden target word (positive or negative). In the critical trials, the target word could be preceded by a face prime belonging either to the same affective category as the target (congruent condition) or to a different affective category (incongruent condition). Affective priming was measured through congruency effects, that is, the difference in performance between the congruent and incongruent conditions. <xref ref-type="bibr" rid="B64">Sassi et al. (2014)</xref> observed affective priming when subjects&#x02019; attention was allocated to the emotional information (the emotion task) and also, albeit smaller, when the emotional expression was made task-irrelevant by asking subjects to determine whether the face wore glasses (the glasses task). However, when subjects were asked to determine whether the glasses were rounded or squared (the shape task), the affective priming effect vanished. This finding probably reflects the fact that the shape task (difficult) required more attentional monitoring than the glasses task (easy), leaving insufficient attentional resources to process the emotional expression of the face prime (see also <xref ref-type="bibr" rid="B50">Okon-Singer et al., 2007</xref>, for similar evidence from a cognitive load paradigm). A common feature of the studies reporting attentional modulation of emotional processing is that the non-emotional task usually involves a high attentional load; consequently, sufficient attentional resources are not available to process the emotional content of the stimuli (<xref ref-type="bibr" rid="B42">Lavie, 1995</xref>; <xref ref-type="bibr" rid="B54">Pessoa et al., 2002</xref>, <xref ref-type="bibr" rid="B55">2005</xref>; <xref ref-type="bibr" rid="B50">Okon-Singer et al., 2007</xref>; <xref ref-type="bibr" rid="B51">Palermo and Rhodes, 2007</xref>).</p>
<p>The present study is a follow-up of <xref ref-type="bibr" rid="B64">Sassi et al.&#x02019;s (2014)</xref> study, although only two tasks were used: the emotion task, in which subjects attended to the emotional expression of the face, and the shape task, in which subjects attended to the shape of the glasses so that the emotional facial expression was task-irrelevant. In addition, whereas many studies have investigated the processing of threatening stimuli using fearful faces, we were interested in extending our affective priming studies to another negative emotional expression. Thus, angry faces were selected for the present study. Anger is exhibited in daily life as frequently as other negative emotions such as fear and sadness, but few studies have used this expression in paradigms involving attentional manipulations.</p>
<p>On the basis of our previous results, we expected an affective priming effect with the emotion task, but not with the shape task. However, as <xref ref-type="bibr" rid="B50">Okon-Singer et al. (2007)</xref> pointed out, it is necessary to dissociate attention-dependent processing from automatic processing (at least under the &#x0201C;weak&#x0201D; notion of automaticity, <xref ref-type="bibr" rid="B76">Tzelgov, 1997</xref>; <xref ref-type="bibr" rid="B53">Pessoa, 2005</xref>). Despite the lack of behavioral priming effects, which might depend on the availability of attentional resources, it is still possible that processing of the negative facial expression in the shape task occurs in a &#x0201C;strong&#x0201D; automatic way, independently of both attentional resources and task relevance. Negative facial expressions may signal threat, and they may therefore be behaviorally relevant stimuli that require a fast, automatic reaction to foster survival. If that were the case, we should be able to detect emotion-related brain activation even when subjects&#x02019; top&#x02013;down attention is allocated to an emotion-irrelevant feature of the face prime that requires fine-grained discrimination (the shape task). The rationale for this hypothesis is the existence of a neural circuit, comprising both subcortical and cortical areas, that is involved in the rapid and automatic detection of threatening salient stimuli and may play a crucial role in survival (<xref ref-type="bibr" rid="B77">Vuilleumier, 2005</xref>).</p>
<p>In the present study, we carried out distributed source analyses (<xref ref-type="bibr" rid="B22">Fuchs et al., 1999</xref>) on the ERPs generated by the face primes. Unlike dipole analysis (<xref ref-type="bibr" rid="B66">Scherg and Von Cramon, 1985</xref>), which uses very few sources and requires strong <italic>a priori</italic> hypotheses about their characteristics, the source analysis technique represents cortical brain activity through the intensity of a large number of cortical generators, providing a more realistic simulation of brain functioning. Among the several approaches available to solve the inverse problem of reconstructing the cortical sources that generated the recorded scalp potentials, we opted for a well-established post-processing method (<xref ref-type="bibr" rid="B37">Inuggi et al., 2010</xref>, <xref ref-type="bibr" rid="B35">2011a</xref>,<xref ref-type="bibr" rid="B36">b</xref>; <xref ref-type="bibr" rid="B25">Gonzalez-Rosa et al., 2013</xref>). It employs the sLORETA-weighted accurate minimum norm (SWARM) algorithm (<xref ref-type="bibr" rid="B81">Wagner et al., 2007</xref>), which retains the low reconstruction error of sLORETA (<xref ref-type="bibr" rid="B52">Pascual-Marqui, 2002</xref>) while also outputting a current density vector field that can be post-processed.</p>
<p>We then focused on the cortical areas involved in processing fine-grained facial features. Briefly, recognizing the static (identity, gender, familiarity) and dynamic (emotional expression and gaze direction) characteristics of an observed face is thought to rely mainly on a cortical stream (<xref ref-type="bibr" rid="B30">Haxby et al., 2000</xref>; <xref ref-type="bibr" rid="B51">Palermo and Rhodes, 2007</xref>) embracing both the classical ventral stream (<xref ref-type="bibr" rid="B38">Ishai et al., 1999</xref>) and the superior temporal sulcus (STS). The ventral stream originates in the occipital areas and propagates through the occipital face area (OFA) and the fusiform face area (FFA). The FFA is specialized in decoding fine-grained static facial characteristics (<xref ref-type="bibr" rid="B40">Kanwisher et al., 1997</xref>; <xref ref-type="bibr" rid="B27">Halgren et al., 2000</xref>; <xref ref-type="bibr" rid="B32">Holmes et al., 2003</xref>; <xref ref-type="bibr" rid="B5">Bayle and Taylor, 2010</xref>), while the STS, especially its posterior part (pSTS), is involved in processing dynamic facial features, such as eye gaze, and in decoding emotional information from facial features (<xref ref-type="bibr" rid="B59">Puce et al., 1998</xref>; <xref ref-type="bibr" rid="B2">Allison et al., 2000</xref>; <xref ref-type="bibr" rid="B31">Hoffman and Haxby, 2000</xref>; <xref ref-type="bibr" rid="B63">Said et al., 2010</xref>). Previous studies have observed that the FFA responds more strongly to faces than to non-face objects (see <xref ref-type="bibr" rid="B30">Haxby et al., 2000</xref>, for a review); we therefore expected reduced activation of this area in the shape task (focused on a non-face feature) compared with the emotion task (focused on a facial feature).</p>
<p>To model the activation of these areas, we performed both source and sensor analyses in correspondence with the main ERP components. Besides modeling the two most investigated early components, the posterior P1 and the lateral occipito-temporal N170, we also modeled the anterior N1 (<xref ref-type="bibr" rid="B44">Luo et al., 2010</xref>) and a later positive component, peaking around 230&#x02013;250 ms, whose name and temporal location vary greatly across studies (e.g., VPP in <xref ref-type="bibr" rid="B44">Luo et al., 2010</xref>; P270 in <xref ref-type="bibr" rid="B43">Liu et al., 2012</xref>). Both the P1 and N1 components have been associated with a first stage of automatic processing that differentiates negative facial expressions from positive or neutral ones (<xref ref-type="bibr" rid="B56">Pourtois et al., 2004</xref>; <xref ref-type="bibr" rid="B44">Luo et al., 2010</xref>), reflecting an early negativity bias (<xref ref-type="bibr" rid="B70">Smith et al., 2003</xref>). The N170 component has been implicated in the distinction between face and non-face stimuli (<xref ref-type="bibr" rid="B7">Bentin et al., 1996</xref>; <xref ref-type="bibr" rid="B60">Rossion et al., 2003</xref>; <xref ref-type="bibr" rid="B39">Itier and Taylor, 2004</xref>; <xref ref-type="bibr" rid="B44">Luo et al., 2010</xref>). As these components have been shown to be affected by affective processing in an early phase of perception and attention (<xref ref-type="bibr" rid="B12">Carreti&#x000E9; et al., 2004</xref>; <xref ref-type="bibr" rid="B16">Eimer and Holmes, 2007</xref>; <xref ref-type="bibr" rid="B44">Luo et al., 2010</xref>), they constitute the main focus of our analysis of the first 300 ms after face prime onset.</p>
<p>Source analyses were also employed to assess whether angry and non-angry (happy and neutral) expressions were processed differently when attention was directed to emotion-irrelevant facial features. Specifically, because negative emotional expressions are behaviorally relevant stimuli, we expected activation in the ventral attention network (VAN), which is thought to detect behaviorally relevant but task-irrelevant stimuli and to exert bottom&#x02013;up modulation over the dorsal attention network (DAN; <xref ref-type="bibr" rid="B15">Corbetta and Shulman, 2002</xref>; <xref ref-type="bibr" rid="B14">Corbetta et al., 2008</xref>). However, because emotion-relevant and emotion-irrelevant features were foveally presented, we did not expect any reorienting process by the DAN, which is responsible for top&#x02013;down control and contains, specifically in the frontal eye field region, the circuitry needed to move the eyes to a selected target. Thus, we could test the hypothesis that the VAN activates independently of the DAN by assessing brain activity in both the temporo&#x02013;parietal junction (TPJ; VAN) and the intraparietal sulcus (DAN). These networks are considered supramodal (<xref ref-type="bibr" rid="B45">Macaluso et al., 2002</xref>; <xref ref-type="bibr" rid="B26">Green et al., 2011</xref>) and not directly related to face processing. Because their involvement in bottom&#x02013;up and top&#x02013;down control derives mainly from functional magnetic resonance imaging (fMRI) studies, whose temporal resolution is too coarse to be coupled with electroencephalography (EEG) findings, their activation time course is investigated here in the temporal proximity of the classical ERP peaks, where the face feature decoding process is expected to occur.</p>
</sec>
<sec id="s1" sec-type="materials|methods">
<title>MATERIALS AND METHODS</title>
<sec>
<title>SUBJECTS</title>
<p>Twenty-eight healthy, young (mean age 22.1 &#x000B1; 2.3 years, range 19&#x02013;30) subjects with no history of neurologic or neuropsychiatric disorders were recruited to participate in this study. Fourteen subjects (11 females and 3 males) participated in each task condition (emotion and shape). All subjects were right-handed according to their self-report and gave their written informed consent for participation in the study.</p>
</sec>
<sec>
<title>TASK</title>
<p>Subjects were tested individually in a sound-attenuated room. A computer program developed with E-Prime 2 (<xref ref-type="bibr" rid="B67">Schneider et al., 2002</xref>) controlled the experiment. The stimuli were presented on a 17<sup>&#x02032;&#x02032;</sup> TFT monitor (screen resolution: 1024 by 768 pixels; background color: silver &#x02013; RGB: 200, 200, 200) and participants responded via the keyboard. We used three grayscale pictures (4.5 cm wide by 7.7 cm high) of human faces as prime stimuli, one for each facial expression (happy, angry, and neutral). These stimuli were taken from the NimStim Set of Facial Expressions (<xref ref-type="bibr" rid="B74">Tottenham et al., 2009</xref>; the reference codes of the selected faces are 20_M_HA_O, 20_M_NE_C, and 20_M_AN_O). Using photo-editing software, we created two versions of each picture, one wearing rounded glasses and the other wearing squared glasses. As target stimuli, we used 36 Spanish words divided into two sets, one comprising 18 positive words and the other 18 negative words. Mean valence ratings for the words of the two sets ranged from 1.7 to 2.8 (<italic>M</italic> = 2.3) for positive words and from &#x02013;0.9 to &#x02013;1.8 (<italic>M</italic> = &#x02013;2.3) for negative words, according to a preliminary study (<italic>N</italic> = 124; scale ranging from &#x02013;3 to +3; see <xref ref-type="bibr" rid="B64">Sassi et al., 2014</xref>). Positive and negative words were matched for word frequency, familiarity, and word length using the LEXESP database (<xref ref-type="bibr" rid="B68">Sebasti&#x000E1;n-Gall&#x000E9;s et al., 2000</xref>). Each trial consisted of the following sequence (the trial scheme is summarized in <bold>Figure <xref ref-type="fig" rid="F1">1</xref></bold>). First, a 1000-ms fixation point (a plus sign) appeared in the center of the screen, followed by one of the three prime faces, which was presented for 200 ms.
Then, after an interval of 100 ms (stimulus onset asynchrony, SOA = 300 ms), a target word was shown (in capital letters and black font) and subjects indicated whether the word was positive or negative by pressing the &#x0201C;n&#x0201D; or &#x0201C;m&#x0201D; key on the computer keyboard as quickly and accurately as possible (this first response is referred to as R1). Both prime faces and target words were presented centered. The specific response-key mapping was counterbalanced across participants. Immediately following R1, a two-choice question appeared on the screen, and subjects were prompted to press, with no time limit, the key (&#x0201C;z&#x0201D; or &#x0201C;x&#x0201D;) that corresponded to the correct answer (hereafter, R2). In the emotion condition, subjects were asked whether the prime face was neutral or emotional (the emotion task), whereas in the glasses&#x02019; shape condition they were asked whether the face wore rounded or squared glasses (the shape task). The whole experiment included 72 congruent trials, 72 incongruent trials, and 144 neutral trials. In congruent trials, the prime face and the target word had the same affective valence, either positive, as in happy-positive trials (<italic>N</italic> = 36), or negative, as in anger-negative trials (<italic>N</italic> = 36). In incongruent trials, a prime face of a different valence preceded the target word, as in happy-negative trials (<italic>N</italic> = 36) and anger-positive trials (<italic>N</italic> = 36). In neutral trials, a neutral prime face preceded the target word, as in neutral-positive (<italic>N</italic> = 72) and neutral-negative (<italic>N</italic> = 72) trials. Target words were drawn from each set at random, with the constraint that each word appeared in two congruent trials, in two incongruent trials, and in four neutral trials. A short practice block of 18 trials preceded the experimental trials.</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption><p><bold>Sequence of events and time duration in the experiment</bold>.</p></caption>
<graphic xlink:href="fpsyg-05-01498-g001.tif"/>
</fig>
</sec>
<sec>
<title>EEG RECORDINGS AND PREPROCESSING</title>
<p>Electroencephalography was recorded using 59 scalp channels mounted onto an elastic cap (ActiveCap, Brain Products GmbH), according to the 10&#x02013;20 international system, with the reference located close to the vertex. The EEG signal was amplified (BrainAmp, Brain Products GmbH), digitized (1000 Hz sampling frequency), and filtered (0.1 to 40 Hz). Electrode impedance was kept below 5 k&#x003A9;. Four additional electrodes were placed to monitor horizontal and vertical ocular activity. Eye movement artifacts were corrected with an independent component analysis (ICA) Ocular Artifact Reduction algorithm (Vision Analyzer, Brain Products GmbH). The ERPs were obtained by averaging the EEG epochs from &#x02013;250 to +300 ms with respect to face onset, using the first 200 ms for baseline correction. Data were finally re-referenced using a common average reference approach.</p>
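The epoching, baseline correction, trial averaging, and common-average re-referencing steps described above can be sketched as follows. This is an illustrative reconstruction, not the authors' pipeline: the channel count, onset indices, and array layout are assumptions, and artifact correction is omitted.

```python
import numpy as np

FS = 1000                    # sampling frequency (Hz), as in the recordings
PRE_MS, POST_MS = 250, 300   # epoch window relative to face onset (-250..+300 ms)
BASE_MS = 200                # first 200 ms of each epoch used as baseline

def erp_from_continuous(eeg, onsets):
    """Average ERP from continuous, already-filtered EEG.

    eeg    : (n_channels, n_samples) array
    onsets : sample indices marking face onsets
    """
    pre = PRE_MS * FS // 1000
    post = POST_MS * FS // 1000
    # cut one epoch per face onset: trials x channels x samples
    epochs = np.stack([eeg[:, t - pre:t + post] for t in onsets])
    # baseline correction: subtract each epoch's mean over its first 200 ms
    baseline = epochs[:, :, :BASE_MS * FS // 1000].mean(axis=2, keepdims=True)
    erp = (epochs - baseline).mean(axis=0)       # average across trials
    # common average reference: remove the instantaneous mean across channels
    return erp - erp.mean(axis=0, keepdims=True)
```

After re-referencing, the mean across channels is zero at every sample, which is the defining property of a common average reference.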
</sec>
<sec>
<title>ERP COMPONENTS DEFINITION</title>
<p>In line with previous studies, we focused on the P1 and N170 components and also on the N1, which peaks in frontal regions at &#x0223C;100 ms. Additionally, our data revealed a late positive deflection, peaking at &#x0223C;240 ms, which was also investigated. Four pairs of sensor clusters, whose amplitude was calculated as the mean amplitude of their constituent sensors, were defined to model the ERP components. In each subject and for each experimental condition, the amplitude of each component&#x02019;s peak was calculated as the maximum positive/negative deflection within the time windows specified in <bold>Table <xref ref-type="table" rid="T1">1</xref></bold>. To better compare ERP results with source analysis results, a further cluster, not conventionally investigated in previous studies, was defined for the N170 period covering the temporo&#x02013;parietal region. These eight cluster measures were subjected to statistical analyses. In a further analysis, the two occipital clusters were merged into a single cluster, and its activation was expressed in terms of the lateralization of its medial&#x02013;lateral center of gravity, calculated with the following formula:</p>
<disp-formula id="E1"><mml:math id="M12"><mml:mrow><mml:msub><mml:mtext>COG</mml:mtext><mml:mtext>X</mml:mtext></mml:msub><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mi>a</mml:mi><mml:mo>*</mml:mo><mml:mtext>PO8</mml:mtext><mml:mo>+</mml:mo><mml:mi>b</mml:mi><mml:mo>*</mml:mo><mml:mtext>PO4</mml:mtext><mml:mo>+</mml:mo><mml:mi>c</mml:mi><mml:mo>*</mml:mo><mml:mtext>O2</mml:mtext><mml:mo>&#x2212;</mml:mo><mml:mi>a</mml:mi><mml:mo>*</mml:mo><mml:mtext>PO7</mml:mtext><mml:mo>&#x2212;</mml:mo><mml:mi>b</mml:mi><mml:mo>*</mml:mo><mml:mtext>PO3</mml:mtext><mml:mo>&#x2212;</mml:mo><mml:mi>c</mml:mi><mml:mo>*</mml:mo><mml:mtext>O1</mml:mtext></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:mo>(</mml:mo><mml:mi>a</mml:mi><mml:mo>+</mml:mo><mml:mi>b</mml:mi><mml:mo>+</mml:mo><mml:mi>c</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mfrac></mml:mrow></mml:math>
</disp-formula>
<p>where a, b, and c denote the absolute medial&#x02013;lateral coordinates of the PO7/PO8, PO3/PO4, and O1/O2 electrode pairs, respectively, in the extended 10&#x02013;20 system.</p>
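<p>As a concrete illustration, the lateralization index above can be computed as follows; the amplitude values and the medial&#x02013;lateral electrode coordinates in this sketch are hypothetical placeholders, not values from the study.</p>

```python
# Sketch of the occipital-cluster lateralization index (COG_X).
# All numeric values below are hypothetical, for illustration only.

def cog_x(amp, coord):
    """Medial-lateral center of gravity of the merged occipital cluster.

    amp   : mean peak amplitude (in microvolts) per lateral electrode
    coord : absolute medial-lateral 10-20 coordinate per electrode pair
    """
    a, b, c = coord["PO7/8"], coord["PO3/4"], coord["O1/2"]
    num = (a * amp["PO8"] + b * amp["PO4"] + c * amp["O2"]
           - a * amp["PO7"] - b * amp["PO3"] - c * amp["O1"])
    return num / (2 * (a + b + c))

amps = {"PO7": 4.0, "PO8": 6.0, "PO3": 3.0, "PO4": 5.0, "O1": 2.0, "O2": 3.0}
coords = {"PO7/8": 45.0, "PO3/4": 25.0, "O1/2": 15.0}  # mm, hypothetical
print(round(cog_x(amps, coords), 3))  # -> 0.912
```

<p>A positive index indicates a rightward shift of the cluster&#x02019;s center of gravity; a negative index indicates a leftward shift, and symmetric amplitudes yield zero.</p>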
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p>Event-related potential components investigated, electrodes contained in the eight clusters used, and the window of interest used to define the component&#x02019;s peak.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<th valign="top" align="left">Components</th>
<th valign="top" align="left">Cluster name</th>
<th valign="top" align="left">Electrodes in cluster</th>
<th valign="top" align="left">Window of interest (ms)</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">N1</td>
<td valign="top" align="left">L/R Frontal</td>
<td valign="top" align="left">F3/4, FC3/FC4</td>
<td valign="top" align="left">80&#x02013;130</td>
</tr>
<tr>
<td valign="top" align="left">P1</td>
<td valign="top" align="left">L/R Occipital</td>
<td valign="top" align="left">PO7/8, PO3/4, O1/2, Oz, POz</td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">N170</td>
<td valign="top" align="left">L/R Occipito-temporal</td>
<td valign="top" align="left">PO7/8, PO3/4, P7/8</td>
<td valign="top" align="left">130&#x02013;190</td>
</tr>
<tr>
<td valign="top" align="left">N170</td>
<td valign="top" align="left">L/R Temporo&#x02013;parietal</td>
<td valign="top" align="left">P5/6, CP5/6</td>
<td valign="top" align="left"></td>
</tr>
<tr>
<td valign="top" align="left">P240</td>
<td valign="top" align="left" colspan="2">All the previously defined clusters</td>
<td valign="top" align="left">220&#x02013;260</td></tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec>
<title>SOURCE ANALYSIS</title>
<p>A preliminary ICA (<xref ref-type="bibr" rid="B34">Hyvarinen, 1999</xref>) was performed on the ERP data, decomposing the signal into noise-normalized independent components (ICs). Only those ICs whose SNR remained below 1 across all intervals of interest (from &#x02013;250 to 300 ms with respect to face onset) were removed from the ERP data (<xref ref-type="bibr" rid="B35">Inuggi et al., 2011a</xref>,<xref ref-type="bibr" rid="B36">b</xref>). Source activity was reconstructed using the cortical current density (CCD) model with a conductor volume defined by a three-compartment boundary element method (BEM), with conductivity values of 0.33, 0.0042, and 0.33 S/m (<xref ref-type="bibr" rid="B21">Fuchs et al., 2002</xref>), derived from the FSL MNI template (www.fmrib.ox.ac.uk/fsl), with dimensions of 91 &#x000D7; 109 &#x000D7; 91 voxels and a voxel size of 2 &#x000D7; 2 &#x000D7; 2 mm. The number (6899) and positions of the sources were obtained by sampling the cortex (5 mm spacing), their orientations were fixed perpendicular to the cortical patch from which they originated, and their intensities were calculated using the SWARM algorithm (<xref ref-type="bibr" rid="B81">Wagner et al., 2007</xref>). The CCD was reconstructed with the Curry V6 software (Neuroscan Inc., Herndon, VA, USA).</p>
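<p>The IC rejection criterion can be sketched as below; the exact SNR definition (post-stimulus RMS over pre-stimulus RMS), the 1-ms sampling grid, and the toy component time courses are illustrative assumptions, not the study&#x02019;s actual implementation.</p>

```python
import numpy as np

# Hypothetical SNR-based IC rejection: an IC is removed only if its SNR
# stays below 1 in every window of interest (epoch: -250 to 300 ms).

def ic_snr(ic, times, window):
    """SNR of one IC: RMS in a (start, stop) ms window over baseline RMS."""
    base = ic[(times >= -250) & (times < 0)]
    win = ic[(times >= window[0]) & (times < window[1])]
    return np.sqrt(np.mean(win ** 2)) / np.sqrt(np.mean(base ** 2))

def reject_ics(ics, times, windows):
    """Indices of ICs whose SNR is below 1 across ALL windows."""
    return [i for i, ic in enumerate(ics)
            if all(ic_snr(ic, times, w) < 1.0 for w in windows)]

times = np.arange(-250, 300)                 # 1-ms resolution epoch
evoked = np.where(times > 80, 2.0, 0.5)      # IC carrying an evoked response
noise = np.where(times >= 0, 0.3, 0.5)       # IC that fades after onset
bad = reject_ics([evoked, noise], times, [(80, 130), (130, 190), (220, 260)])
print(bad)  # -> [1]: only the second IC is flagged for removal
```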
<sec>
<title>ROI definition</title>
<p>Cortical activity was calculated in seven pairs of right and left regions of interest (ROIs) involving the lateral fusiform gyrus (BA37), posterior superior temporal sulcus (pSTS), TPJ plus inferior parietal lobule (TPJ+IPL), intraparietal sulcus (IPS), middle frontal gyrus (MFC), inferior frontal gyrus (IFG), and primary visual area (V1). In an additional analysis, the two V1 ROIs were merged into a single ROI, and its activation was expressed in terms of the lateralization of its medial&#x02013;lateral center of gravity, calculated as explained below.</p>
<p>Regions of interest were manually drawn on the MRI images using the Curry software&#x02019;s internal anatomical atlas and previous research as references. The TPJ+IPL ROI started from the strict TPJ definition of <xref ref-type="bibr" rid="B47">Mort et al. (2003)</xref> but also included the inferior parietal lobule, in line with most studies investigating the VAN, which located their activations around these areas. The resulting center of gravity clarifies more specifically the anatomical localization of this activation. To account for possible slight between-subjects misplacements of the electrode montage, ROIs were dilated (by 5 mm) and then smoothed (2 mm). ROIs are illustrated in <bold>Figure <xref ref-type="fig" rid="F2">2</xref></bold>.</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption><p><bold>Cortical areas investigated.</bold> IPS, intraparietal sulcus; pSTS, posterior superior temporal sulcus; TPJ, temporo&#x02013;parietal junction; BA37, lateral temporo&#x02013;occipital cortex; MFC, middle frontal cortex; IFG, inferior frontal gyrus. The dotted lines enclose the part of the TPJ that overlaps with the IPS and pSTS.</p></caption>
<graphic xlink:href="fpsyg-05-01498-g002.tif"/>
</fig>
</sec>
<sec>
<title>ROI activity</title>
<p>Three periods were investigated: L100, where the N1 and P1 are active; L170, which corresponds to the N170 peak; and L240, which corresponds to our late peak. Within these periods, the mean cortical activation of each ROI was calculated using the following procedure: (i) at each latency, the intensities of all active sources contained in the ROI were summed; (ii) the latency with the highest summed value was defined as the peak latency (PL); and (iii) a 40-ms temporal window, centered on that peak, was used to calculate the area&#x02019;s total activity (TA) within each period, as previously described (<xref ref-type="bibr" rid="B35">Inuggi et al., 2011a</xref>; <xref ref-type="bibr" rid="B25">Gonzalez-Rosa et al., 2013</xref>). This procedure was performed separately for each ROI, allowing us to take into account the onset differences of nearly simultaneous components (e.g., P1 and N1) and to create periods of the same temporal length to ensure proper comparisons. The length of the time window was selected according to a previous study (<xref ref-type="bibr" rid="B25">Gonzalez-Rosa et al., 2013</xref>).</p>
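<p>Steps (i)&#x02013;(iii) can be sketched as follows; the array shapes, the use of the mean over the 40-ms window as the TA measure, the period bounds, and the random source data are assumptions for illustration only.</p>

```python
import numpy as np

# Sketch of the per-ROI procedure: (i) sum source intensities at each
# latency, (ii) find the peak latency (PL) within the period, and
# (iii) average a 40-ms window centered on PL to obtain the TA.

def roi_total_activity(src, times, period):
    """src: (n_sources, n_times) source intensities for one ROI."""
    summed = src.sum(axis=0)                              # step (i)
    in_period = (times >= period[0]) & (times <= period[1])
    pl = times[in_period][np.argmax(summed[in_period])]   # step (ii)
    win = (times >= pl - 20) & (times <= pl + 20)         # step (iii)
    return pl, summed[win].mean()

times = np.arange(0, 300)                        # 1-ms resolution
src = np.random.default_rng(0).random((10, times.size))
pl, ta = roi_total_activity(src, times, (130, 210))  # hypothetical L170 bounds
print(pl, round(ta, 2))
```

<p>Because the 40-ms window is centered on each ROI&#x02019;s own peak, ROIs with slightly different component onsets are still compared over windows of identical length.</p>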
<p>The center of gravity of the activations, decomposed into its medial&#x02013;lateral (CX), anterior&#x02013;posterior (CY), and ventral&#x02013;dorsal (CZ) coordinates, was calculated using the following formula (shown for CX):</p>
<disp-formula id="E2"><mml:math id="M13"><mml:mrow><mml:mi mathcolor='black' mathsize='12pt' mathvariant='normal'>C</mml:mi><mml:mi mathcolor='black' mathsize='12pt' mathvariant='normal'>X</mml:mi><mml:mo mathcolor='black' mathsize='12pt' mathvariant='normal'>=</mml:mo><mml:mrow><mml:mo mathcolor='black' mathsize='12pt' mathvariant='normal'>(</mml:mo><mml:msub><mml:mrow><mml:mi mathcolor='black' mathsize='12pt' mathvariant='normal'>&#x03a3;</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>i</mml:mi><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>j</mml:mi></mml:mrow></mml:msub><mml:msub><mml:mrow><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>S</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>i</mml:mi><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>j</mml:mi></mml:mrow></mml:msub><mml:mo mathcolor='black' mathsize='12pt' mathvariant='normal'>*</mml:mo><mml:msub><mml:mrow><mml:mi mathcolor='black' mathsize='12pt' mathvariant='normal'>X</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>i</mml:mi><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>j</mml:mi></mml:mrow></mml:msub><mml:mo mathcolor='black' mathsize='12pt' mathvariant='normal'>)</mml:mo></mml:mrow><mml:mo mathcolor='black' mathsize='12pt' mathvariant='normal'>/</mml:mo><mml:msub><mml:mrow><mml:mi mathcolor='black' mathsize='12pt' mathvariant='normal'>&#x03a3;</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>i</mml:mi><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>j</mml:mi></mml:mrow></mml:msub><mml:msub><mml:mrow><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>S</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathsize='8pt' mathvariant='normal'>i</mml:mi><mml:mi mathcolor='black' mathsize='8pt' 
mathvariant='normal'>j</mml:mi></mml:mrow></mml:msub><mml:mo mathcolor='black' mathsize='12pt' mathvariant='normal'>,</mml:mo></mml:mrow></mml:math>
</disp-formula>
<p>where s<sub>ij</sub> is the intensity of the i-th source at timepoint j and X<sub>ij</sub> is the medial&#x02013;lateral position of the i-th source at timepoint j.</p>
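<p>The same intensity-weighted average can be written in code form; the two-source, two-time-point example data are hypothetical.</p>

```python
import numpy as np

# Weighted center of gravity (shown for the medial-lateral axis):
# CX = sum_ij(S_ij * X_i) / sum_ij(S_ij)

def center_of_gravity(s, x):
    """s: (n_sources, n_times) intensities; x: (n_sources,) positions in mm."""
    return float((s * x[:, None]).sum() / s.sum())

s = np.array([[1.0, 2.0],     # source 0 intensity at two latencies
              [3.0, 2.0]])    # source 1
x = np.array([-10.0, 20.0])   # medial-lateral source positions (mm)
print(center_of_gravity(s, x))  # -> 8.75, pulled toward the stronger source
```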
</sec>
</sec>
<sec>
<title>STATISTICAL ANALYSIS</title>
<p>The effects of the between-subjects factor <italic>task type</italic> (emotion task, shape task) and the within-subjects factors <italic>facial expression</italic> (angry, happy, or neutral) and <italic>hemisphere</italic> (left and right) on TA within each area and period were analyzed with a mixed analysis of variance (ANOVA). The Kolmogorov&#x02013;Smirnov test was used to examine the normality of the data, and, when appropriate, the Greenhouse&#x02013;Geisser correction was applied. The significance levels of the main effects (task type, facial expression, and hemisphere) and their interactions were corrected for multiple comparisons (14 ROIs &#x000D7; 3 periods) using a false discovery rate (FDR) approach, in a more conservative version (<xref ref-type="bibr" rid="B6">Benjamini and Yekutieli, 2001</xref>) than standard FDR. According to its formula (&#x003B1;/&#x003A3;<sub>i=1..k</sub>(1/i), where k = 42 is the number of multiple comparisons and &#x003B1; = 0.05 is the predetermined <italic>p</italic>-value), we report as significant only <italic>p</italic>-values below 0.0112. Because the number of multiple comparisons was lower in the ERP analysis (8 clusters &#x000D7; 3 periods), the corrected threshold was 0.0132. Effect sizes were reported as the <inline-formula><mml:math id="M1"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> value. <italic>Post hoc</italic> comparisons of the within-subjects (facial expression) and between-subjects (task type) factors were performed with paired and unpaired <italic>t</italic>-tests, respectively. The multiple pairwise comparisons of facial expressions were adjusted with the Bonferroni correction.</p>
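<p>The corrected thresholds follow directly from the formula above; the snippet below is a direct transcription of it, with k set to the number of comparisons in each analysis.</p>

```python
# Benjamini-Yekutieli-style corrected threshold: alpha / sum_{i=1..k}(1/i)

def by_threshold(alpha, k):
    return alpha / sum(1.0 / i for i in range(1, k + 1))

print(round(by_threshold(0.05, 42), 4))  # source analysis: 14 ROIs x 3 periods
print(round(by_threshold(0.05, 24), 4))  # ERP analysis: 8 clusters x 3 periods
```

<p>With k = 24 the formula reproduces the 0.0132 ERP threshold reported above.</p>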
<p>To provide the ERP equivalent of our source analysis results, a mixed ANOVA, analyzing the effects of <italic>task type</italic> and <italic>face expression</italic>, was also performed over the ERP electrode clusters that overlay the ROI of the sources significantly affected by our experimental factors.</p>
</sec>
</sec>
<sec>
<title>RESULTS</title>
<sec>
<title>BEHAVIORAL DATA</title>
<p>Trials with incorrect responses to the target word (R1; 1.8 and 1.5% for the emotion task and the shape task, respectively) and trials with incorrect responses to the to-be-attended facial feature (R2; 3.1 and 5.0% for the emotion task and the shape task, respectively) were excluded from the analysis. In addition, we excluded trials with RTs below 200 ms (anticipations) or deviating more than three standard deviations (omissions) from the subject&#x02019;s mean for each condition (1.90%). The mean RT for R1 in the emotion task was 790 ms (SD = 144) for congruent trials (happy face/positive word and angry face/negative word trials) and 825 ms (SD = 164) for incongruent trials (angry face/positive word and happy face/negative word trials). In the shape task, the mean RT was 767 ms (SD = 173) for congruent trials and 768 ms (SD = 155) for incongruent trials. These means were submitted to a mixed ANOVA with <italic>congruency</italic> (congruent, incongruent) and <italic>task type</italic> (emotion, shape) as factors. There was a main effect of <italic>congruency</italic>, <italic>F</italic>(1,26) = 9.75; <italic>MSE</italic> = 464; <italic>p</italic> = 0.004; <inline-formula><mml:math id="M2"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.27, revealing that responses were faster for congruent than for incongruent trials (this difference represents the affective priming effect, <italic>M</italic> = 18 ms). 
However, this effect was qualified by a <italic>congruency</italic> by <italic>task type</italic> interaction, <italic>F</italic>(1,26) = 9.10, <italic>MSE</italic> = 464, <italic>p</italic> = 0.006, <inline-formula><mml:math id="M3"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.26. <italic>Post hoc</italic> Fisher&#x02019;s least significant difference (LSD) tests (<italic>MSE</italic> = 25411, <italic>df</italic> = 26,479) revealed a significant <italic>congruency</italic> effect for the emotion task (priming effect = 35 ms, <italic>p</italic> &#x0003C; 0.001) but no effect for the shape task (priming effect = 0.6 ms, <italic>p</italic> = 0.941). These results thus replicate those obtained in our previous behavioral study (<xref ref-type="bibr" rid="B64">Sassi et al., 2014</xref>). To confirm the soundness of our protocol, we verified that neither the main effects of <italic>word valence</italic>, <italic>F</italic>(1,26) = 1.56, <italic>p</italic> = 0.222, and <italic>task type</italic>, <italic>F</italic> &#x0003C; 1, nor their interaction, <italic>F</italic>(1,26) = 1.30, <italic>p</italic> = 0.26, reached statistical significance for the neutral expression. Analysis of the error rate (CR1) revealed no statistically significant effects.</p>
</sec>
<sec>
<title>SOURCE ANALYSIS DATA</title>
<p>The group averages of the evoked potentials elicited by the two tasks, collapsed across the three facial expressions, are displayed in <bold>Figure <xref ref-type="fig" rid="F3">3</xref></bold>. <bold>Table <xref ref-type="table" rid="T2">2</xref></bold> summarizes the center-of-gravity coordinates and PL values of the ROIs in which a significant effect of either task type or facial expression was observed.</p>
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption><p><bold>Group averages of ERP in emotion (solid line) and shape (dotted line) tasks in the first 300 ms after facial stimulus presentation.</bold> For all the electrodes, the vertical scale boundary is set at +10 &#x003BC;V.</p></caption>
<graphic xlink:href="fpsyg-05-01498-g003.tif"/>
</fig>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p>Talairach coordinates of the activation&#x02019;s center of gravity in the right TPJ+IPL ROI at L170.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<th valign="top" align="left">Area</th>
<th valign="top" align="left">Task</th>
<th valign="top" align="center" colspan="3">Neutral<hr/></th>
<th valign="top" align="center" colspan="3">Happy<hr/></th>
<th valign="top" align="center" colspan="3">Angry<hr/></th>
</tr>
<tr>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
<th valign="top" align="center">X</th>
<th valign="top" align="center">Y</th>
<th valign="top" align="center">Z</th>
<th valign="top" align="center">X</th>
<th valign="top" align="center">Y</th>
<th valign="top" align="center">Z</th>
<th valign="top" align="center">X</th>
<th valign="top" align="center">Y</th>
<th valign="top" align="center">Z</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">R TPJ 170</td>
<td valign="top" align="left">Emotion</td>
<td valign="top" align="center">49</td>
<td valign="top" align="center">&#x02013;51</td>
<td valign="top" align="center">24</td>
<td valign="top" align="center">48</td>
<td valign="top" align="center">&#x02013;49</td>
<td valign="top" align="center">23</td>
<td valign="top" align="center">49</td>
<td valign="top" align="center">&#x02013;49</td>
<td valign="top" align="center">22</td>
</tr>
<tr>
<td valign="top" align="left"></td>
<td valign="top" align="left">Shape</td>
<td valign="top" align="center">49</td>
<td valign="top" align="center">&#x02013;51</td>
<td valign="top" align="center">29</td>
<td valign="top" align="center">48</td>
<td valign="top" align="center">&#x02013;53</td>
<td valign="top" align="center">28</td>
<td valign="top" align="center">49</td>
<td valign="top" align="center">&#x02013;50</td>
<td valign="top" align="center">23</td></tr>
</tbody>
</table>
</table-wrap>
<sec>
<title>Effect of task type</title>
<p>During L170, an effect of task type was observed in lateral BA37 activity, [<italic>F</italic>(1,26) = 7.93; <italic>p</italic> = 0.011, <inline-formula><mml:math id="M4"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.26], which was less intense (<bold>Figure <xref ref-type="fig" rid="F4">4</xref></bold>) in the shape task (<italic>M</italic> = 2.28, SD = 0.6 &#x003BC;A/mm<sup>2</sup>) than in the emotion task (<italic>M</italic> = 5.4, SD = 0.5 &#x003BC;A/mm<sup>2</sup>).</p>
<fig id="F4" position="float">
<label>FIGURE 4</label>
<caption><p><bold>Effect of emotional expression and task type at L170. (Left)</bold> Task type effects on right lateral BA37. No significant differences were observed for facial emotion. <bold>(Right)</bold> Right TPJ sensitivities to angry facial expression in shape task only. On the y-axis, the mean activity of each ROI in the L170 time window is expressed in &#x003BC;A/mm<sup>2</sup>.</p></caption>
<graphic xlink:href="fpsyg-05-01498-g004.tif"/>
</fig>
</sec>
<sec>
<title>Interaction between task type and facial expressions</title>
<p>A significant <italic>task type</italic> &#x000D7; <italic>facial expression</italic> &#x000D7; <italic>hemisphere</italic> interaction was observed in IPL+TPJ during L170, [<italic>F</italic>(1.518,39.45) = 6.41, <italic>p</italic>= 0.010, <inline-formula><mml:math id="M5"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.23]. <italic>Post hoc</italic> analyses revealed that the <italic>task type</italic> &#x000D7; <italic>facial expression</italic> interaction was significant only for the right side, [<italic>F</italic>(1.81,37.06) = 5.35, <italic>p</italic>= 0.010, <inline-formula><mml:math id="M6"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.218]. Additionally, while facial expressions did not differ from each other in the emotion task, an effect of facial expression was observed in the shape task, in which facial expressions had to be ignored, [<italic>F</italic>(1.58,20.56) = 12.06, <italic>p</italic>= 0.001, <inline-formula><mml:math id="M7"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.48], with higher activation to angry facial expressions (<italic>M</italic> = 8.1, SD = 1.1 &#x003BC;A/mm<sup>2</sup>) compared to both happy (<italic>M</italic> = 5.8, SD = 0.9 &#x003BC;A/mm<sup>2</sup>, <italic>p</italic>= 0.002) and neutral (<italic>M</italic> = 6.1, SD = 0.8 &#x003BC;A/mm<sup>2</sup>, <italic>p</italic>= 0.002) ones (<bold>Figure <xref ref-type="fig" rid="F4">4</xref></bold>, right; <bold>Figure <xref ref-type="fig" rid="F5">5</xref></bold>). 
The center of gravity of the cortical activation in the IPL+TPJ ROI, reported in <bold>Table <xref ref-type="table" rid="T2">2</xref></bold>, was located in close proximity to the TPJ defined by <xref ref-type="bibr" rid="B47">Mort et al. (2003)</xref>, as shown in <bold>Figure <xref ref-type="fig" rid="F5">5</xref></bold>. We will thus refer to it as TPJ activation. No modulation of the IPS, pSTS, or middle and inferior frontal areas was observed at any latency.</p>
<fig id="F5" position="float">
<label>FIGURE 5</label>
<caption><p><bold>The shape task: increased activation in response to the angry facial expression (right) compared to happy (center) and neutral (left) expressions in the TPJ within the IPL+TPJ ROI (voxels enclosed within the yellow borders) at L170</bold>.</p></caption>
<graphic xlink:href="fpsyg-05-01498-g005.tif"/>
</fig>
</sec>
<sec>
<title>Lateralization of visual area activity</title>
<p>During the P100 component, the medial&#x02013;lateral center of gravity (CX) of the visual areas was more lateralized to the right hemisphere in the emotion task (<italic>M</italic> = &#x02013;5, SD = 0.9 mm) than in the shape task (<italic>M</italic> = 8.7, SD = 2.1 mm) [<italic>F</italic>(1,26) = 8.21, <italic>p</italic>= 0.010, <inline-formula><mml:math id="M8"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.281; <bold>Figure <xref ref-type="fig" rid="F6">6</xref></bold>]. A significant task type &#x000D7; facial expression interaction was also observed in the visual areas, [<italic>F</italic>(1.53,38.21) = 6.55, <italic>p</italic>= 0.010, <inline-formula><mml:math id="M9"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.18]. The effect of facial emotion on the lateralization of the activation was observed only in the emotion task, with angry (<italic>M</italic> = 12, SD = 2.5 mm, <italic>p</italic>= 0.011) and happy (<italic>M</italic> = 10, SD = 2.2 mm, <italic>p</italic>= 0.010) faces more lateralized to the right hemisphere than neutral faces (<italic>M</italic> = 4.7, SD = 2 mm). No significant differences emerged in the L240 interval.</p>
<fig id="F6" position="float">
<label>FIGURE 6</label>
<caption><p><bold>Visual area lateralization around L100. (Left)</bold> The effect of facial expression and task type on the medial&#x02013;lateral position of the activation&#x02019;s center of gravity (COG). The cortical current density (CCD) results for the emotional facial expression <bold>(center)</bold> compared to the neutral facial expression <bold>(right)</bold> in the emotion task. On the y-axis, the mean activity of the ROI in the L100 time window is expressed in &#x003BC;A/mm<sup>2</sup>.</p></caption>
<graphic xlink:href="fpsyg-05-01498-g006.tif"/>
</fig>
</sec>
</sec>
<sec>
<title>ERP DATA</title>
<p>During the P100 component, the medial&#x02013;lateral center of gravity of the cluster obtained by merging the right and left occipital clusters was modulated by task type [<italic>F</italic>(1,26) = 5.45, <italic>p</italic>= 0.011, <inline-formula><mml:math id="M10"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.25], being more right-lateralized in the emotion task (<italic>M</italic> = 11.3, SD = 4.5 mm) than in the shape task (<italic>M</italic>= &#x02013;0.9, SD = 3.8 mm). At &#x0223C;170 ms, the occipito-temporal cluster that overlays lateral BA37 was not affected by task type. In the right occipito-temporal cluster, which should provide the ERP equivalent of the right TPJ activation, a significant interaction was found between task type and facial expression [<italic>F</italic>(1.52,21.13) = 5.20, <italic>p</italic>= 0.012, <inline-formula><mml:math id="M11"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.24]. However, <italic>post hoc</italic> comparisons revealed only a trend (<italic>p</italic>= 0.065) toward a more negative peak for angry faces compared to neutral ones in the shape task (<bold>Figure <xref ref-type="fig" rid="F7">7</xref></bold>). No differences emerged within the parieto-temporal cluster.</p>
<fig id="F7" position="float">
<label>FIGURE 7</label>
<caption><p><bold>ERP results: (upper row) effect of task type over occipito-parietal cluster; (lower row) effect of facial expression over occipito-parietal cluster in emotion (left) and shape (right) tasks</bold>.</p></caption>
<graphic xlink:href="fpsyg-05-01498-g007.tif"/>
</fig>
</sec>
</sec>
<sec>
<title>DISCUSSION</title>
<p>In this study, the effect of a fine-grained, emotion-irrelevant discrimination task on the early processing of emotional faces was investigated by reconstructing the cortical generators of the scalp-recorded potentials. Our main objective was to evaluate whether angry expressions were processed differently from non-angry (neutral and positive) expressions when attention was diverted to another task. We opted to engage subjects in a fine discrimination of the shape of the glasses worn by the face stimuli, a task that, according to our previous behavioral study, was expected to deplete attentional resources (see <xref ref-type="bibr" rid="B64">Sassi et al., 2014</xref>). Several previous studies assessed the interaction of attention and emotion when the emotion-relevant and emotion-irrelevant stimuli did not share the same spatial location (<xref ref-type="bibr" rid="B78">Vuilleumier et al., 2001</xref>; <xref ref-type="bibr" rid="B32">Holmes et al., 2003</xref>). Because redirecting the subject&#x02019;s attention to another position may represent a potential confound, we opted to place both the emotion-relevant and emotion-irrelevant features in the same foveal position, removing any obstacle to the automatic processing of emotional faces when subjects were asked to ignore them. In addition to investigating the peculiar processing of ignored angry faces, we were also interested in providing a neurophysiological explanation for the loss of the affective priming effect observed in our behavioral results (<xref ref-type="bibr" rid="B64">Sassi et al., 2014</xref>, current study) when subjects were involved in an emotion-irrelevant task. We concentrated our analysis on the cortical areas involved in the processing of fine-grained facial features, which are thought to be strongly modulated in a top&#x02013;down manner by the observer&#x02019;s attention, making their processing not pre-attentive but strictly related to the availability of attentional resources. 
Moreover, considering the high priority of aversive facial expressions in capturing attentional resources, we also focused on the parietal areas that belong to the ventral (TPJ) and dorsal (IPS) attention networks and on the partially overlapping frontal areas of the two networks, the inferior (IFG) and middle (MFG) frontal gyri (<xref ref-type="bibr" rid="B19">Fox et al., 2006</xref>).</p>
<sec>
<title>THE EFFECT OF ATTENTION ON THE VENTRAL STREAM</title>
<p>In the present study, we confirm that the ventral stream is highly modulated by the observer&#x02019;s attention. The activity of the occipital areas at &#x0223C;100 ms was more right-lateralized in the emotion task than in the shape task and, more notably, when subjects attended to the facial expression, the activation produced by emotional facial expressions was more right-lateralized than that produced by neutral faces. This selectivity disappeared when subjects attended to the glasses&#x02019; shape.</p>
<p>Because the assessment of FFA activity through scalp recordings is widely questioned, as the area lies within the inferior part of the temporal cortex, we created the lateral BA37 ROI; previous neuroimaging studies showed a correlation between the N170 EEG component, measured at electrodes overlying this region, and fMRI-derived FFA activity (<xref ref-type="bibr" rid="B33">Horovitz et al., 2004</xref>; <xref ref-type="bibr" rid="B62">Sadeh et al., 2010</xref>), which suggests that surface electrodes may capture at least part of FFA activity. Additionally, electro-corticography studies have revealed that lateral BA37 is also involved in face processing (<xref ref-type="bibr" rid="B60">Rossion et al., 2003</xref>; <xref ref-type="bibr" rid="B75">Tsuchiya et al., 2008</xref>). At &#x0223C;170 ms, lateral BA37 activation was reduced in the shape task compared with the emotion task, which suggests that when subjects were asked to ignore the facial expression and concentrate solely on the glasses&#x02019; shape, the detailed face features may not have been processed as distinctively. This result agrees with previous findings of larger FFA activity for faces compared to non-face objects (<xref ref-type="bibr" rid="B30">Haxby et al., 2000</xref>; <xref ref-type="bibr" rid="B60">Rossion et al., 2003</xref>). Taken together, our behavioral and neurophysiological results strongly suggest that our shape task succeeded in guiding subjects&#x02019; attention away from any face feature, preventing any conscious monitoring of the emotional content of the face. In the long debate over the pre-attentive automaticity of emotional processing, our results suggest that an appropriate level of attention is needed to process emotional expressions. Even though the face features were presented at the same visual focus, the reduced BA37 activity and the loss of emotional selectivity of the primary visual areas in the shape task suggest that subjects focused their attention solely on the glasses&#x02019; shape and ignored the underlying emotional expression.</p>
<p>The lateralization of the activations found in the present study deserves further comment. The lateralization of emotional processing is still an open issue because the two main theories, supporting either the right-hemisphere hypothesis (RHH; <xref ref-type="bibr" rid="B8">Borod et al., 1998</xref>; <xref ref-type="bibr" rid="B9">Bourne, 2010</xref>) or the valence-specific hypothesis (VSH; <xref ref-type="bibr" rid="B46">Mandal et al., 1991</xref>; <xref ref-type="bibr" rid="B1">Adolphs et al., 2001</xref>), have been questioned by more recent fMRI meta-analytic investigations (<xref ref-type="bibr" rid="B23">Fusar-Poli et al., 2009</xref>; <xref ref-type="bibr" rid="B61">Sabatinelli et al., 2011</xref>). The bulk of evidence shows bilateral activation for emotional face processing in most emotion-related areas, although lateralization might be modulated by gender (see, for example, <xref ref-type="bibr" rid="B80">Wager et al., 2003</xref>). In the present study, most (22 out of 28) of the subjects were women, and our data are consistent with a previous EEG report that specifically investigated the effect of gender on emotional face processing. <xref ref-type="bibr" rid="B58">Proverbio et al. (2006)</xref> in fact found maximal P1 amplitude over the right occipital cortex in both genders, consistent with our finding that, in the emotion task, occipital activity around 100 ms was right lateralized. The lack of right lateralization in our data during the N170 may appear inconsistent with the widely accepted right predominance of the FFA in face processing (<xref ref-type="bibr" rid="B41">Kanwisher and Yovel, 2006</xref>). However, this again agrees with <xref ref-type="bibr" rid="B58">Proverbio et al.&#x02019;s (2006)</xref> finding of a right-lateralized N170 only in men, whereas women exhibited a bilateral pattern. These results can help clarify the inconsistencies in the literature on the right-hemisphere advantage of the occipito-temporal cortices in face processing and confirm the relevance of incorporating gender information.</p>
</sec>
<sec>
<title>ANGRY FACIAL EXPRESSION PROCESSING</title>
<p>Although both static and emotional features appeared under-processed by canonical face processing cortical areas, unattended angry expressions were able to activate the TPJ, a cortical expanse implicated in a wide spectrum of high-order cognitive functions ranging from social cognition (<xref ref-type="bibr" rid="B65">Saxe and Kanwisher, 2003</xref>) to attention selection (<xref ref-type="bibr" rid="B15">Corbetta and Shulman, 2002</xref>). The latter branch of investigation showed that the TPJ is part of the VAN, a fronto-parietal network that, during focused activities, is normally involved in re-orienting (shifting) attention toward stimuli relevant to the immediate goal. Nevertheless, because the attentional focus covered a similar area in both tasks, no reorienting process was expected, as our IPS activity also indicates: the IPS is in fact part of the DAN, which contains the circuitry proper for implementing such focus reorienting, and it was not modulated by our experimental conditions. The absence of any modulation over frontal areas might be interpreted accordingly; the integration between the ventral and dorsal attention networks needed for attention re-orienting occurs in those frontal areas where the two networks largely overlap (<xref ref-type="bibr" rid="B19">Fox et al., 2006</xref>).</p>
<p>Thus, the present findings support the proposal that VAN activation, at least in its parietal areas, might not be exclusively involved in attentional reorienting. This is consistent with more recent reports suggesting that TPJ activity might be triggered by both external sensory stimuli and internal memory-based information, thus providing bottom&#x02013;up signals to other systems about stimuli relevant for further inspection (<xref ref-type="bibr" rid="B10">Cabeza et al., 2012</xref>). In agreement with the present results, VAN activity has also been observed when behaviorally relevant, rather than merely salient, stimuli are presented while the individual is engaged in another task (<xref ref-type="bibr" rid="B14">Corbetta et al., 2008</xref>). Accordingly, the activation of the TPJ only when the unattended face displayed an angry expression suggests that negative emotions can pre-attentively evoke bottom&#x02013;up cortical signals, according to their behavioral relevance, even when attention is focused on emotion-irrelevant features in a task that we assumed exhausted the attentional resources needed to process the emotional content of faces. Because the ventral stream and STS were not modulated by the degree of unattended emotional content, and because the VAN is considered a supramodal network (<xref ref-type="bibr" rid="B45">Macaluso et al., 2002</xref>; <xref ref-type="bibr" rid="B26">Green et al., 2011</xref>) presumably unable by itself to decode the threatening pattern of a facial expression, we suggest that TPJ activation might be triggered by other brain regions. 
Several neuroimaging studies have suggested that, in parallel with the cortical stream (<xref ref-type="bibr" rid="B51">Palermo and Rhodes, 2007</xref>), a subcortical pathway, which reaches the amygdala through fast and coarse inputs originating in the superior colliculus and finally projects onto fronto-parietal areas, implements a brain circuitry specialized in emotional attention (<xref ref-type="bibr" rid="B77">Vuilleumier, 2005</xref>). This circuitry, likely modulated in part by the attentional focus (<xref ref-type="bibr" rid="B57">Pourtois et al., 2013</xref>), is involved in the rapid and automatic detection of negative facial expressions (for a review, see <xref ref-type="bibr" rid="B79">Vuilleumier and Pourtois, 2007</xref>), and it seems to play a crucial role in directing attention and information processing toward threatening stimuli (<xref ref-type="bibr" rid="B49">Ohman and Mineka, 2001</xref>). Because reconstructing amygdala activity with EEG presents several accuracy limitations, as will be discussed later, further studies integrating EEG with neuroimaging techniques are surely needed, but our data are consistent with such a model. In fact, a previous MEG study showed that the amygdala activates as early as 100 ms after stimulus presentation (<xref ref-type="bibr" rid="B72">Streit et al., 2003</xref>), a latency early enough to trigger TPJ activation at &#x0223C;150&#x02013;170 ms. The present TPJ activation at &#x0223C;170 ms is consistent with a recent ERP study investigating the threat detection advantage (<xref ref-type="bibr" rid="B18">Feldmann-W&#x000FC;stefeld et al., 2011</xref>), which revealed that the processing of angry and happy expressions starts to differ at &#x0223C;160 ms. This suggests that angry faces may trigger a fear module that enables their rapid processing and recruits additional attentional resources, possibly by means of the TPJ, as hypothesized here.</p>
<p>In conclusion, within the VAN, TPJ activation at this early latency primarily signals the behavioral relevance of a task-irrelevant aversive stimulus, irrespective of whether that stimulus requires a physical shift of attention (which would involve the dorsal network). The fact that this trigger was not followed by actual over-processing of face features is likely due to the task demands, which, immediately after face offset (&#x0223C;200 ms), required subjects to focus on word onset and on the response related to its emotional valence.</p>
</sec>
<sec>
<title>DIFFERENCES BETWEEN SOURCES AND SENSORS ANALYSIS</title>
<p>In the present paper, we aimed to provide an ERP equivalent of the activations produced by source analysis. We thus focused this analysis only on the time windows and sensor clusters surrounding the cortical areas affected by our experimental conditions. ERP analysis found that the occipital P1 peak was more right lateralized for attended than for ignored emotions, but it was unable to assess the selectivity toward attended emotional faces that disappeared in the shape task. Similarly, ERP analysis could detect the interaction between task and emotion at &#x0223C;170 ms in the right occipito-temporal cluster, but it did not find a significant difference between angry and non-angry ignored faces. Of course, the current ERP approach is only one of many possible approaches, and we are not claiming that another ERP analysis would have been unable to detect the same effects found with source analysis. However, even if such an effect had been found in a cluster or in a single channel (e.g., CP4 or CP6), it would have been impossible to attribute it clearly to one of the areas beneath or close to the sensor cluster. In principle, both the pSTS and BA37 would have been valid candidates, and we could have argued that, because they are part of the cortical stream supposedly responsible for extracting face features, they would presumably have shown such activity in the attended condition as well; but the doubt would have persisted, and the involvement of the TPJ would have been just one of several possible hypotheses. Instead, source analysis, by calculating the center of gravity of the large ROI covering the temporal and parietal lobes, indicated the involvement of the TPJ.</p>
</sec>
<sec>
<title>METHODOLOGICAL CONSIDERATIONS AND LIMITS OF THE PRESENT INVESTIGATION</title>
<p>The main limits of EEG source analysis are its high sensitivity to artifacts, its low signal-to-noise ratio and its limited spatial resolution. To address these limits, we employed a consolidated methodological approach (<xref ref-type="bibr" rid="B37">Inuggi et al., 2010</xref>, <xref ref-type="bibr" rid="B35">2011a</xref>,<xref ref-type="bibr" rid="B36">b</xref>; <xref ref-type="bibr" rid="B25">Gonzalez-Rosa et al., 2013</xref>), which has consistently yielded results in line with the neuroimaging literature. We used a seed-based analysis instead of a voxel-wise one because this approach is commonly used in both EEG and neuroimaging analyses when strong hypotheses about the involved brain areas are available. In fact, although the experimental task as a whole is novel, the areas involved in the investigated interval have been accurately described in the past, producing a consistent picture that guided and supported our ROI selection. We adopted a conservative approach, selecting ROIs on the outer surface of the brain, where the spatial resolution of EEG source analysis is maximal, and avoiding the investigation of deep brain areas such as the FFA proper, the orbitofrontal and para-hippocampal cortices, and the amygdala. These areas were reported in several neuroimaging studies, but their reconstruction through EEG presents several methodological issues. The accuracy of EEG source analysis is in fact severely degraded by the strong anisotropy and inhomogeneity of head tissues, which blur the emerging signal when they are not accounted for by a proper volume conductor model. Deep sources suffer more from this blurring because the paths separating them from the scalp electrodes cross more tissues of different conductivities than those of superficial sources. 
Concerning the temporal selection, we opted to analyze activity up to &#x0223C;300 ms because we were interested in assessing the automatic processing of face stimuli, aware that later components would have been altered by subjects&#x02019; intentions or strategies for decoding the target stimulus (the word). We investigated the task effect as a between-subject factor because we were interested in maximizing, as much as possible, the unattendedness of facial emotional expressions in the glasses shape task. We feared that if, to counterbalance task order, half of the subjects had performed the emotional task first and then the glasses task, facial emotion might have acquired some relevance even when the glasses task asked subjects to attend and respond only to the glasses&#x02019; shape. In addition, we would have obtained an excessively long task, with unpredictable consequences on subjects&#x02019; attention and performance level and the risk of introducing undesired biases into our results. The failure to locate the areas that actually discriminate and extract the emotional features of faces surely represents a limit of the present exploration. A trend toward higher activation of the pSTS for emotional compared with neutral expressions was found only in the emotion task; however, it was not significant even before applying the Benjamini and Yekutieli correction. This might be due to the spatio-temporal resolution of the method implemented here or, more presumably, to the fact that emotional processing also involves deep brain areas, such as the FFA, orbitofrontal cortices and subcortical regions.</p>
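<p>For readers unfamiliar with the multiple-comparison control cited above, the following is a minimal, illustrative sketch of the Benjamini and Yekutieli (2001) step-up procedure (not the authors&#x02019; actual analysis code); it applies the harmonic-series correction that keeps the false discovery rate controlled under arbitrary dependency among tests:</p>

```python
def benjamini_yekutieli(pvals, q=0.05):
    """Benjamini-Yekutieli step-up procedure controlling the false
    discovery rate at level q under arbitrary dependency (Ann. Stat., 2001).
    Returns a list of booleans marking which hypotheses are rejected.
    Illustrative sketch only."""
    m = len(pvals)
    # harmonic-series term c(m) = sum_{j=1..m} 1/j that makes the
    # procedure valid under any dependency structure among the tests
    c_m = sum(1.0 / j for j in range(1, m + 1))
    # sort p-values ascending while remembering original positions
    order = sorted(range(m), key=lambda i: pvals[i])
    # find the largest rank i (1-based) with p_(i) <= i * q / (m * c(m))
    k = 0
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= rank * q / (m * c_m):
            k = rank
    # reject all hypotheses up to and including that rank
    reject = [False] * m
    for idx in order[:k]:
        reject[idx] = True
    return reject
```

<p>With four tests and p-values (0.001, 0.02, 0.2, 0.9) at q = 0.05, for example, only the smallest p-value survives, since the rank-1 threshold is 0.05 / (4 &#x000D7; 2.083) &#x02248; 0.006.</p>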
</sec>
</sec>
<sec>
<title>CONCLUSION</title>
<p>In the present study, we employed a novel approach to explore the role of attention in emotional face processing by setting up an ecological environment involving faces wearing glasses. Moreover, by overlapping in space both the to-be-attended and the to-be-unattended facial features, we avoided any potential confound produced by attention shifts, so that any emerging differences could be attributed more confidently to the availability of the attentional resources required to deal with facial emotional expressions. In studies reporting emotional processing unaffected by attentional manipulations, the emotion-unattended condition usually did not require investing a great amount of attentional resources; thus, it was difficult to claim that the emotional processing of faces could take place without attention. Here, consistent with our previous behavioral study (<xref ref-type="bibr" rid="B64">Sassi et al., 2014</xref>), in which emotion-irrelevant task demands were progressively increased, we observed that when subjects were involved in an emotion-irrelevant discrimination task that might have depleted attentional resources, behavioral results showed no evidence of affective priming. These results corroborate studies supporting the view that emotional processing requires some attentional resources (<xref ref-type="bibr" rid="B54">Pessoa et al., 2002</xref>, <xref ref-type="bibr" rid="B55">2005</xref>; <xref ref-type="bibr" rid="B17">Eimer et al., 2003</xref>; <xref ref-type="bibr" rid="B32">Holmes et al., 2003</xref>; <xref ref-type="bibr" rid="B50">Okon-Singer et al., 2007</xref>; <xref ref-type="bibr" rid="B69">Silvert et al., 2007</xref>). Importantly, although attentional resources were allocated to detecting the characteristics of the glasses, the angry facial expression activated the temporo&#x02013;parietal area of the VAN. 
This automatic activation presumably represents a pre-attentive bottom&#x02013;up trigger, possibly evoked by a subcortical pathway centered on the amygdala, which, independently of the ventral stream areas, signals the presence of unattended and task-irrelevant but potentially threatening stimuli (<xref ref-type="bibr" rid="B49">Ohman and Mineka, 2001</xref>). These results are in line with more recent reports (<xref ref-type="bibr" rid="B10">Cabeza et al., 2012</xref>) that disentangle TPJ activation from the re-orienting process involving the DAN and can, for example, explain why visual search for angry faces is more efficient when they are displayed among several distractors (the anger superiority effect; <xref ref-type="bibr" rid="B28">Hansen and Hansen, 1988</xref>).</p>
<p>From an evolutionary point of view, the presence of such an early pre-attentive response, which appears even when subjects are comfortably seated in a safe environment, may increase the potential for a faster and more accurate identification of aversive emotional expressions (in the absence of proper inhibitory top&#x02013;down signals aimed at ignoring them, as in the present study). This mechanism would represent a successful adaptive process because a fast and correct prediction of aversive intentions may help observers better adapt their behavior and thus provide a crucial survival advantage (<xref ref-type="bibr" rid="B20">Frank and Sabatinelli, 2012</xref>).</p>
</sec>
<sec>
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<ack>
<p>This study was supported by grants CSD2008-00048 and PSI2011-23340 from the Spanish Ministerio de Econom&#x000ED;a y Competitividad. The authors would also like to thank Francisco Garc&#x000ED;a and Violeta Pina for their help with the EEG recordings.</p>
</ack>
<ref-list>
<title>REFERENCES</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Adolphs</surname> <given-names>R.</given-names></name> <name><surname>Jansari</surname> <given-names>A.</given-names></name> <name><surname>Tranel</surname> <given-names>D.</given-names></name></person-group> (<year>2001</year>). <article-title>Hemispheric perception of emotional valence from facial expressions.</article-title> <source><italic>Neuropsychology</italic></source> <volume>15</volume> <fpage>516</fpage>&#x02013;<lpage>524</lpage>. <pub-id pub-id-type="doi">10.1037/0894-4105.15.4.516</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Allison</surname> <given-names>T.</given-names></name> <name><surname>Puce</surname> <given-names>A.</given-names></name> <name><surname>McCarthy</surname> <given-names>G.</given-names></name></person-group> (<year>2000</year>). <article-title>Social perception from visual cues: role of the STS region.</article-title> <source><italic>Trends Cogn. Sci.</italic></source> <volume>4</volume> <fpage>267</fpage>&#x02013;<lpage>278</lpage>. <pub-id pub-id-type="doi">10.1016/S1364-6613(00)01501-1</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Anderson</surname> <given-names>A. K.</given-names></name></person-group> (<year>2005</year>). <article-title>Affective influences on the attentional dynamics supporting awareness.</article-title> <source><italic>J. Exp. Psychol. Gen.</italic></source> <volume>134</volume> <fpage>258</fpage>&#x02013;<lpage>281</lpage>. <pub-id pub-id-type="doi">10.1037/0096-3445.134.2.258</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Anderson</surname> <given-names>A. K.</given-names></name> <name><surname>Christoff</surname> <given-names>K.</given-names></name> <name><surname>Panitz</surname> <given-names>D.</given-names></name> <name><surname>De Rosa</surname> <given-names>E.</given-names></name> <name><surname>Gabrieli</surname> <given-names>J. D.</given-names></name></person-group> (<year>2003</year>). <article-title>Neural correlates of the automatic processing of threat facial signals.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>23</volume> <fpage>5627</fpage>&#x02013;<lpage>5633</lpage>.</citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bayle</surname> <given-names>D. J.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2010</year>). <article-title>Attention inhibition of early cortical activation to fearful faces.</article-title> <source><italic>Brain Res.</italic></source> <volume>1313</volume> <fpage>113</fpage>&#x02013;<lpage>123</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2009.11.060</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Benjamini</surname> <given-names>Y.</given-names></name> <name><surname>Yekutieli</surname> <given-names>D.</given-names></name></person-group> (<year>2001</year>). <article-title>The control of the false discovery rate in multiple testing under dependency.</article-title> <source><italic>Ann. Stat.</italic></source> <volume>29</volume> <fpage>1165</fpage>&#x02013;<lpage>1188</lpage>. <pub-id pub-id-type="doi">10.1214/aos/1013699998</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bentin</surname> <given-names>S.</given-names></name> <name><surname>Allison</surname> <given-names>T.</given-names></name> <name><surname>Puce</surname> <given-names>A.</given-names></name> <name><surname>Perez</surname> <given-names>E.</given-names></name> <name><surname>McCarthy</surname> <given-names>G.</given-names></name></person-group> (<year>1996</year>). <article-title>Electrophysiological studies of face perception in humans.</article-title> <source><italic>J. Cogn. Neurosci.</italic></source> <volume>8</volume> <fpage>551</fpage>&#x02013;<lpage>565</lpage>. <pub-id pub-id-type="doi">10.1162/jocn.1996.8.6.551</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Borod</surname> <given-names>J. C.</given-names></name> <name><surname>Cicero</surname> <given-names>B. A.</given-names></name> <name><surname>Obler</surname> <given-names>L. K.</given-names></name> <name><surname>Welkowitz</surname> <given-names>J.</given-names></name> <name><surname>Erhan</surname> <given-names>H. M.</given-names></name> <name><surname>Santschi</surname> <given-names>C.</given-names></name><etal/></person-group> (<year>1998</year>). <article-title>Right hemisphere emotional perception: evidence across multiple channels.</article-title> <source><italic>Neuropsychology</italic></source> <volume>12</volume> <fpage>446</fpage>&#x02013;<lpage>458</lpage>. <pub-id pub-id-type="doi">10.1037/0894-4105.12.3.446</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bourne</surname> <given-names>V. J.</given-names></name></person-group> (<year>2010</year>). <article-title>How are emotions lateralised in the brain? Contrasting existing hypotheses using the chimeric faces test.</article-title> <source><italic>Cogn. Emot.</italic></source> <volume>24</volume> <fpage>903</fpage>&#x02013;<lpage>911</lpage>. <pub-id pub-id-type="doi">10.1080/02699930903007714</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cabeza</surname> <given-names>R.</given-names></name> <name><surname>Ciaramelli</surname> <given-names>E.</given-names></name> <name><surname>Moscovitch</surname> <given-names>M.</given-names></name></person-group> (<year>2012</year>). <article-title>Cognitive contributions of the ventral parietal cortex: an integrative theoretical account.</article-title> <source><italic>Trends Cogn. Sci.</italic></source> <volume>16</volume> <fpage>338</fpage>&#x02013;<lpage>352</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2012.04.008</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Calvo</surname> <given-names>M. G.</given-names></name> <name><surname>Avero</surname> <given-names>P.</given-names></name> <name><surname>Lundqvist</surname> <given-names>D.</given-names></name></person-group> (<year>2006</year>). <article-title>Facilitated detection of angry faces: initial orienting and processing efficiency.</article-title> <source><italic>Cogn. Emot.</italic></source> <volume>20</volume> <fpage>785</fpage>&#x02013;<lpage>811</lpage>. <pub-id pub-id-type="doi">10.1080/02699930500465224</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Carreti&#x000E9;</surname> <given-names>L.</given-names></name> <name><surname>Hinojosa</surname> <given-names>J. A.</given-names></name> <name><surname>Mart&#x000ED;n-Loeches</surname> <given-names>M.</given-names></name> <name><surname>Mercado</surname> <given-names>F.</given-names></name> <name><surname>Tapia</surname> <given-names>M.</given-names></name></person-group> (<year>2004</year>). <article-title>Automatic attention to emotional stimuli: neural correlates.</article-title> <source><italic>Hum. Brain Mapp.</italic></source> <volume>22</volume> <fpage>290</fpage>&#x02013;<lpage>299</lpage>. <pub-id pub-id-type="doi">10.1002/hbm.20037</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Compton</surname> <given-names>R. J.</given-names></name></person-group> (<year>2003</year>). <article-title>The interface between emotion and attention: a review of evidence from psychology and neuroscience.</article-title> <source><italic>Behav. Cogn. Neurosci. Rev.</italic></source> <volume>2</volume> <fpage>115</fpage>&#x02013;<lpage>129</lpage>. <pub-id pub-id-type="doi">10.1177/1534582303002002003</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Corbetta</surname> <given-names>M.</given-names></name> <name><surname>Patel</surname> <given-names>G.</given-names></name> <name><surname>Shulman</surname> <given-names>G. L.</given-names></name></person-group> (<year>2008</year>). <article-title>The reorienting system of the human brain: from environment to theory of mind.</article-title> <source><italic>Neuron</italic></source> <volume>58</volume> <fpage>306</fpage>&#x02013;<lpage>324</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuron.2008.04.017</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Corbetta</surname> <given-names>M.</given-names></name> <name><surname>Shulman</surname> <given-names>G. L.</given-names></name></person-group> (<year>2002</year>). <article-title>Control of goal-directed and stimulus-driven attention in the brain.</article-title> <source><italic>Nat. Rev. Neurosci.</italic></source> <volume>3</volume> <fpage>201</fpage>&#x02013;<lpage>215</lpage>. <pub-id pub-id-type="doi">10.1038/nrn755</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name> <name><surname>Holmes</surname> <given-names>A.</given-names></name></person-group> (<year>2007</year>). <article-title>Event-related brain potential correlates of emotional face processing.</article-title> <source><italic>Neuropsychologia</italic></source> <volume>45</volume> <fpage>15</fpage>&#x02013;<lpage>31</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2006.04.022</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name> <name><surname>Holmes</surname> <given-names>A.</given-names></name> <name><surname>McGlone</surname> <given-names>F. P.</given-names></name></person-group> (<year>2003</year>). <article-title>The role of spatial attention in the processing of facial expression: an ERP study of rapid brain responses to six basic emotions.</article-title> <source><italic>Cogn. Affect. Behav. Neurosci.</italic></source> <volume>3</volume> <fpage>97</fpage>&#x02013;<lpage>110</lpage>. <pub-id pub-id-type="doi">10.3758/CABN.3.2.97</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Feldmann-W&#x000FC;stefeld</surname> <given-names>T.</given-names></name> <name><surname>Schmidt-Daffy</surname> <given-names>M.</given-names></name> <name><surname>Schub&#x000F6;</surname> <given-names>A.</given-names></name></person-group> (<year>2011</year>). <article-title>Neural evidence for the threat detection advantage: differential attention allocation to angry and happy faces.</article-title> <source><italic>Psychophysiology</italic></source> <volume>48</volume> <fpage>697</fpage>&#x02013;<lpage>707</lpage>. <pub-id pub-id-type="doi">10.1111/j.1469-8986.2010.01130.x</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fox</surname> <given-names>M. D.</given-names></name><name><surname>Corbetta</surname> <given-names>M.</given-names></name> <name><surname>Snyder</surname> <given-names>A. Z.</given-names></name> <name><surname>Vincent</surname> <given-names>J. L.</given-names></name> <name><surname>Raichle</surname> <given-names>M. E.</given-names></name></person-group> (<year>2006</year>). <article-title>Spontaneous neuronal activity distinguishes human dorsal and ventral attention systems.</article-title> <source><italic>Proc. Natl. Acad. Sci. U.S.A.</italic></source> <volume>103</volume> <fpage>10046</fpage>&#x02013;<lpage>10051</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.0604187103</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Frank</surname> <given-names>D. W.</given-names></name> <name><surname>Sabatinelli</surname> <given-names>D.</given-names></name></person-group> (<year>2012</year>). <article-title>Stimulus-driven reorienting in the ventral frontoparietal attention network: the role of emotional content.</article-title> <source><italic>Front. Hum. Neurosci.</italic></source> <volume>6</volume>:<issue>116</issue>. <pub-id pub-id-type="doi">10.3389/fnhum.2012.00116</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fuchs</surname> <given-names>M.</given-names></name> <name><surname>Kastner</surname> <given-names>J.</given-names></name> <name><surname>Wagner</surname> <given-names>M.</given-names></name> <name><surname>Hawes</surname> <given-names>S.</given-names></name> <name><surname>Ebersole</surname> <given-names>J. S.</given-names></name></person-group> (<year>2002</year>). <article-title>A standardized boundary element method volume conductor model.</article-title> <source><italic>Clin. Neurophysiol.</italic></source> <volume>113</volume> <fpage>702</fpage>&#x02013;<lpage>712</lpage>. <pub-id pub-id-type="doi">10.1016/S1388-2457(02)00030-5</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fuchs</surname> <given-names>M.</given-names></name> <name><surname>Wagner</surname> <given-names>M.</given-names></name> <name><surname>Kohler</surname> <given-names>T.</given-names></name> <name><surname>Wischmann</surname> <given-names>H. A.</given-names></name></person-group> (<year>1999</year>). <article-title>Linear and nonlinear current density reconstructions.</article-title> <source><italic>J. Clin. Neurophysiol.</italic></source> <volume>16</volume> <fpage>267</fpage>&#x02013;<lpage>295</lpage>. <pub-id pub-id-type="doi">10.1097/00004691-199905000-00006</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fusar-Poli</surname> <given-names>P.</given-names></name> <name><surname>Placentino</surname> <given-names>A.</given-names></name> <name><surname>Carletti</surname> <given-names>F.</given-names></name> <name><surname>Allen</surname> <given-names>P.</given-names></name> <name><surname>Landi</surname> <given-names>P.</given-names></name> <name><surname>Abbamonte</surname> <given-names>M.</given-names></name><etal/></person-group> (<year>2009</year>). <article-title>Laterality effect on emotional faces processing: ALE meta-analysis of evidence.</article-title> <source><italic>Neurosci. Lett.</italic></source> <volume>452</volume> <fpage>262</fpage>&#x02013;<lpage>267</lpage>. <pub-id pub-id-type="doi">10.1016/j.neulet.2009.01.065</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Georgiou</surname> <given-names>G. A.</given-names></name> <name><surname>Bleakley</surname> <given-names>C.</given-names></name> <name><surname>Hayward</surname> <given-names>J.</given-names></name> <name><surname>Russo</surname> <given-names>R.</given-names></name> <name><surname>Dutton</surname> <given-names>K.</given-names></name> <name><surname>Eltiti</surname> <given-names>S.</given-names></name><etal/></person-group> (<year>2005</year>). <article-title>Focusing on fear: attentional disengagement from emotional faces.</article-title> <source><italic>Vis. Cogn.</italic></source> <volume>12</volume> <fpage>145</fpage>&#x02013;<lpage>158</lpage>. <pub-id pub-id-type="doi">10.1080/13506280444000076</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gonzalez-Rosa</surname> <given-names>J. J.</given-names></name> <name><surname>Inuggi</surname> <given-names>A.</given-names></name> <name><surname>Blasi</surname> <given-names>V.</given-names></name> <name><surname>Cursi</surname> <given-names>M.</given-names></name> <name><surname>Annovazzi</surname> <given-names>P.</given-names></name> <name><surname>Comi</surname> <given-names>G.</given-names></name><etal/></person-group> (<year>2013</year>). <article-title>Response competition and response inhibition during different choice-discrimination tasks: evidence from ERP measured inside MRI scanner.</article-title> <source><italic>Int. J. Psychophysiol.</italic></source> <volume>89</volume> <fpage>37</fpage>&#x02013;<lpage>47</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2013.04.021</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Green</surname> <given-names>J. J.</given-names></name><name><surname>Doesburg</surname> <given-names>S. M.</given-names></name> <name><surname>Ward</surname> <given-names>L. M.</given-names></name> <name><surname>McDonald</surname> <given-names>J. J.</given-names></name></person-group> (<year>2011</year>). <article-title>Electrical neuroimaging of voluntary audiospatial attention: evidence for a supramodal attention control network.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>31</volume> <fpage>3560</fpage>&#x02013;<lpage>3564</lpage>. <pub-id pub-id-type="doi">10.1523/JNEUROSCI.5758-10.2011</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Halgren</surname> <given-names>E.</given-names></name> <name><surname>Raij</surname> <given-names>T.</given-names></name> <name><surname>Marinkovic</surname> <given-names>K.</given-names></name> <name><surname>Jousm&#x000E4;ki</surname> <given-names>V.</given-names></name> <name><surname>Hari</surname> <given-names>R.</given-names></name></person-group> (<year>2000</year>). <article-title>Cognitive response profile of the human fusiform face area as determined by MEG.</article-title> <source><italic>Cereb. Cortex</italic></source> <volume>10</volume> <fpage>69</fpage>&#x02013;<lpage>81</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/10.1.69</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hansen</surname> <given-names>C. H.</given-names></name> <name><surname>Hansen</surname> <given-names>R. D.</given-names></name></person-group> (<year>1988</year>). <article-title>Finding the face in the crowd: an anger superiority effect.</article-title> <source><italic>J. Pers. Soc. Psychol.</italic></source> <volume>54</volume> <fpage>917</fpage>&#x02013;<lpage>924</lpage>. <pub-id pub-id-type="doi">10.1037/0022-3514.54.6.917</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hart</surname> <given-names>S. J.</given-names></name> <name><surname>Green</surname> <given-names>S. R.</given-names></name> <name><surname>Casp</surname> <given-names>M.</given-names></name> <name><surname>Belger</surname> <given-names>A.</given-names></name></person-group> (<year>2010</year>). <article-title>Emotional priming effects during Stroop task performance.</article-title> <source><italic>Neuroimage</italic></source> <volume>49</volume> <fpage>2662</fpage>&#x02013;<lpage>2670</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2009.10.076</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haxby</surname> <given-names>J. V.</given-names></name> <name><surname>Hoffman</surname> <given-names>E. A.</given-names></name> <name><surname>Gobbini</surname> <given-names>M. I.</given-names></name></person-group> (<year>2000</year>). <article-title>The distributed human neural system for face perception.</article-title> <source><italic>Trends Cogn. Sci.</italic></source> <volume>4</volume> <fpage>223</fpage>&#x02013;<lpage>233</lpage>. <pub-id pub-id-type="doi">10.1016/S1364-6613(00)01482-0</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hoffman</surname> <given-names>E. A.</given-names></name><name><surname>Haxby</surname> <given-names>J. V.</given-names></name></person-group> (<year>2000</year>). <article-title>Distinct representations of eye gaze and identity in the distributed human neural system for face perception.</article-title> <source><italic>Nat. Neurosci.</italic></source> <volume>3</volume> <fpage>80</fpage>&#x02013;<lpage>84</lpage>. <pub-id pub-id-type="doi">10.1038/71152</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Holmes</surname> <given-names>A.</given-names></name> <name><surname>Vuilleumier</surname> <given-names>P.</given-names></name> <name><surname>Eimer</surname> <given-names>M.</given-names></name></person-group> (<year>2003</year>). <article-title>The processing of emotional facial expression is gated by spatial attention: evidence from event-related brain potentials.</article-title> <source><italic>Brain Res. Cogn. Brain Res.</italic></source> <volume>16</volume> <fpage>174</fpage>&#x02013;<lpage>184</lpage>. <pub-id pub-id-type="doi">10.1016/S0926-6410(02)00268-9</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Horovitz</surname> <given-names>S. G.</given-names></name> <name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Skudlarski</surname> <given-names>P.</given-names></name> <name><surname>Gore</surname> <given-names>J. C.</given-names></name></person-group> (<year>2004</year>). <article-title>Parametric design and correlational analyses help integrating fMRI and electrophysiological data during face processing.</article-title> <source><italic>Neuroimage</italic></source> <volume>22</volume> <fpage>1587</fpage>&#x02013;<lpage>1595</lpage>.</citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hyvarinen</surname> <given-names>A.</given-names></name></person-group> (<year>1999</year>). <article-title>Fast and robust fixed-point algorithms for independent component analysis.</article-title> <source><italic>IEEE Trans. Neural Netw.</italic></source> <volume>10</volume> <fpage>626</fpage>&#x02013;<lpage>634</lpage>. <pub-id pub-id-type="doi">10.1109/72.761722</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Inuggi</surname> <given-names>A.</given-names></name> <name><surname>Amato</surname> <given-names>N.</given-names></name> <name><surname>Magnani</surname> <given-names>G.</given-names></name> <name><surname>Gonz&#x000E1;lez-Rosa</surname> <given-names>J. J.</given-names></name> <name><surname>Chieffo</surname> <given-names>R.</given-names></name> <name><surname>Comi</surname> <given-names>G.</given-names></name><etal/></person-group> (<year>2011a</year>). <article-title>Cortical control of unilateral simple movement in healthy aging.</article-title> <source><italic>Neurobiol. Aging</italic></source> <volume>32</volume> <fpage>524</fpage>&#x02013;<lpage>538</lpage>. <pub-id pub-id-type="doi">10.1016/j.neurobiolaging.2009.02.020</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Inuggi</surname> <given-names>A.</given-names></name> <name><surname>Riva</surname> <given-names>N.</given-names></name> <name><surname>Gonz&#x000E1;lez-Rosa</surname> <given-names>J. J.</given-names></name> <name><surname>Amadio</surname> <given-names>S.</given-names></name> <name><surname>Amato</surname> <given-names>N.</given-names></name> <name><surname>Fazio</surname> <given-names>R.</given-names></name><etal/></person-group> (<year>2011b</year>). <article-title>Compensatory movement-related recruitment in amyotrophic lateral sclerosis patients with dominant upper motor neuron signs: an EEG source analysis study.</article-title> <source><italic>Brain Res.</italic></source> <volume>255</volume> <fpage>37</fpage>&#x02013;<lpage>46</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2011.09.007</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Inuggi</surname> <given-names>A.</given-names></name> <name><surname>Filippi</surname> <given-names>M.</given-names></name> <name><surname>Chieffo</surname> <given-names>R.</given-names></name> <name><surname>Agosta</surname> <given-names>F.</given-names></name> <name><surname>Rocca</surname> <given-names>M. A.</given-names></name> <name><surname>Gonz&#x000E1;lez-Rosa</surname> <given-names>J. J.</given-names></name><etal/></person-group> (<year>2010</year>). <article-title>Motor area localization using fMRI-constrained cortical current density reconstruction of movement-related cortical potentials, a comparison with fMRI and TMS mapping.</article-title> <source><italic>Brain Res.</italic></source> <volume>255</volume> <fpage>68</fpage>&#x02013;<lpage>78</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2009.10.042</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ishai</surname> <given-names>A.</given-names></name> <name><surname>Ungerleider</surname> <given-names>L. G.</given-names></name> <name><surname>Martin</surname> <given-names>A.</given-names></name> <name><surname>Schouten</surname> <given-names>J. L.</given-names></name> <name><surname>Haxby</surname> <given-names>J. V.</given-names></name></person-group> (<year>1999</year>). <article-title>Distributed representation of objects in the human ventral visual pathway.</article-title> <source><italic>Proc. Natl. Acad. Sci. U.S.A.</italic></source> <volume>96</volume> <fpage>9379</fpage>&#x02013;<lpage>9384</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.96.16.9379</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2004</year>). <article-title>Source analysis of the N170 to faces and objects.</article-title> <source><italic>Neuroreport</italic></source> <volume>15</volume> <fpage>1261</fpage>&#x02013;<lpage>1265</lpage>.</citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kanwisher</surname> <given-names>N.</given-names></name> <name><surname>McDermott</surname> <given-names>J.</given-names></name> <name><surname>Chun</surname> <given-names>M. M.</given-names></name></person-group> (<year>1997</year>). <article-title>The fusiform face area: a module in human extrastriate cortex specialized for face perception.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>17</volume> <fpage>4302</fpage>&#x02013;<lpage>4311</lpage>.</citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kanwisher</surname> <given-names>N.</given-names></name> <name><surname>Yovel</surname> <given-names>G.</given-names></name></person-group> (<year>2006</year>). <article-title>The fusiform face area: a cortical region specialized for the perception of faces.</article-title> <source><italic>Philos. Trans. R. Soc. Lond. B Biol. Sci.</italic></source> <volume>361</volume> <fpage>2109</fpage>&#x02013;<lpage>2128</lpage>. <pub-id pub-id-type="doi">10.1098/rstb.2006.1934</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lavie</surname> <given-names>N.</given-names></name></person-group> (<year>1995</year>). <article-title>Perceptual load as a necessary condition for selective attention.</article-title> <source><italic>J. Exp. Psychol. Hum. Percept. Perform.</italic></source> <volume>21</volume> <fpage>451</fpage>&#x02013;<lpage>468</lpage>. <pub-id pub-id-type="doi">10.1037/0096-1523.21.3.451</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Liu</surname> <given-names>T.</given-names></name> <name><surname>Pinheiro</surname> <given-names>A.</given-names></name> <name><surname>Zhao</surname> <given-names>Z.</given-names></name> <name><surname>Nestor</surname> <given-names>P. G.</given-names></name><name><surname>McCarley</surname> <given-names>R. W.</given-names></name> <name><surname>Niznikiewicz</surname> <given-names>M. A.</given-names></name></person-group> (<year>2012</year>). <article-title>Emotional cues during simultaneous face and voice processing: electrophysiological insights.</article-title> <source><italic>PLoS ONE</italic></source> <volume>7</volume>:<issue>e31001</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0031001</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Luo</surname> <given-names>W.</given-names></name> <name><surname>Feng</surname> <given-names>W.</given-names></name> <name><surname>He</surname> <given-names>W.</given-names></name> <name><surname>Wang</surname> <given-names>N.-Y.</given-names></name> <name><surname>Luo</surname> <given-names>Y.-J.</given-names></name></person-group> (<year>2010</year>). <article-title>Three stages of facial expression processing: ERP study with rapid serial visual presentation.</article-title> <source><italic>Neuroimage</italic></source> <volume>49</volume> <fpage>1857</fpage>&#x02013;<lpage>1867</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2009.09.018</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Macaluso</surname> <given-names>E.</given-names></name> <name><surname>Frith</surname> <given-names>C. D.</given-names></name> <name><surname>Driver</surname> <given-names>J.</given-names></name></person-group> (<year>2002</year>). <article-title>Supramodal effects of covert spatial orienting triggered by visual or tactile events.</article-title> <source><italic>J. Cogn. Neurosci.</italic></source> <volume>14</volume> <fpage>389</fpage>&#x02013;<lpage>401</lpage>. <pub-id pub-id-type="doi">10.1162/089892902317361912</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mandal</surname> <given-names>M. K.</given-names></name> <name><surname>Tandon</surname> <given-names>S. C.</given-names></name> <name><surname>Asthana</surname> <given-names>H. S.</given-names></name></person-group> (<year>1991</year>). <article-title>Right brain damage impairs recognition of negative emotions.</article-title> <source><italic>Cortex</italic></source> <volume>27</volume> <fpage>247</fpage>&#x02013;<lpage>253</lpage>.</citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mort</surname> <given-names>D. J.</given-names></name> <name><surname>Malhotra</surname> <given-names>P.</given-names></name> <name><surname>Mannan</surname> <given-names>S. K.</given-names></name> <name><surname>Rorden</surname> <given-names>C.</given-names></name> <name><surname>Pambakian</surname> <given-names>A.</given-names></name> <name><surname>Kennard</surname> <given-names>C.</given-names></name><etal/></person-group> (<year>2003</year>). <article-title>The anatomy of visual neglect.</article-title> <source><italic>Brain</italic></source> <volume>126</volume> <fpage>1986</fpage>&#x02013;<lpage>1997</lpage>. <pub-id pub-id-type="doi">10.1093/brain/awg200</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ochsner</surname> <given-names>K. N.</given-names></name> <name><surname>Gross</surname> <given-names>J. J.</given-names></name></person-group> (<year>2005</year>). <article-title>The cognitive control of emotion.</article-title> <source><italic>Trends Cogn. Sci.</italic></source> <volume>9</volume> <fpage>242</fpage>&#x02013;<lpage>249</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2005.03.010</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ohman</surname> <given-names>A.</given-names></name> <name><surname>Mineka</surname> <given-names>S.</given-names></name></person-group> (<year>2001</year>). <article-title>Fears, phobias, and preparedness: toward an evolved module of fear and fear learning.</article-title> <source><italic>Psychol. Rev.</italic></source> <volume>108</volume> <fpage>483</fpage>&#x02013;<lpage>522</lpage>. <pub-id pub-id-type="doi">10.1037/0033-295X.108.3.483</pub-id></citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Okon-Singer</surname> <given-names>H.</given-names></name> <name><surname>Tzelgov</surname> <given-names>J.</given-names></name> <name><surname>Henik</surname> <given-names>A.</given-names></name></person-group> (<year>2007</year>). <article-title>Distinguishing between automaticity and attention in the processing of emotionally significant stimuli.</article-title> <source><italic>Emotion</italic></source> <volume>7</volume> <fpage>147</fpage>&#x02013;<lpage>157</lpage>. <pub-id pub-id-type="doi">10.1037/1528-3542.7.1.147</pub-id></citation></ref>
<ref id="B51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Palermo</surname> <given-names>R.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>2007</year>). <article-title>Are you always on my mind? A review of how face perception and attention interact.</article-title> <source><italic>Neuropsychologia</italic></source> <volume>45</volume> <fpage>75</fpage>&#x02013;<lpage>92</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2006.04.025</pub-id></citation></ref>
<ref id="B52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pascual-Marqui</surname> <given-names>R. D.</given-names></name></person-group> (<year>2002</year>). <article-title>Standardized low resolution brain electromagnetic tomography (sLORETA): technical details.</article-title> <source><italic>Methods Find. Exp. Clin. Pharmacol.</italic></source> <volume>24</volume> <fpage>5</fpage>&#x02013;<lpage>12</lpage>.</citation></ref>
<ref id="B53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pessoa</surname> <given-names>L.</given-names></name></person-group> (<year>2005</year>). <article-title>To what extent are emotional visual stimuli processed without attention and awareness?</article-title> <source><italic>Curr. Opin. Neurobiol.</italic></source> <volume>15</volume> <fpage>188</fpage>&#x02013;<lpage>196</lpage>. <pub-id pub-id-type="doi">10.1016/j.conb.2005.03.002</pub-id></citation></ref>
<ref id="B54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pessoa</surname> <given-names>L.</given-names></name> <name><surname>McKenna</surname> <given-names>M.</given-names></name> <name><surname>Gutierrez</surname> <given-names>E.</given-names></name> <name><surname>Ungerleider</surname> <given-names>L. G.</given-names></name></person-group> (<year>2002</year>). <article-title>Neural processing of emotional faces requires attention.</article-title> <source><italic>Proc. Natl. Acad. Sci. U.S.A.</italic></source> <volume>99</volume> <fpage>11458</fpage>&#x02013;<lpage>11463</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.172403899</pub-id></citation></ref>
<ref id="B55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pessoa</surname> <given-names>L.</given-names></name> <name><surname>Padmala</surname> <given-names>S.</given-names></name> <name><surname>Morland</surname> <given-names>T.</given-names></name></person-group> (<year>2005</year>). <article-title>Fate of unattended fearful faces in the amygdala is determined by both attentional resources and cognitive modulation.</article-title> <source><italic>Neuroimage</italic></source> <volume>28</volume> <fpage>249</fpage>&#x02013;<lpage>255</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2005.05.048</pub-id></citation></ref>
<ref id="B56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pourtois</surname> <given-names>G.</given-names></name> <name><surname>Grandjean</surname> <given-names>D.</given-names></name> <name><surname>Sander</surname> <given-names>D.</given-names></name> <name><surname>Vuilleumier</surname> <given-names>P.</given-names></name></person-group> (<year>2004</year>). <article-title>Electrophysiological correlates of rapid spatial orienting towards fearful faces.</article-title> <source><italic>Cereb. Cortex</italic></source> <volume>14</volume> <fpage>619</fpage>&#x02013;<lpage>633</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhh023</pub-id></citation></ref>
<ref id="B57"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pourtois</surname> <given-names>G.</given-names></name> <name><surname>Schettino</surname> <given-names>A.</given-names></name> <name><surname>Vuilleumier</surname> <given-names>P.</given-names></name></person-group> (<year>2013</year>). <article-title>Brain mechanisms for emotional influences on perception and attention: what is magic and what is not.</article-title> <source><italic>Biol. Psychol.</italic></source> <volume>92</volume> <fpage>492</fpage>&#x02013;<lpage>512</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsycho.2012.02.007</pub-id></citation></ref>
<ref id="B58"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Proverbio</surname> <given-names>A. M.</given-names></name><name><surname>Brignone</surname> <given-names>V.</given-names></name> <name><surname>Matarazzo</surname> <given-names>S.</given-names></name> <name><surname>Del Zotto</surname> <given-names>M.</given-names></name> <name><surname>Zani</surname> <given-names>A.</given-names></name></person-group> (<year>2006</year>). <article-title>Gender differences in hemispheric asymmetry for face processing.</article-title> <source><italic>BMC Neurosci.</italic></source> <volume>7</volume>:<issue>44</issue>. <pub-id pub-id-type="doi">10.1186/1471-2202-7-44</pub-id></citation></ref>
<ref id="B59"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Puce</surname> <given-names>A.</given-names></name> <name><surname>Allison</surname> <given-names>T.</given-names></name> <name><surname>Bentin</surname> <given-names>S.</given-names></name> <name><surname>Gore</surname> <given-names>J. C.</given-names></name><name><surname>McCarthy</surname> <given-names>G.</given-names></name></person-group> (<year>1998</year>). <article-title>Temporal cortex activation in humans viewing eye and mouth movements.</article-title> <source><italic>J. Neurosci.</italic></source> <volume>18</volume> <fpage>2188</fpage>&#x02013;<lpage>2199</lpage>.</citation></ref>
<ref id="B60"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Joyce</surname> <given-names>C. A.</given-names></name> <name><surname>Cottrell</surname> <given-names>G. W.</given-names></name> <name><surname>Tarr</surname> <given-names>M. J.</given-names></name></person-group> (<year>2003</year>). <article-title>Early lateralization and orientation tuning for face, word, and object processing in the visual cortex.</article-title> <source><italic>Neuroimage</italic></source> <volume>20</volume> <fpage>1609</fpage>&#x02013;<lpage>1624</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2003.07.010</pub-id></citation></ref>
<ref id="B61"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sabatinelli</surname> <given-names>D.</given-names></name> <name><surname>Fortune</surname> <given-names>E. E.</given-names></name> <name><surname>Li</surname> <given-names>Q.</given-names></name> <name><surname>Siddiqui</surname> <given-names>A.</given-names></name> <name><surname>Krafft</surname> <given-names>C.</given-names></name> <name><surname>Oliver</surname> <given-names>W. T.</given-names></name><etal/></person-group> (<year>2011</year>). <article-title>Emotional perception: meta-analyses of face and natural scene processing.</article-title> <source><italic>Neuroimage</italic></source> <volume>54</volume> <fpage>2524</fpage>&#x02013;<lpage>2533</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2010.10.011</pub-id></citation></ref>
<ref id="B62"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sadeh</surname> <given-names>B.</given-names></name> <name><surname>Podlipsky</surname> <given-names>I.</given-names></name> <name><surname>Zhdanov</surname> <given-names>A.</given-names></name> <name><surname>Yovel</surname> <given-names>G.</given-names></name></person-group> (<year>2010</year>). <article-title>Event-related potential and functional MRI measures of face-selectivity are highly correlated: a simultaneous ERP-fMRI investigation.</article-title> <source><italic>Hum. Brain Mapp.</italic></source> <volume>31</volume> <fpage>1490</fpage>&#x02013;<lpage>1501</lpage>. <pub-id pub-id-type="doi">10.1002/hbm.20952</pub-id></citation></ref>
<ref id="B63"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Said</surname> <given-names>C. P.</given-names></name> <name><surname>Moore</surname> <given-names>C. D.</given-names></name> <name><surname>Engell</surname> <given-names>A. D.</given-names></name> <name><surname>Todorov</surname> <given-names>A.</given-names></name> <name><surname>Haxby</surname> <given-names>J. V.</given-names></name></person-group> (<year>2010</year>). <article-title>Distributed representations of dynamic facial expressions in the superior temporal sulcus.</article-title> <source><italic>J. Vis.</italic></source> <volume>10</volume> <fpage>1</fpage>&#x02013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1167/10.5.11</pub-id></citation></ref>
<ref id="B64"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sassi</surname> <given-names>F.</given-names></name> <name><surname>Campoy</surname> <given-names>G.</given-names></name> <name><surname>Castillo</surname> <given-names>A.</given-names></name> <name><surname>Inuggi</surname> <given-names>A.</given-names></name> <name><surname>Fuentes</surname> <given-names>L. J.</given-names></name></person-group> (<year>2014</year>). <article-title>Task difficulty and response complexity modulates affective priming by emotional facial expressions.</article-title> <source><italic>Q. J. Exp. Psychol.</italic></source> <volume>67</volume> <fpage>861</fpage>&#x02013;<lpage>871</lpage>. <pub-id pub-id-type="doi">10.1080/17470218.2013.836233</pub-id></citation></ref>
<ref id="B65"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Saxe</surname> <given-names>R.</given-names></name> <name><surname>Kanwisher</surname> <given-names>N.</given-names></name></person-group> (<year>2003</year>). <article-title>People thinking about thinking people. The role of the temporo-parietal junction in &#x0201C;theory of mind.&#x0201D;</article-title> <source><italic>Neuroimage</italic></source> <volume>19</volume> <fpage>1835</fpage>&#x02013;<lpage>1842</lpage>. <pub-id pub-id-type="doi">10.1016/S1053-8119(03)00230-1</pub-id></citation></ref>
<ref id="B66"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Scherg</surname> <given-names>M.</given-names></name> <name><surname>Von Cramon</surname> <given-names>D.</given-names></name></person-group> (<year>1985</year>). <article-title>Two bilateral sources of the late AEP as identified by a spatio-temporal dipole model.</article-title> <source><italic>Electroencephalogr. Clin. Neurophysiol.</italic></source> <volume>62</volume> <fpage>32</fpage>&#x02013;<lpage>44</lpage>. <pub-id pub-id-type="doi">10.1016/0168-5597(85)90033-4</pub-id></citation></ref>
<ref id="B67"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schneider</surname> <given-names>W.</given-names></name> <name><surname>Eschman</surname> <given-names>A.</given-names></name> <name><surname>Zuccolotto</surname> <given-names>A.</given-names></name></person-group> (<year>2002</year>). <source><italic>E-Prime User&#x02019;s Guide</italic>.</source> <publisher-loc>Pittsburgh, PA</publisher-loc>: <publisher-name>Psychology Software Tools, Inc</publisher-name>.</citation></ref>
<ref id="B68"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sebasti&#x000E1;n-Gall&#x000E9;s</surname> <given-names>N.</given-names></name> <name><surname>Mart&#x000ED;</surname> <given-names>M. A.</given-names></name> <name><surname>Carreiras</surname> <given-names>M.</given-names></name> <name><surname>Cuetos</surname> <given-names>F.</given-names></name></person-group> (<year>2000</year>). <source><italic>LEXESP: Una Base de Datos Informatizada del Espa&#x000F1;ol</italic>.</source> <publisher-loc>Barcelona</publisher-loc>: <publisher-name>Universitat de Barcelona</publisher-name>.</citation></ref>
<ref id="B69"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Silvert</surname> <given-names>L.</given-names></name> <name><surname>Lepsien</surname> <given-names>J.</given-names></name> <name><surname>Fragopanagos</surname> <given-names>N.</given-names></name> <name><surname>Goolsby</surname> <given-names>B.</given-names></name> <name><surname>Kiss</surname> <given-names>M.</given-names></name> <name><surname>Taylor</surname> <given-names>J. G.</given-names></name><etal/></person-group> (<year>2007</year>). <article-title>Influence of attentional demands on the processing of emotional facial expressions in the amygdala.</article-title> <source><italic>Neuroimage</italic></source> <volume>38</volume> <fpage>357</fpage>&#x02013;<lpage>366</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2007.07.023</pub-id></citation></ref>
<ref id="B70"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Smith</surname> <given-names>N. K.</given-names></name> <name><surname>Cacioppo</surname> <given-names>J. T.</given-names></name> <name><surname>Larsen</surname> <given-names>J. T.</given-names></name> <name><surname>Chartrand</surname> <given-names>T. L.</given-names></name></person-group> (<year>2003</year>). <article-title>May I have your attention, please: electrocortical responses to positive and negative stimuli.</article-title> <source><italic>Neuropsychologia</italic></source> <volume>41</volume> <fpage>171</fpage>&#x02013;<lpage>183</lpage>. <pub-id pub-id-type="doi">10.1016/S0028-3932(02)00147-1</pub-id></citation></ref>
<ref id="B71"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stenberg</surname> <given-names>G.</given-names></name> <name><surname>Wilking</surname> <given-names>S.</given-names></name> <name><surname>Dahl</surname> <given-names>M.</given-names></name></person-group> (<year>1998</year>). <article-title>Judging words at face value: interference in a word processing task reveals automatic processing of affective facial expressions.</article-title> <source><italic>Cogn. Emot.</italic></source> <volume>12</volume> <fpage>755</fpage>&#x02013;<lpage>782</lpage>. <pub-id pub-id-type="doi">10.1080/026999398379420</pub-id></citation></ref>
<ref id="B72"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Streit</surname> <given-names>M.</given-names></name> <name><surname>Dammers</surname> <given-names>J.</given-names></name> <name><surname>Simsek-Krauss</surname> <given-names>S.</given-names></name> <name><surname>Brinkmeyer</surname> <given-names>J.</given-names></name> <name><surname>W&#x000F6;lwer</surname> <given-names>W.</given-names></name> <name><surname>Ioannides</surname> <given-names>A.</given-names></name></person-group> (<year>2003</year>). <article-title>Time course of regional brain activations during facial emotion recognition in humans.</article-title> <source><italic>Neurosci. Lett.</italic></source> <volume>342</volume> <fpage>101</fpage>&#x02013;<lpage>104</lpage>. <pub-id pub-id-type="doi">10.1016/S0304-3940(03)00274-X</pub-id></citation></ref>
<ref id="B73"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tamietto</surname> <given-names>M.</given-names></name> <name><surname>de Gelder</surname> <given-names>B.</given-names></name></person-group> (<year>2010</year>). <article-title>Neural bases of the non-conscious perception of emotional signals.</article-title> <source><italic>Nat. Rev. Neurosci.</italic></source> <volume>11</volume> <fpage>697</fpage>&#x02013;<lpage>709</lpage>. <pub-id pub-id-type="doi">10.1038/nrn2889</pub-id></citation></ref>
<ref id="B74"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tottenham</surname> <given-names>N.</given-names></name> <name><surname>Tanaka</surname> <given-names>J. W.</given-names></name> <name><surname>Leon</surname> <given-names>A. C.</given-names></name> <name><surname>McCarry</surname> <given-names>T.</given-names></name> <name><surname>Nurse</surname> <given-names>M.</given-names></name> <name><surname>Hare</surname> <given-names>T. A.</given-names></name><etal/></person-group> (<year>2009</year>). <article-title>The NimStim set of facial expressions: judgments from untrained research participants.</article-title> <source><italic>Psychiatry Res.</italic></source> <volume>168</volume> <fpage>242</fpage>&#x02013;<lpage>249</lpage>. <pub-id pub-id-type="doi">10.1016/j.psychres.2008.05.006</pub-id></citation></ref>
<ref id="B75"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tsuchiya</surname> <given-names>N.</given-names></name> <name><surname>Kawasaki</surname> <given-names>H.</given-names></name> <name><surname>Oya</surname> <given-names>H.</given-names></name> <name><surname>Howard</surname> <given-names>M. A.</given-names></name> <name><surname>Adolphs</surname> <given-names>R.</given-names></name></person-group> (<year>2008</year>). <article-title>Decoding face information in time, frequency and space from direct intracranial recordings of the human brain.</article-title> <source><italic>PLoS ONE</italic></source> <volume>3</volume>:<fpage>e3892</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0003892</pub-id></citation></ref>
<ref id="B76"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tzelgov</surname> <given-names>J.</given-names></name></person-group> (<year>1997</year>). <article-title>Specifying the relations between automaticity and consciousness: a theoretical note.</article-title> <source><italic>Conscious. Cogn.</italic></source> <volume>6</volume> <fpage>441</fpage>&#x02013;<lpage>451</lpage>. <pub-id pub-id-type="doi">10.1006/ccog.1997.0303</pub-id></citation></ref>
<ref id="B77"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vuilleumier</surname> <given-names>P.</given-names></name></person-group> (<year>2005</year>). <article-title>How brains beware: neural mechanisms of emotional attention.</article-title> <source><italic>Trends Cogn. Sci.</italic></source> <volume>9</volume> <fpage>585</fpage>&#x02013;<lpage>594</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2005.10.011</pub-id></citation></ref>
<ref id="B78"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vuilleumier</surname> <given-names>P.</given-names></name> <name><surname>Armony</surname> <given-names>J. L.</given-names></name> <name><surname>Driver</surname> <given-names>J.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name></person-group> (<year>2001</year>). <article-title>Effects of attention and emotion on face processing in the human brain: an event-related fMRI study.</article-title> <source><italic>Neuron</italic></source> <volume>30</volume> <fpage>829</fpage>&#x02013;<lpage>841</lpage>. <pub-id pub-id-type="doi">10.1016/S0896-6273(01)00328-2</pub-id></citation></ref>
<ref id="B79"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vuilleumier</surname> <given-names>P.</given-names></name> <name><surname>Pourtois</surname> <given-names>G.</given-names></name></person-group> (<year>2007</year>). <article-title>Distributed and interactive brain mechanisms during emotion face perception: evidence from functional neuroimaging.</article-title> <source><italic>Neuropsychologia</italic></source> <volume>45</volume> <fpage>174</fpage>&#x02013;<lpage>194</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2006.06.003</pub-id></citation></ref>
<ref id="B80"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wager</surname> <given-names>T. D.</given-names></name> <name><surname>Phan</surname> <given-names>K. L.</given-names></name> <name><surname>Liberzon</surname> <given-names>I.</given-names></name> <name><surname>Taylor</surname> <given-names>S. F.</given-names></name></person-group> (<year>2003</year>). <article-title>Valence, gender, and lateralization of functional brain anatomy in emotion: a meta-analysis of findings from neuroimaging.</article-title> <source><italic>Neuroimage</italic></source> <volume>19</volume> <fpage>513</fpage>&#x02013;<lpage>531</lpage>. <pub-id pub-id-type="doi">10.1016/S1053-8119(03)00078-8</pub-id></citation></ref>
<ref id="B81"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wagner</surname> <given-names>M.</given-names></name> <name><surname>Fuchs</surname> <given-names>M.</given-names></name> <name><surname>Kastner</surname> <given-names>J.</given-names></name></person-group> (<year>2007</year>). <article-title>SWARM: sLORETA-weighted accurate minimum norm inverse solutions.</article-title> <source><italic>Int. Congr. Ser.</italic></source> <volume>1300</volume> <fpage>185</fpage>&#x02013;<lpage>188</lpage>. <pub-id pub-id-type="doi">10.1016/j.ics.2007.02.043</pub-id></citation></ref>
</ref-list>
</back>
</article>