<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2017.01269</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>The Impact of Top-Down Prediction on Emotional Face Processing in Social Anxiety</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Ran</surname> <given-names>Guangming</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="author-notes" rid="fn001"><sup>&#x002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/437757/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Chen</surname> <given-names>Xu</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Psychology, Institute of Education, China West Normal University</institution> <country>Nanchong, China</country></aff>
<aff id="aff2"><sup>2</sup><institution>Faculty of Psychology, Southwest University</institution> <country>Chongqing, China</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: <italic>Gianluigi Ciocca, University of Milano-Bicocca, Italy</italic></p></fn>
<fn fn-type="edited-by"><p>Reviewed by: <italic>Wolfgang Einhauser, Technische Universit&#x00E4;t Chemnitz, Germany; James Danckert, University of Waterloo, Canada</italic></p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x002A;Correspondence: <italic>Guangming Ran, <email>haiqi49@cwnu.edu.cn</email></italic></p></fn>
<fn fn-type="other" id="fn002"><p>This article was submitted to Perception Science, a section of the journal Frontiers in Psychology</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>25</day>
<month>07</month>
<year>2017</year>
</pub-date>
<pub-date pub-type="collection">
<year>2017</year>
</pub-date>
<volume>8</volume>
<elocation-id>1269</elocation-id>
<history>
<date date-type="received">
<day>09</day>
<month>05</month>
<year>2017</year>
</date>
<date date-type="accepted">
<day>11</day>
<month>07</month>
<year>2017</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2017 Ran and Chen.</copyright-statement>
<copyright-year>2017</copyright-year>
<copyright-holder>Ran and Chen</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>There is evidence that people with social anxiety show abnormal processing of emotional faces. To investigate the impact of top-down prediction on emotional face processing in social anxiety, brain responses of participants with high social anxiety (HSA) and low social anxiety (LSA) were recorded while they performed a variant of an emotional judgment task, using high temporal resolution event-related potential techniques. Behaviorally, we observed an effect of prediction, with higher accuracy for predictable than for unpredictable faces. Furthermore, we found that HSA participants, but not LSA participants, recognized angry faces more accurately than happy faces. For the P100 and P200 components, HSA participants showed enhanced brain activity for angry faces compared to happy faces, suggesting hypervigilance to angry faces. Importantly, HSA participants exhibited larger N170 amplitudes at right hemisphere electrodes than LSA participants when they observed unpredictable angry faces, but not when the angry faces were predictable. This probably reflects top-down prediction compensating for a deficiency in building a holistic face representation in HSA participants.</p>
</abstract>
<kwd-group>
<kwd>social anxiety</kwd>
<kwd>emotional faces</kwd>
<kwd>prediction</kwd>
<kwd>holistic processing</kwd>
<kwd>N170</kwd>
</kwd-group>
<counts>
<fig-count count="6"/>
<table-count count="2"/>
<equation-count count="0"/>
<ref-count count="56"/>
<page-count count="12"/>
<word-count count="0"/>
</counts>
</article-meta>
</front>
<body>
<sec><title>Introduction</title>
<p>Social anxiety disorder (SAD), also known as social phobia (SP), is characterized by a severe impairment of social interactions (<xref ref-type="bibr" rid="B40">Ruscio et al., 2008</xref>; <xref ref-type="bibr" rid="B13">Harbort et al., 2013</xref>). SAD is one of the most common psychiatric disorders, with a lifetime prevalence in the general population of approximately 12% (<xref ref-type="bibr" rid="B2">Alden and Taylor, 2011</xref>). It is also known to be frequently accompanied by anxiety and depressive disorders (<xref ref-type="bibr" rid="B6">Beesdo et al., 2007</xref>; <xref ref-type="bibr" rid="B10">Delong and Pollack, 2008</xref>). There is a wide range of comorbidity rates, depending on the investigated study sample and diagnostic criteria (<xref ref-type="bibr" rid="B13">Harbort et al., 2013</xref>).</p>
<p>As fundamental emotional stimuli, emotional faces convey important information in social interactions (<xref ref-type="bibr" rid="B26">Luo et al., 2010</xref>; <xref ref-type="bibr" rid="B35">Ran et al., 2014a</xref>). An increasing number of electrophysiological studies have investigated the time course of emotional face processing in social anxiety. An early event-related potential (ERP) component, the P100, is assumed to reflect early visual processing and is sensitive to top-down attentional influences (<xref ref-type="bibr" rid="B42">Schendan et al., 1998</xref>; <xref ref-type="bibr" rid="B50">Taylor and Khan, 2000</xref>). It has been shown that, in individuals with SAD, there is a significant increase in the P100 amplitude in response to threat/angry faces (<xref ref-type="bibr" rid="B27">Mueller et al., 2009</xref>). Recently, <xref ref-type="bibr" rid="B30">Peschard et al. (2013)</xref> reported that people with SAD demonstrated enhancement of the P100 amplitude for all, not just social, stimuli.</p>
<p>The N170, an occipito-temporal negative deflection, is usually observed in experiments using faces as stimuli (<xref ref-type="bibr" rid="B51">Tortosa et al., 2013</xref>; <xref ref-type="bibr" rid="B55">Wiese et al., 2014</xref>). There is growing recognition that the N170 reflects a specific attention to the eye region of human faces (<xref ref-type="bibr" rid="B18">Itier et al., 2006</xref>, <xref ref-type="bibr" rid="B16">2007</xref>). Numerous ERP studies have shown that faces presented upside down trigger larger N170 amplitudes compared to upright faces (<xref ref-type="bibr" rid="B39">Rossion et al., 1999</xref>; <xref ref-type="bibr" rid="B19">Itier and Taylor, 2002</xref>). This amplitude increase with face inversion is believed to reflect the disruption of early holistic processing stages specific to human faces and has been suggested to be driven by the eye region (<xref ref-type="bibr" rid="B39">Rossion et al., 1999</xref>; <xref ref-type="bibr" rid="B19">Itier and Taylor, 2002</xref>). Beyond their role in face inversion effects, the eyes are also important in conveying different emotions (<xref ref-type="bibr" rid="B17">Itier and Batty, 2009</xref>). While some studies failed to detect the moderating effect of SAD on the N170 (<xref ref-type="bibr" rid="B23">Kolassa et al., 2007</xref>; <xref ref-type="bibr" rid="B28">M&#x00FC;hlberger et al., 2009</xref>), others reported that SAD individuals exhibited more negative N170 amplitudes than controls (<xref ref-type="bibr" rid="B24">Kolassa and Miltner, 2006</xref>; <xref ref-type="bibr" rid="B56">Wieser et al., 2010</xref>). One can speculate that these inconsistencies may be due to differences in experimental designs and tasks (<xref ref-type="bibr" rid="B30">Peschard et al., 2013</xref>).</p>
<p>Subsequent to N170, the P200 component has been suggested to reflect the sustained perceptual processing (<xref ref-type="bibr" rid="B44">Schupp et al., 2004</xref>) and the complexity of emotional appraisal (<xref ref-type="bibr" rid="B22">Kolassa et al., 2009</xref>). An increasing number of studies have explored how SAD modulates the P200 (<xref ref-type="bibr" rid="B24">Kolassa and Miltner, 2006</xref>; <xref ref-type="bibr" rid="B52">van Peer et al., 2010</xref>; <xref ref-type="bibr" rid="B11">Gardener et al., 2013</xref>). For example, <xref ref-type="bibr" rid="B52">van Peer et al. (2010)</xref> reported a specific increase in P200 amplitude to threat faces in SAD individuals, reflecting an attentional bias for social threat. Moreover, a recent study observed a correlation between P200 amplitude in response to self-focus cues and reduced task performance in individuals with SAD (<xref ref-type="bibr" rid="B20">Judah et al., 2015</xref>).</p>
<p>As the aforementioned studies demonstrate, the N170 amplitudes were increased in response to threat faces in individuals with SAD compared to controls. This seems to suggest a disruption of holistic face processing in SAD, since the N170 appears to be enhanced when more feature-based processing is induced (<xref ref-type="bibr" rid="B47">Stahl et al., 2008</xref>; <xref ref-type="bibr" rid="B7">Caharel et al., 2011</xref>; <xref ref-type="bibr" rid="B36">Ran et al., 2014b</xref>). As top-down prediction induces a switch from feature-based to holistic processing (<xref ref-type="bibr" rid="B36">Ran et al., 2014b</xref>,<xref ref-type="bibr" rid="B37">c</xref>), SAD individuals, with a disturbed face representation, may adopt a holistic coding strategy for perceiving emotional faces when these faces are predictable. This idea is in line with a recent predictive translation hypothesis (<xref ref-type="bibr" rid="B33">Ran et al., 2016b</xref>), which proposes that, in individuals with social perception disorders, prior prediction contributes to the normalization of the abnormal processing of social information obtained from faces.</p>
<p>The main goal of the current study was to explore the impact of top-down prediction on emotional face processing in social anxiety. In accordance with previous literature, we hypothesized that high socially anxious (HSA) participants would exhibit larger N170 amplitudes than low socially anxious (LSA) participants when they perceived unpredictable angry faces. However, when exposed to predictable angry faces, we expected that HSA participants would show no difference in N170 amplitudes relative to LSA participants. In addition, based on the theory of hypervigilance to social threat cues in individuals with SAD (<xref ref-type="bibr" rid="B3">Amir et al., 2009</xref>; <xref ref-type="bibr" rid="B21">Klumpp and Amir, 2009</xref>), we predicted increased P100 and P200 amplitudes to angry faces in HSA compared to LSA participants.</p>
<p>To test these hypotheses, we adopted a variant of the cue-target paradigm that we employed previously (<xref ref-type="bibr" rid="B8">Chen et al., 2015</xref>; <xref ref-type="bibr" rid="B34">Ran et al., 2016c</xref>). An instruction cue was used to manipulate participants&#x2019; prediction bias toward the corresponding emotion. HSA and LSA participants were instructed to perform an emotional task in which angry and happy target faces were presented randomly, and their brain responses were recorded using high temporal resolution ERP techniques.</p>
<p>Investigating the impact of the top-down prediction on emotional face processing in individuals with SAD contributes to the growing body of literature exploring the methods to reduce social anxiety. Moreover, this study could provide a better understanding of the time course of emotional face processing in social anxiety.</p>
</sec>
<sec id="s1" sec-type="materials|methods">
<title>Materials and Methods</title>
<sec><title>Participants</title>
<p>Thirty volunteers (15 women and 15 men; mean age = 21.02 years, <italic>SD</italic> = 1.81 years; all right-handed) with no history of neurological, psychiatric, or visual impairments were preselected from a group of 911 students based on their social anxiety scores on the Liebowitz Social Anxiety Scale-Self-Report Version (LSAS-SR, <xref ref-type="bibr" rid="B9">Coles et al., 2001</xref>). The LSAS-SR is a 24-item scale measuring dimensional severity of social anxiety symptoms. On the basis of previous studies (e.g., <xref ref-type="bibr" rid="B41">Rytwinski et al., 2009</xref>; <xref ref-type="bibr" rid="B54">Wangelin et al., 2012</xref>), HSA participants (<italic>N</italic> = 13, 8 women) were defined as those who scored 60 or greater on the LSAS-SR, while LSA participants (<italic>N</italic> = 17, 7 women) were those scoring under 40. In addition to the LSAS-SR, participants completed the Spielberger State-Trait Anxiety Inventory (<xref ref-type="bibr" rid="B46">Spielberger et al., 1983</xref>) and the Beck Depression Inventory (<xref ref-type="bibr" rid="B5">Beck and Beamesderfer, 1974</xref>). All participants of this study gave written informed consent and were financially compensated for their participation. The study was approved by the local ethics committee and the experiments were carried out in accordance with the approved guidelines.</p>
<p>As reported in <bold>Table <xref ref-type="table" rid="T1">1</xref></bold>, HSA and LSA participants differed in the LSAS-SR total scores [HSA: 79.62 &#x00B1; 14.23, LSA: 26.24 &#x00B1; 7.61; <italic>t</italic>(28) = 13.23, <italic>p</italic> &#x003C; 0.001], but no group differences were found for age [HSA: 21.15 &#x00B1; 1.63, LSA: 21.24 &#x00B1; 1.99; <italic>t</italic>(28) = -0.12, <italic>p</italic> = 0.905], state anxiety level [HSA: 42.39 &#x00B1; 6.55, LSA: 37.94 &#x00B1; 8.27; <italic>t</italic>(28) = 1.58, <italic>p</italic> = 0.123], trait anxiety level [HSA: 44.23 &#x00B1; 8.19, LSA: 40.00 &#x00B1; 9.59; <italic>t</italic>(28) = 1.27, <italic>p</italic> = 0.213] and depression [HSA: 12.15 &#x00B1; 6.41, LSA: 8.24 &#x00B1; 5.62; <italic>t</italic>(28) = 1.78, <italic>p</italic> = 0.086].</p>
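<p>The group comparisons above can be reproduced from the reported summary statistics alone. A minimal sketch using SciPy&#x2019;s summary-statistics t-test (pooled variance, df = 13 + 17 &#x2212; 2 = 28); the LSAS-SR comparison recovers the reported t(28) = 13.23:</p>

```python
from scipy.stats import ttest_ind_from_stats

# Reported LSAS-SR summary statistics (mean, SD, N) for the two groups.
hsa_mean, hsa_sd, hsa_n = 79.62, 14.23, 13
lsa_mean, lsa_sd, lsa_n = 26.24, 7.61, 17

# Independent-samples t-test with pooled variance (equal_var=True).
t, p = ttest_ind_from_stats(hsa_mean, hsa_sd, hsa_n,
                            lsa_mean, lsa_sd, lsa_n,
                            equal_var=True)
```

The same call with the state anxiety, trait anxiety, and depression summary statistics reproduces the remaining (non-significant) comparisons in Table 1.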
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p>Participants&#x2019; characteristics for HSA and LSA group.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left"></td>
<th valign="top" align="center">HSA participants (<italic>N</italic> = 13)</th>
<th valign="top" align="center">LSA participants (<italic>N</italic> = 17)</th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">LSAS</td>
<td valign="top" align="center">79.62 (14.23)</td>
<td valign="top" align="center">26.24 (7.61)</td>
</tr>
<tr>
<td valign="top" align="left">Age</td>
<td valign="top" align="center">21.15 (1.63)</td>
<td valign="top" align="center">21.24 (1.99)</td>
</tr>
<tr>
<td valign="top" align="left">STAI-S</td>
<td valign="top" align="center">42.39 (6.55)</td>
<td valign="top" align="center">37.94 (8.27)</td>
</tr>
<tr>
<td valign="top" align="left">STAI-T</td>
<td valign="top" align="center">44.23 (8.19)</td>
<td valign="top" align="center">40.00 (9.59)</td>
</tr>
<tr>
<td valign="top" align="left">Beck</td>
<td valign="top" align="center">12.15 (6.41)</td>
<td valign="top" align="center">8.24 (5.62)</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<attrib><italic>STAI-S is Spielberger Anxiety Inventory-State; STAI-T is Spielberger Anxiety Inventory-Trait.</italic></attrib>
</table-wrap-foot>
</table-wrap>
</sec>
<sec><title>Stimuli</title>
<p>The experimental task used 80 images of faces (40 happy and 40 angry faces) sourced from the Chinese Facial Affective Picture System (CFAPS, <xref ref-type="bibr" rid="B53">Wang and Luo, 2005</xref>). The faces were selected such that the happy and angry sets differed significantly in valence [happy faces: 5.89 &#x00B1; 0.81, angry faces: 2.87 &#x00B1; 0.50; <italic>t</italic>(78) = 20.03, <italic>p</italic> &#x003C; 0.001], but did not differ in arousal [happy faces: 5.92 &#x00B1; 1.40, angry faces: 5.89 &#x00B1; 0.24; <italic>t</italic>(78) = -0.11, <italic>p</italic> = 0.914]. All faces were standardized to one size (260 &#x00D7; 300 pixels), and all were gray-scale. Mean luminance and contrast values of the faces were extracted using a MATLAB script (<xref ref-type="bibr" rid="B29">Nikolla et al., 2017</xref>). There were no significant group-level differences between happy and angry faces in mean luminance [happy faces: 110.28&#x2013;157.30 (range), 131.61 &#x00B1; 10.88; angry faces: 105.97&#x2013;160.31, 135.95 &#x00B1; 14.78; <italic>t</italic>(78) = -1.50, <italic>p</italic> = 0.139] or contrast [happy faces: 0.43&#x2013;0.62, 0.52 &#x00B1; 0.04; angry faces: 0.42&#x2013;0.63, 0.53 &#x00B1; 0.06; <italic>t</italic>(78) = -1.51, <italic>p</italic> = 0.135]. Each image subtended a visual angle of 2.8&#x00B0; &#x00D7; 3.7&#x00B0;, at a screen resolution of 72 pixels per inch.</p>
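<p>The luminance and contrast measures can be computed in a few lines of NumPy. The sketch below assumes 8-bit grayscale images and defines contrast as the coefficient of variation (SD / mean) of pixel intensities, which is consistent with the reported range of values; the exact metric used by the cited MATLAB script is an assumption.</p>

```python
import numpy as np

def luminance_and_contrast(img: np.ndarray) -> tuple[float, float]:
    """Mean luminance (0-255 grayscale) and contrast of one face image.

    Contrast is computed here as the coefficient of variation
    (SD / mean) of pixel intensities -- one common definition; the
    metric used by the original MATLAB script is not specified.
    """
    pix = img.astype(float)
    mean_lum = pix.mean()
    contrast = pix.std() / mean_lum
    return mean_lum, contrast
```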
</sec>
<sec><title>Procedure</title>
<p>The current study adopted a variant of the cue-target paradigm that we employed previously (<xref ref-type="bibr" rid="B8">Chen et al., 2015</xref>; <xref ref-type="bibr" rid="B34">Ran et al., 2016c</xref>). While their EEG data were acquired, the participants performed an emotional task. The task consisted of 36 blocks of 8 trials, yielding a total of 288 trials per participant. Each block of the task was preceded by a prediction cue, which was shown for 2000 ms (<bold>Figure <xref ref-type="fig" rid="F1">1</xref></bold>). The prediction cue consisted of either the word &#x201C;happy&#x201D; (indicating that a happy target face was shown on 75% of trials), &#x201C;angry&#x201D; (indicating that an angry target face was presented on 75% of trials), or &#x201C;unknown&#x201D; (50% of trials for each emotional target face).</p>
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption><p>Schematic illustration of the experimental procedure (<bold>A</bold>: angry predictable trials; <bold>B</bold>: happy predictable trials; <bold>C</bold>: Unpredictable trials).</p></caption>
<graphic xlink:href="fpsyg-08-01269-g001.tif"/>
</fig>
<p>Each trial started with a black cross (&#x201C;+&#x201D;), centered on a white background, shown for 500 ms. After a blank screen of random duration (500&#x2013;1000 ms), a target face (happy or angry) was presented for 500 ms. Following the face stimulus, a blank screen was displayed for 300 ms, after which a blue point appeared at the center of the screen. The inter-trial interval (ITI) was 1000&#x2013;2000 ms. Upon seeing the blue point, participants were asked to identify the emotion of the face they had just observed.</p>
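<p>One straightforward way to realize the stated cue validities is to fix the emotion counts within each 8-trial block (6 cued and 2 uncued faces for a 75% cue, 4 and 4 for &#x201C;unknown&#x201D;) and shuffle the order. This exact counterbalancing scheme is an assumption; the article only reports the overall percentages.</p>

```python
import random

# Per-block counts of target faces implied by the cue validities:
# 75% => 6 of 8 trials, 50% => 4 of 8.  Hypothetical realization.
CUE_COUNTS = {"happy": ("happy", 6, "angry", 2),
              "angry": ("angry", 6, "happy", 2),
              "unknown": ("happy", 4, "angry", 4)}

def build_block(cue: str, rng: random.Random) -> list[str]:
    emo_a, n_a, emo_b, n_b = CUE_COUNTS[cue]
    trials = [emo_a] * n_a + [emo_b] * n_b
    rng.shuffle(trials)
    return trials

def build_session(rng: random.Random) -> list[tuple[str, list[str]]]:
    # 36 blocks of 8 trials = 288 trials; 12 blocks per cue type
    # (the even split across cue types is also an assumption).
    cues = ["happy", "angry", "unknown"] * 12
    rng.shuffle(cues)
    return [(cue, build_block(cue, rng)) for cue in cues]
```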
</sec>
<sec><title>EEG Recording</title>
<p>The electroencephalograph was recorded at 64 scalp sites using tin electrodes mounted in an elastic cap (Brain Products, M&#x00FC;nchen, Germany), with a reference on FCz electrode (<xref ref-type="bibr" rid="B14">Herrmann et al., 2007</xref>; <xref ref-type="bibr" rid="B7">Caharel et al., 2011</xref>; <xref ref-type="bibr" rid="B35">Ran et al., 2014a</xref>). The vertical electrooculogram (EOG) was recorded with electrodes placed below the right eye and the horizontal EOG was recorded from the right orbital rim. The inter-electrode impedance was maintained below 5 k&#x03A9;. The EEG and EOG activities were amplified using a 0.01&#x2013;100 Hz bandpass and continuously sampled at 500 Hz per channel.</p>
</sec>
<sec><title>EEG Analysis</title>
<p>For ERP analysis, EEG data for correct responses in each condition were aligned and averaged separately. The data were recomputed to an average mastoid reference. Epochs extended from -200 to 1200 ms relative to the onset of the target face, using a 200 ms pre-stimulus baseline. The electrode position and time window for each component in the current study were chosen according to inspection of the grand mean ERPs, as well as previous studies (<xref ref-type="bibr" rid="B26">Luo et al., 2010</xref>; <xref ref-type="bibr" rid="B43">Schmitz et al., 2012</xref>; <xref ref-type="bibr" rid="B35">Ran et al., 2014a</xref>,<xref ref-type="bibr" rid="B36">b</xref>). The P100 component (70&#x2013;130 ms) and posterior P200 component (200&#x2013;260 ms) were measured at O1/O2, PO3/PO4, and P3/P4 electrodes. In addition, the N170 component was analyzed within a time frame of 120&#x2013;180 ms after stimulus onset at P5/P6, P7/P8, and PO7/PO8 electrodes. Peak amplitudes and latencies of these components were subjected to repeated-measures analysis of variance (ANOVA) with prediction (predictable vs. unpredictable), emotion (happy vs. angry) and hemisphere (left vs. right) as within-participant factors and group (HSA participants vs. LSA participants) as a between-participant factor. The ERP data were analyzed off-line with BrainVision Analyzer (Brain Products; Gilching, Germany). All degrees of freedom for the F-ratio were corrected according to the Greenhouse-Geisser method.</p>
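<p>The epoching and peak-measurement steps can be sketched with plain NumPy. This is a simplified single-channel stand-in for the BrainVision Analyzer pipeline, using the epoch window, baseline, and N170 window stated above; re-referencing and artifact rejection are omitted.</p>

```python
import numpy as np

FS = 500                      # sampling rate (Hz)
EPOCH = (-200, 1200)          # epoch window relative to face onset (ms)
N170_WIN = (120, 180)         # N170 measurement window (ms)

def ms_to_samp(ms: float) -> int:
    return int(round(ms * FS / 1000))

def epoch_and_baseline(eeg: np.ndarray, onsets: list[int]) -> np.ndarray:
    """Cut epochs (trials x samples) from one continuous channel around
    each onset sample and subtract the 200 ms pre-stimulus baseline."""
    pre, post = ms_to_samp(EPOCH[0]), ms_to_samp(EPOCH[1])
    epochs = np.stack([eeg[on + pre:on + post] for on in onsets])
    baseline = epochs[:, :ms_to_samp(200)].mean(axis=1, keepdims=True)
    return epochs - baseline

def n170_peak(erp: np.ndarray) -> tuple[float, float]:
    """Peak (most negative) amplitude and latency in the N170 window
    of an averaged, baseline-corrected epoch."""
    lo = ms_to_samp(N170_WIN[0] - EPOCH[0])
    hi = ms_to_samp(N170_WIN[1] - EPOCH[0])
    idx = lo + int(np.argmin(erp[lo:hi]))
    latency_ms = idx * 1000 / FS + EPOCH[0]
    return float(erp[idx]), float(latency_ms)
```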
</sec>
<sec><title>Correlation Analysis between Behavioral and Electrophysiological Data</title>
<p>In the present study, correlation analyses were carried out to assess the relationship between the LSAS-SR total scores and the effect of prediction on the N170. Similar to previous studies (e.g., <xref ref-type="bibr" rid="B36">Ran et al., 2014b</xref>), the effect of prediction on the N170 was computed for each participant as the difference between the N170 peak amplitudes in the predictable and unpredictable conditions, separately for happy and angry faces.</p>
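<p>Under this definition, the analysis reduces to a per-participant difference score and a Pearson correlation. A sketch with hypothetical per-participant amplitudes (not the study&#x2019;s data):</p>

```python
import numpy as np
from scipy.stats import pearsonr

def prediction_effect(pred_amp: np.ndarray, unpred_amp: np.ndarray) -> np.ndarray:
    """N170 prediction effect per participant: predictable minus
    unpredictable peak amplitude (in microvolts)."""
    return pred_amp - unpred_amp

# Hypothetical data for 10 participants (illustration only).
rng = np.random.default_rng(0)
lsas = rng.uniform(20, 90, 10)                       # LSAS-SR totals
pred = -4.0 + 0.02 * lsas + rng.normal(0, 0.3, 10)   # predictable N170
unpred = -4.0 - 0.03 * lsas + rng.normal(0, 0.3, 10) # unpredictable N170

r, p = pearsonr(lsas, prediction_effect(pred, unpred))
```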
</sec>
<sec><title>Independent Component Analysis (ICA) Cluster and Source Analysis</title>
<p>Independent component analysis cluster and source analysis was performed using EEGLAB (<xref ref-type="bibr" rid="B8">Chen et al., 2015</xref>). The recorded EEG data were decomposed in independent components (ICs) via ICA. The ICs, which had the same ERP morphology and scalp topography, were clustered across participants. The IC scalp maps of each cluster were used for source dipole modeling. The Boundary Element Head Model was adopted (<xref ref-type="bibr" rid="B12">Han et al., 2005</xref>).</p>
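<p>The decomposition step can be approximated with scikit-learn&#x2019;s FastICA; note that EEGLAB itself defaults to an Infomax algorithm, so FastICA is a stand-in here, and the dipole fitting with a boundary element head model is not reproduced.</p>

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)

# Two hypothetical "neural" sources and a random channel mixing matrix.
sources = np.c_[np.sin(7 * t), np.sign(np.sin(3 * t))]
mixing = rng.normal(size=(4, 2))        # 4 channels, 2 sources
eeg = sources @ mixing.T                # simulated channel-space data

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(eeg)     # IC time courses
scalp_maps = ica.mixing_                # per-IC channel weights ("scalp maps")
```

In the actual analysis, the per-IC channel weights play the role of the IC scalp maps that were clustered across participants and fed into the source dipole modeling.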
</sec>
</sec>
<sec><title>Results</title>
<sec><title>Behavioral Results</title>
<p>The mean accuracy and reaction time for each condition are displayed in <bold>Table <xref ref-type="table" rid="T2">2</xref></bold>. A repeated-measures ANOVA, with prediction (predictable vs. unpredictable) and emotion (happy vs. angry) as within-participant factors and group (HSA participants vs. LSA participants) as a between-participant factor, was performed on participants&#x2019; accuracy and reaction time.</p>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p>Means and standard deviations of reaction time (RT) and accuracy for HSA and LSA group for angry and happy faces in the predictable and unpredictable conditions.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
<th valign="top" align="center" colspan="4">Accuracy (%)</th>
<th valign="top" align="center" colspan="4">Response Time (ms)</th>
</tr>
<tr>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
<td valign="top" align="left" colspan="4"><hr/></td>
<td valign="top" align="left" colspan="4"><hr/></td>
</tr>
<tr>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
<th valign="top" align="center" colspan="2">HSA</th>
<th valign="top" align="center" colspan="2">LSA</th>
<th valign="top" align="center" colspan="2">HSA</th>
<th valign="top" align="center" colspan="2">LSA</th>
</tr>
<tr>
<td valign="top" align="left"></td>
<td valign="top" align="left"></td>
<td valign="top" align="left" colspan="2"><hr/></td>
<td valign="top" align="left" colspan="2"><hr/></td>
<td valign="top" align="left" colspan="2"><hr/></td>
<td valign="top" align="left" colspan="2"><hr/></td>
</tr>
<tr>
<th valign="top" align="left">Emotion</th>
<th valign="top" align="left">Prediction</th>
<th valign="top" align="center"><italic>M</italic></th>
<th valign="top" align="center"><italic>SD</italic></th>
<th valign="top" align="center"><italic>M</italic></th>
<th valign="top" align="center"><italic>SD</italic></th>
<th valign="top" align="center"><italic>M</italic></th>
<th valign="top" align="center"><italic>SD</italic></th>
<th valign="top" align="center"><italic>M</italic></th>
<th valign="top" align="center"><italic>SD</italic></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Happy faces</td>
<td valign="top" align="left">Predictable</td>
<td valign="top" align="center">97.95</td>
<td valign="top" align="center">3.13</td>
<td valign="top" align="center">98.92</td>
<td valign="top" align="center">2.04</td>
<td valign="top" align="center">374.56</td>
<td valign="top" align="center">87.44</td>
<td valign="top" align="center">413.27</td>
<td valign="top" align="center">132.48</td>
</tr>
<tr>
<td valign="top" align="left"></td>
<td valign="top" align="left">Unpredictable</td>
<td valign="top" align="center">92.43</td>
<td valign="top" align="center">6.92</td>
<td valign="top" align="center">96.23</td>
<td valign="top" align="center">4.94</td>
<td valign="top" align="center">361.85</td>
<td valign="top" align="center">64.16</td>
<td valign="top" align="center">419.04</td>
<td valign="top" align="center">116.39</td>
</tr>
<tr>
<td valign="top" align="left">Angry faces</td>
<td valign="top" align="left">Predictable</td>
<td valign="top" align="center">98.97</td>
<td valign="top" align="center">1.60</td>
<td valign="top" align="center">97.35</td>
<td valign="top" align="center">4.37</td>
<td valign="top" align="center">434.35</td>
<td valign="top" align="center">192.28</td>
<td valign="top" align="center">417.56</td>
<td valign="top" align="center">122.10</td>
</tr>
<tr>
<td valign="top" align="left"></td>
<td valign="top" align="left">Unpredictable</td>
<td valign="top" align="center">97.48</td>
<td valign="top" align="center">2.89</td>
<td valign="top" align="center">97.34</td>
<td valign="top" align="center">2.69</td>
<td valign="top" align="center">395.03</td>
<td valign="top" align="center">112.90</td>
<td valign="top" align="center">390.53</td>
<td valign="top" align="center">136.50</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Analysis of the accuracy data yielded a main effect of prediction [<italic>F</italic>(1,28) = 12.05, <italic>p</italic> = 0.002, <inline-formula><mml:math id="M1"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.30] showing more accurate performance on predictable than on unpredictable faces, and a main effect of emotion [<italic>F</italic>(1,28) = 4.51, <italic>p</italic> = 0.043, <inline-formula><mml:math id="M2"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.14] indicating that angry faces were recognized more accurately than happy faces. There was a significant interaction between emotion and group [<italic>F</italic>(1,28) = 6.13, <italic>p</italic> = 0.020, <inline-formula><mml:math id="M3"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.18], which showed that the accuracy for angry faces was significantly higher than that for happy faces in HSA (<italic>p</italic> = 0.005) but not LSA (<italic>p</italic> = 0.791) participants. 
In addition, a significant interaction between emotion and prediction emerged [<italic>F</italic>(1,28) = 12.38, <italic>p</italic> = 0.002, <inline-formula><mml:math id="M4"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.31], indicating that angry faces were recognized more accurately than happy faces in the unpredictable condition (<italic>p</italic> = 0.003). However, the three-way interaction between prediction, emotion and group was not statistically significant [<italic>F</italic>(1,28) = 0.50, <italic>p</italic> = 0.484, <inline-formula><mml:math id="M5"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.02]. With regard to reaction time, a significant interaction between emotion and prediction [<italic>F</italic>(1,28) = 5.98, <italic>p</italic> = 0.021, <inline-formula><mml:math id="M6"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.18] was observed, but the subsequent analyses failed to reach significance. No other effects were significant (all <italic>Fs</italic> &#x003C; 2.88, <italic>ps</italic> > 0.101).</p>
</sec>
<sec><title>ERP Results</title>
<sec><title>P100</title>
<p>The ANOVA of P100 latency showed a significant main effect of emotion [<italic>F</italic>(1,28) = 6.23, <italic>p</italic> = 0.019, <inline-formula><mml:math id="M7"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.18], with shorter latencies for happy versus angry faces. There was a significant interaction between prediction and hemisphere [<italic>F</italic>(1,28) = 8.08, <italic>p</italic> = 0.008, <inline-formula><mml:math id="M8"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.22] but the subsequent analyses failed to reach significance. The corresponding ANOVA for P100 peak amplitude revealed a significant interaction between prediction and emotion [<italic>F</italic>(1,28) = 7.99, <italic>p</italic> = 0.009, <inline-formula><mml:math id="M9"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.22] and a significant four-way interaction between prediction, emotion, hemisphere and group [<italic>F</italic>(1,28) = 6.18, <italic>p</italic> = 0.019, <inline-formula><mml:math id="M10"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.18]. 
Further analyses suggested that predictable angry faces triggered enhanced activity compared to predictable happy faces in the right hemisphere electrodes for HSA (<italic>p</italic> = 0.011) but not LSA (<italic>p</italic> = 0.183) participants (<bold>Figure <xref ref-type="fig" rid="F2">2</xref></bold>).</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption><p>Grand mean ERPs of the HSA and LSA groups for angry and happy faces in the predictable and unpredictable conditions at electrodes O1 and O2, within the P100 and P200 time windows.</p></caption>
<graphic xlink:href="fpsyg-08-01269-g002.tif"/>
</fig>
</sec>
<sec><title>N170</title>
<p>In the analysis of N170 latency, no main effects or interactions reached significance (all <italic>Fs</italic> &#x003C; 2.20, <italic>ps</italic> > 0.149). The ANOVA on the peak amplitude of this component revealed a significant three-way interaction between emotion, hemisphere and group [<italic>F</italic>(1,28) = 9.74, <italic>p</italic> = 0.004, <inline-formula><mml:math id="M11"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.26], reflecting more negative amplitudes over the right hemisphere electrodes for HSA participants when they observed happy faces. More importantly, there was a significant four-way interaction between prediction, emotion, hemisphere and group [<italic>F</italic>(1,28) = 5.18, <italic>p</italic> = 0.031, <inline-formula><mml:math id="M12"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.16]. Follow-up analyses confirmed that HSA participants exhibited larger N170 amplitudes in the right hemisphere electrodes than LSA participants when they perceived unpredictable (<italic>p</italic> = 0.046), but not predictable (<italic>p</italic> = 0.163), angry faces (<bold>Figures <xref ref-type="fig" rid="F3">3</xref></bold>, <bold><xref ref-type="fig" rid="F4">4</xref></bold>). No other effect reached significance (all <italic>Fs</italic> &#x003C; 3.13, <italic>ps</italic> > 0.088).</p>
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption><p>Grand mean ERPs of the HSA and LSA groups for angry and happy faces in the predictable and unpredictable conditions at electrodes P7 and P8, within the N170 time window.</p></caption>
<graphic xlink:href="fpsyg-08-01269-g003.tif"/>
</fig>
<fig id="F4" position="float">
<label>FIGURE 4</label>
<caption><p>Grand mean ERPs of the HSA and LSA groups for angry and happy faces in the predictable and unpredictable conditions at electrodes PO7 and PO8, within the N170 time window.</p></caption>
<graphic xlink:href="fpsyg-08-01269-g004.tif"/>
</fig>
</sec>
<sec><title>P200</title>
<p>For P200 latency, a main effect of emotion was observed [<italic>F</italic>(1,28) = 6.17, <italic>p</italic> = 0.019, <inline-formula><mml:math id="M13"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.18], indicating that peak latencies were significantly shorter for angry than for happy faces. For the P200 peak amplitude, the interaction between emotion and group reached significance [<italic>F</italic>(1,28) = 5.12, <italic>p</italic> = 0.032, <inline-formula><mml:math id="M14"><mml:msubsup><mml:mi mathvariant='normal' mathcolor='black'>&#x03b7;</mml:mi><mml:mi mathvariant='normal' mathcolor='black'>p</mml:mi><mml:mn mathvariant='normal' mathcolor='black'>2</mml:mn></mml:msubsup></mml:math></inline-formula> = 0.16]. This interaction reflected more positive amplitudes for angry compared with happy faces in HSA participants (<italic>p</italic> = 0.035) (<bold>Figure <xref ref-type="fig" rid="F2">2</xref></bold>). No other significant amplitude differences were found for this component (all <italic>Fs</italic> &#x003C; 3.97, <italic>ps</italic> > 0.056).</p>
</sec>
</sec>
<sec><title>Correlations between ERP Prediction Effects and Social Anxiety</title>
<p>The correlation analysis examined the relationship between the LSAS-SR total scores and the effect of prediction on the N170. A significant negative correlation was observed in the right hemisphere electrodes when participants viewed angry faces &#x007B;<italic>r</italic>(28) = -0.42, <italic>p</italic> = 0.022, bootstrap confidence interval (CI) = [-0.640, -0.124]&#x007D;. In addition, we found significant negative correlations in the electrodes of both hemispheres for happy faces in LSA participants &#x007B;left hemisphere: <italic>r</italic>(15) = -0.54, <italic>p</italic> = 0.024, bootstrap CI = [-0.850, -0.128]; right hemisphere: <italic>r</italic>(15) = -0.63, <italic>p</italic> = 0.007, bootstrap CI = [-0.868, -0.068]&#x007D; (<bold>Figure <xref ref-type="fig" rid="F5">5</xref></bold>). However, no significant correlation was observed in HSA participants (all <italic>ps</italic> > 0.227).</p>
<fig id="F5" position="float">
<label>FIGURE 5</label>
<caption><p>Correlations between the N170 effect of prediction and the LSAS-SR total scores in LSA participants. Pearson coefficient, respective <italic>p</italic>-values and number of participants are reported in the top left corner.</p></caption>
<graphic xlink:href="fpsyg-08-01269-g005.tif"/>
</fig>
</sec>
<sec><title>ICA Clusters and Estimated Sources</title>
<p>The ICA cluster analysis revealed three clusters of interest, namely the P100-, N170-, and P200-clusters (<bold>Figure <xref ref-type="fig" rid="F6">6</xref></bold>). The IC scalp maps of the N170-cluster were characterized by an occipito-temporal distribution. These maps were used for source dipole modeling. The best-fitting dipole was located in the right occipital gyrus (<italic>x</italic> = 36, <italic>y</italic> = -66, <italic>z</italic> = 5). For the P100- and P200-clusters, the IC scalp maps showed an occipital distribution. Both clusters had a source in the left occipital gyrus (<italic>x</italic> = -36, <italic>y</italic> = -66, <italic>z</italic> = 22; <italic>x</italic> = -22, <italic>y</italic> = -82, <italic>z</italic> = 4, respectively).</p>
<fig id="F6" position="float">
<label>FIGURE 6</label>
<caption><p>Results of the estimated sources for the P100, N170, and P200 components.</p></caption>
<graphic xlink:href="fpsyg-08-01269-g006.tif"/>
</fig>
</sec>
</sec>
<sec><title>Discussion</title>
<p>The present study employed a modified emotional task in which prediction was manipulated by a cue to investigate how top-down prediction influences emotional face perception in social anxiety. Behaviorally, we observed an effect of prediction, with higher accuracy for predictable than for unpredictable faces. Furthermore, we found that HSA but not LSA participants recognized angry faces more accurately than happy faces. For the P100 and P200 components, HSA participants showed enhanced brain activity in response to angry faces compared to happy faces. Moreover, HSA participants exhibited larger N170 amplitudes than LSA participants in the right hemisphere electrodes when they perceived unpredictable, but not predictable, angry faces. The subsequent correlation analysis yielded significant negative correlations between the N170 effect of prediction and the LSAS-SR total scores.</p>
<p>The current work revealed a clear effect of prediction at the behavioral level. This facilitated response to predictable faces might reflect the fact that prior predictions allow the brain to reduce the number of candidate representations of an object that need to be considered (<xref ref-type="bibr" rid="B25">Kveraga et al., 2007</xref>). Moreover, we found that HSA participants were sensitive to angry faces, which is consistent with previous behavioral studies of social anxiety (<xref ref-type="bibr" rid="B21">Klumpp and Amir, 2009</xref>; <xref ref-type="bibr" rid="B49">Stevens et al., 2009</xref>; <xref ref-type="bibr" rid="B48">Staugaard, 2010</xref>). For instance, <xref ref-type="bibr" rid="B49">Stevens et al. (2009)</xref> found that HSA participants showed an attentional bias toward unambiguously angry faces.</p>
<p>The current electrophysiological results showed larger P100 amplitudes for angry faces in HSA participants when these faces were predictable, suggesting hypervigilance to angry faces. This finding is consistent with our behavioral results and with data from prior ERP studies. Using ERP measures, <xref ref-type="bibr" rid="B27">Mueller et al. (2009)</xref> observed that SAD potentiated P100 amplitudes in response to angry-neutral relative to happy-neutral face-pairs. Other studies, however, reported that SAD led to higher P100 amplitudes in response to schematic, artificial, and natural facial stimuli, regardless of expression (e.g., <xref ref-type="bibr" rid="B28">M&#x00FC;hlberger et al., 2009</xref>; <xref ref-type="bibr" rid="B38">Rossignol et al., 2012</xref>; <xref ref-type="bibr" rid="B30">Peschard et al., 2013</xref>), suggesting the absence of a threat-specific enhancement in SAD. While caution should be taken in interpreting these results, one reason for the discrepancy might be that the perception of emotional faces by HSA participants in the present experiment was influenced by top-down prediction. In the study of <xref ref-type="bibr" rid="B38">Rossignol et al. (2012)</xref>, participants performed a spatial cueing task, whereas we adopted an emotional cueing task.</p>
<p>Unlike the P100 component, HSA participants exhibited more positive P200 amplitudes in response to angry faces compared with happy faces irrespective of top-down prediction, indicating no moderating effect of prediction on the P200 finding of hypervigilance for angry faces. A similar pattern of results has been found in previous ERP studies, which reported hypervigilance toward angry faces in P200 amplitudes in a masked and an unmasked emotional Stroop task (<xref ref-type="bibr" rid="B52">van Peer et al., 2010</xref>).</p>
<p>As suggested by <xref ref-type="bibr" rid="B47">Stahl et al. (2008)</xref>, increased amplitudes and longer latencies of the N170 are associated with disturbed holistic processing of faces. The present study found that HSA participants exhibited larger N170 amplitudes than LSA participants in the right hemisphere electrodes when they perceived unpredictable angry faces, presumably reflecting a disturbed face representation in HSA participants. The abnormal processing of angry faces is in line with numerous studies examining the moderating effect of social anxiety on the N170 (<xref ref-type="bibr" rid="B24">Kolassa and Miltner, 2006</xref>; <xref ref-type="bibr" rid="B56">Wieser et al., 2010</xref>). The interpersonal theory of social anxiety (<xref ref-type="bibr" rid="B1">Alden and Taylor, 2004</xref>) proposes that SAD is an interpersonal disorder, a condition in which anxiety severely disrupts an individual&#x2019;s relationships with others. Such a tendency for individuals with social anxiety to experience social inhibition may lead to a lack of experience in the perception of social stimuli, such as emotional faces. It is widely accepted that less experience with human faces leads to relatively more feature-based processing of them (<xref ref-type="bibr" rid="B15">Hugenberg et al., 2007</xref>; <xref ref-type="bibr" rid="B45">Short and Mondloch, 2013</xref>).</p>
<p>Interestingly, when HSA participants perceived predictable angry faces, they showed no differences in N170 amplitudes compared to LSA participants in the right hemisphere electrodes. This suggests that top-down prediction may improve the deficiencies in building a holistic face representation in HSA participants. Recently, behavioral and neurophysiological studies have reported that the visual stimuli involving mostly feature-based processing were perceived in a holistic manner when these stimuli were predictable (<xref ref-type="bibr" rid="B36">Ran et al., 2014b</xref>,<xref ref-type="bibr" rid="B37">c</xref>), indicating that prediction induced a switch from feature-based to holistic processing. In this case, HSA participants with a disturbed face representation adopted a holistic coding strategy for perceiving human faces when they were predictable.</p>
<p>There was a significant negative correlation between the N170 effect of prediction and the total scores on the LSAS-SR in the electrodes of both hemispheres for happy faces in LSA participants. This correlation suggests that LSA participants with milder anxiety symptoms showed an enhanced ability to predict upcoming positive emotional events. However, no significant correlation was found in HSA participants. This needs to be examined further in future studies.</p>
<p>Two clusters of interest, namely the P100- and N170-clusters, were found in the early time window. The P100-cluster had a dipole located in the left occipital gyrus, while the N170-cluster showed a generator site in the right occipital gyrus. This suggests that the moderating effect of prediction on SAD is based on the activation of neuronal populations in the bilateral occipital gyrus. A similar source-localization result has been reported in previous fMRI studies, which found that the N170 component was generated in the right occipital gyrus (e.g., <xref ref-type="bibr" rid="B31">Qiu et al., 2008</xref>). Although some studies have employed a frequency cue to manipulate prediction (<xref ref-type="bibr" rid="B4">Barbalat et al., 2013</xref>; <xref ref-type="bibr" rid="B8">Chen et al., 2015</xref>; <xref ref-type="bibr" rid="B32">Ran et al., 2016a</xref>,<xref ref-type="bibr" rid="B34">c</xref>), such a frequency manipulation is not optimal. It would be preferable to adopt a contingent-probability design in future research, as it is more informative than the frequency manipulation.</p>
</sec>
<sec><title>Conclusion</title>
<p>Although a wealth of research has examined facial processing (<xref ref-type="bibr" rid="B26">Luo et al., 2010</xref>; <xref ref-type="bibr" rid="B43">Schmitz et al., 2012</xref>; <xref ref-type="bibr" rid="B35">Ran et al., 2014a</xref>), no study has directly investigated the influence of top-down prediction on emotional face perception in social anxiety. In the present study, we observed a clear effect of prediction at the behavioral level. The P100 results showed larger amplitudes for angry faces in HSA participants when the faces were predictable, suggesting hypervigilance to angry faces. Unlike the P100 results, HSA participants exhibited more positive P200 amplitudes for angry compared with happy faces irrespective of top-down prediction, indicating no moderating effect of prediction on the P200 finding of hypervigilance for angry faces. Crucially, HSA participants exhibited larger N170 amplitudes than LSA participants in the right hemisphere electrodes when they perceived unpredictable, but not predictable, angry faces. This result suggests that top-down prediction may remedy the deficiency in building a holistic face representation in HSA participants.</p>
</sec>
<sec><title>Author Contributions</title>
<p>GR and XC designed the experiments. GR collected and analyzed the data. GR primarily wrote the manuscript. All authors discussed the results and commented on the manuscript.</p>
</sec>
<sec><title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<fn-group>
<fn fn-type="financial-disclosure">
<p><bold>Funding.</bold> This research was supported by the National Natural Science Foundation of China (No. 31571146).</p>
</fn>
</fn-group>
<ack>
<p>The authors are grateful to Qi Zhang for setting up and running the experiments. We also thank the participants in the study.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Alden</surname> <given-names>L. E.</given-names></name> <name><surname>Taylor</surname> <given-names>C. T.</given-names></name></person-group> (<year>2004</year>). <article-title>Interpersonal processes in social phobia.</article-title> <source><italic>Clin. Psychol. Rev.</italic></source> <volume>24</volume> <fpage>857</fpage>&#x2013;<lpage>882</lpage>. <pub-id pub-id-type="doi">10.1016/j.cpr.2004.07.006</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Alden</surname> <given-names>L. E.</given-names></name> <name><surname>Taylor</surname> <given-names>C. T.</given-names></name></person-group> (<year>2011</year>). <article-title>Relational treatment strategies increase social approach behaviors in patients with generalized social anxiety disorder.</article-title> <source><italic>J. Anxiety Disord.</italic></source> <volume>25</volume> <fpage>309</fpage>&#x2013;<lpage>318</lpage>. <pub-id pub-id-type="doi">10.1016/j.janxdis.2010.10.003</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Amir</surname> <given-names>N.</given-names></name> <name><surname>Beard</surname> <given-names>C.</given-names></name> <name><surname>Taylor</surname> <given-names>C. T.</given-names></name> <name><surname>Klumpp</surname> <given-names>H.</given-names></name> <name><surname>Elias</surname> <given-names>J.</given-names></name> <name><surname>Burns</surname> <given-names>M.</given-names></name><etal/></person-group> (<year>2009</year>). <article-title>Attention training in individuals with generalized social phobia: a randomized controlled trial.</article-title> <source><italic>J. Consult. Clin. Psychol.</italic></source> <volume>77</volume> <fpage>961</fpage>&#x2013;<lpage>973</lpage>. <pub-id pub-id-type="doi">10.1037/a0016685</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Barbalat</surname> <given-names>G.</given-names></name> <name><surname>Bazargani</surname> <given-names>N.</given-names></name> <name><surname>Blakemore</surname> <given-names>S. J.</given-names></name></person-group> (<year>2013</year>). <article-title>The influence of prior expectations on emotional face perception in adolescence.</article-title> <source><italic>Cereb. Cortex</italic></source> <volume>23</volume> <fpage>1542</fpage>&#x2013;<lpage>1551</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhs140</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Beck</surname> <given-names>A. T.</given-names></name> <name><surname>Beamesderfer</surname> <given-names>A.</given-names></name></person-group> (<year>1974</year>). <article-title>Assessment of depression: the depression inventory.</article-title> <source><italic>Mod. Probl. Pharmacopsychiatry</italic></source> <volume>7</volume> <fpage>151</fpage>&#x2013;<lpage>169</lpage>. <pub-id pub-id-type="doi">10.1159/000395074</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Beesdo</surname> <given-names>K.</given-names></name> <name><surname>Bittner</surname> <given-names>A.</given-names></name> <name><surname>Pine</surname> <given-names>D. S.</given-names></name> <name><surname>Stein</surname> <given-names>M. B.</given-names></name> <name><surname>H&#x00F6;fler</surname> <given-names>M.</given-names></name> <name><surname>Lieb</surname> <given-names>R.</given-names></name><etal/></person-group> (<year>2007</year>). <article-title>Incidence of social anxiety disorder and the consistent risk for secondary depression in the first three decades of life.</article-title> <source><italic>Arch. Gen. Psychiatry</italic></source> <volume>64</volume> <fpage>903</fpage>&#x2013;<lpage>912</lpage>. <pub-id pub-id-type="doi">10.1001/archpsyc.64.8.903</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Caharel</surname> <given-names>S.</given-names></name> <name><surname>Montalan</surname> <given-names>B.</given-names></name> <name><surname>Fromager</surname> <given-names>E.</given-names></name> <name><surname>Bernard</surname> <given-names>C.</given-names></name> <name><surname>Lalonde</surname> <given-names>R.</given-names></name> <name><surname>Mohamed</surname> <given-names>R.</given-names></name></person-group> (<year>2011</year>). <article-title>Other-race and inversion effects during the structural encoding stage of face processing in a race categorization task: an event-related brain potential study.</article-title> <source><italic>Int. J. Psychophysiol.</italic></source> <volume>79</volume> <fpage>266</fpage>&#x2013;<lpage>271</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2010.10.018</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>X.</given-names></name> <name><surname>Ran</surname> <given-names>G.</given-names></name> <name><surname>Zhang</surname> <given-names>Q.</given-names></name> <name><surname>Hu</surname> <given-names>T.</given-names></name></person-group> (<year>2015</year>). <article-title>Unconscious attention modulates the silencing effect of top-down predictions.</article-title> <source><italic>Conscious. Cogn.</italic></source> <volume>34</volume> <fpage>63</fpage>&#x2013;<lpage>72</lpage>. <pub-id pub-id-type="doi">10.1016/j.concog.2015.03.010</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Coles</surname> <given-names>M. E.</given-names></name> <name><surname>Heimberg</surname> <given-names>R. G.</given-names></name> <name><surname>Liebowitz</surname> <given-names>M.</given-names></name> <name><surname>Hami</surname> <given-names>S.</given-names></name> <name><surname>Stein</surname> <given-names>M.</given-names></name> <name><surname>Goetz</surname> <given-names>D.</given-names></name></person-group> (<year>2001</year>). <article-title>The Liebowitz social anxiety scale: a comparison of the psychometric properties of self-report and clinician-administered formats.</article-title> <source><italic>Psychol. Med.</italic></source> <volume>31</volume> <fpage>1025</fpage>&#x2013;<lpage>1035</lpage>. <pub-id pub-id-type="doi">10.1017/S0033291701004056</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Delong</surname> <given-names>H.</given-names></name> <name><surname>Pollack</surname> <given-names>M. H.</given-names></name></person-group> (<year>2008</year>). <article-title>Update on the assessment, diagnosis, and treatment of individuals with social anxiety disorder.</article-title> <source><italic>Focus</italic></source> <volume>2008</volume> <fpage>431</fpage>&#x2013;<lpage>437</lpage>. <pub-id pub-id-type="doi">10.1176/foc.6.4.foc431</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gardener</surname> <given-names>E. K.</given-names></name> <name><surname>Carr</surname> <given-names>A. R.</given-names></name> <name><surname>MacGregor</surname> <given-names>A.</given-names></name> <name><surname>Felmingham</surname> <given-names>K. L.</given-names></name></person-group> (<year>2013</year>). <article-title>Sex differences and emotion regulation: an event-related potential study.</article-title> <source><italic>PLoS ONE</italic></source> <volume>8</volume>:<issue>e73475</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0073475</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Han</surname> <given-names>S.</given-names></name> <name><surname>Jiang</surname> <given-names>Y.</given-names></name> <name><surname>Mao</surname> <given-names>L.</given-names></name> <name><surname>Humphreys</surname> <given-names>G. W.</given-names></name> <name><surname>Qin</surname> <given-names>J.</given-names></name></person-group> (<year>2005</year>). <article-title>Attentional modulation of perceptual grouping in human visual cortex: ERP studies.</article-title> <source><italic>Hum. Brain Mapp.</italic></source> <volume>26</volume> <fpage>199</fpage>&#x2013;<lpage>209</lpage>. <pub-id pub-id-type="doi">10.1002/hbm.20157</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Harbort</surname> <given-names>J.</given-names></name> <name><surname>Witth&#x00F6;ft</surname> <given-names>M.</given-names></name> <name><surname>Spiegel</surname> <given-names>J.</given-names></name> <name><surname>Nick</surname> <given-names>K.</given-names></name> <name><surname>Hecht</surname> <given-names>H.</given-names></name></person-group> (<year>2013</year>). <article-title>The widening of the gaze cone in patients with social anxiety disorder and its normalization after CBT.</article-title> <source><italic>Behav. Res. Ther.</italic></source> <volume>51</volume> <fpage>359</fpage>&#x2013;<lpage>367</lpage>. <pub-id pub-id-type="doi">10.1016/j.brat.2013.03.009</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Herrmann</surname> <given-names>M.</given-names></name> <name><surname>Schreppel</surname> <given-names>T.</given-names></name> <name><surname>J&#x00E4;ger</surname> <given-names>D.</given-names></name> <name><surname>Koehler</surname> <given-names>S.</given-names></name> <name><surname>Ehlis</surname> <given-names>A.-C.</given-names></name> <name><surname>Fallgatter</surname> <given-names>A.</given-names></name></person-group> (<year>2007</year>). <article-title>The other-race effect for face perception: an event-related potential study.</article-title> <source><italic>J. Neural Transm.</italic></source> <volume>114</volume> <fpage>951</fpage>&#x2013;<lpage>957</lpage>. <pub-id pub-id-type="doi">10.1007/s00702-007-0624-9</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hugenberg</surname> <given-names>K.</given-names></name> <name><surname>Miller</surname> <given-names>J.</given-names></name> <name><surname>Claypool</surname> <given-names>H. M.</given-names></name></person-group> (<year>2007</year>). <article-title>Categorization and individuation in the cross-race recognition deficit: toward a solution to an insidious problem.</article-title> <source><italic>J. Exp. Soc. Psychol.</italic></source> <volume>43</volume> <fpage>334</fpage>&#x2013;<lpage>340</lpage>. <pub-id pub-id-type="doi">10.1016/j.jesp.2006.02.010</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Alain</surname> <given-names>C.</given-names></name> <name><surname>Sedore</surname> <given-names>K.</given-names></name> <name><surname>Mcintosh</surname> <given-names>A. R.</given-names></name></person-group> (<year>2007</year>). <article-title>Early face processing specificity: it&#x2019;s in the eyes!</article-title> <source><italic>J. Cogn. Neurosci.</italic></source> <volume>19</volume> <fpage>1815</fpage>&#x2013;<lpage>1826</lpage>. <pub-id pub-id-type="doi">10.1162/jocn.2007.19.11.1815</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Batty</surname> <given-names>M.</given-names></name></person-group> (<year>2009</year>). <article-title>Neural bases of eye and gaze processing: the core of social cognition.</article-title> <source><italic>Neurosci. Biobehav. Rev.</italic></source> <volume>33</volume> <fpage>843</fpage>&#x2013;<lpage>863</lpage>. <pub-id pub-id-type="doi">10.1016/j.neubiorev.2009.02.004</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Latinus</surname> <given-names>M.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2006</year>). <article-title>Face, eye and object early processing: what is the face specificity?</article-title> <source><italic>Neuroimage</italic></source> <volume>29</volume> <fpage>667</fpage>&#x2013;<lpage>676</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2005.07.041</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2002</year>). <article-title>Inversion and contrast polarity reversal affect both encoding and recognition processes of unfamiliar faces: a repetition study using ERPs.</article-title> <source><italic>Neuroimage</italic></source> <volume>15</volume> <fpage>353</fpage>&#x2013;<lpage>372</lpage>. <pub-id pub-id-type="doi">10.1006/nimg.2001.0982</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Judah</surname> <given-names>M. R.</given-names></name> <name><surname>Grant</surname> <given-names>D. M.</given-names></name> <name><surname>Carlisle</surname> <given-names>N. B.</given-names></name></person-group> (<year>2015</year>). <article-title>The effects of self-focus on attentional biases in social anxiety: an ERP study.</article-title> <source><italic>Cogn. Affect. Behav. Neurosci.</italic></source> <volume>16</volume> <fpage>1</fpage>&#x2013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.3758/s13415-015-0398-8</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Klumpp</surname> <given-names>H.</given-names></name> <name><surname>Amir</surname> <given-names>N.</given-names></name></person-group> (<year>2009</year>). <article-title>Examination of vigilance and disengagement of threat in social anxiety with a probe detection task.</article-title> <source><italic>Anxiety Stress Coping</italic></source> <volume>22</volume> <fpage>283</fpage>&#x2013;<lpage>296</lpage>. <pub-id pub-id-type="doi">10.1080/10615800802449602</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kolassa</surname> <given-names>I.-T.</given-names></name> <name><surname>Kolassa</surname> <given-names>S.</given-names></name> <name><surname>Bergmann</surname> <given-names>S.</given-names></name> <name><surname>Lauche</surname> <given-names>R.</given-names></name> <name><surname>Dilger</surname> <given-names>S.</given-names></name> <name><surname>Miltner</surname> <given-names>W. H.</given-names></name><etal/></person-group> (<year>2009</year>). <article-title>Interpretive bias in social phobia: an ERP study with morphed emotional schematic faces.</article-title> <source><italic>Cogn. Emot.</italic></source> <volume>23</volume> <fpage>69</fpage>&#x2013;<lpage>95</lpage>. <pub-id pub-id-type="doi">10.1080/02699930801940461</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kolassa</surname> <given-names>I.-T.</given-names></name> <name><surname>Kolassa</surname> <given-names>S.</given-names></name> <name><surname>Musial</surname> <given-names>F.</given-names></name> <name><surname>Miltner</surname> <given-names>W. H.</given-names></name></person-group> (<year>2007</year>). <article-title>Event-related potentials to schematic faces in social phobia.</article-title> <source><italic>Cogn. Emot.</italic></source> <volume>21</volume> <fpage>1721</fpage>&#x2013;<lpage>1744</lpage>. <pub-id pub-id-type="doi">10.1080/02699930701229189</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kolassa</surname> <given-names>I.-T.</given-names></name> <name><surname>Miltner</surname> <given-names>W. H.</given-names></name></person-group> (<year>2006</year>). <article-title>Psychophysiological correlates of face processing in social phobia.</article-title> <source><italic>Brain Res.</italic></source> <volume>1118</volume> <fpage>130</fpage>&#x2013;<lpage>141</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2006.08.019</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kveraga</surname> <given-names>K.</given-names></name> <name><surname>Ghuman</surname> <given-names>A. S.</given-names></name> <name><surname>Bar</surname> <given-names>M.</given-names></name></person-group> (<year>2007</year>). <article-title>Top-down predictions in the cognitive brain.</article-title> <source><italic>Brain Cogn.</italic></source> <volume>65</volume> <fpage>145</fpage>&#x2013;<lpage>168</lpage>. <pub-id pub-id-type="doi">10.1016/j.bandc.2007.06.007</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Luo</surname> <given-names>W.</given-names></name> <name><surname>Feng</surname> <given-names>W.</given-names></name> <name><surname>He</surname> <given-names>W.</given-names></name> <name><surname>Wang</surname> <given-names>N.-Y.</given-names></name> <name><surname>Luo</surname> <given-names>Y.-J.</given-names></name></person-group> (<year>2010</year>). <article-title>Three stages of facial expression processing: ERP study with rapid serial visual presentation.</article-title> <source><italic>Neuroimage</italic></source> <volume>49</volume> <fpage>1857</fpage>&#x2013;<lpage>1867</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2009.09.018</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mueller</surname> <given-names>E.</given-names></name> <name><surname>Hofmann</surname> <given-names>S.</given-names></name> <name><surname>Santesso</surname> <given-names>D.</given-names></name> <name><surname>Meuret</surname> <given-names>A.</given-names></name> <name><surname>Bitran</surname> <given-names>S.</given-names></name> <name><surname>Pizzagalli</surname> <given-names>D. A.</given-names></name></person-group> (<year>2009</year>). <article-title>Electrophysiological evidence of attentional biases in social anxiety disorder.</article-title> <source><italic>Psychol. Med.</italic></source> <volume>39</volume> <fpage>1141</fpage>&#x2013;<lpage>1152</lpage>. <pub-id pub-id-type="doi">10.1017/S0033291708004820</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>M&#x00FC;hlberger</surname> <given-names>A.</given-names></name> <name><surname>Wieser</surname> <given-names>M. J.</given-names></name> <name><surname>Herrmann</surname> <given-names>M. J.</given-names></name> <name><surname>Weyers</surname> <given-names>P.</given-names></name> <name><surname>Tr&#x00F6;ger</surname> <given-names>C.</given-names></name> <name><surname>Pauli</surname> <given-names>P.</given-names></name></person-group> (<year>2009</year>). <article-title>Early cortical processing of natural and artificial emotional faces differs between lower and higher socially anxious persons.</article-title> <source><italic>J. Neural Transm.</italic></source> <volume>116</volume> <fpage>735</fpage>&#x2013;<lpage>746</lpage>. <pub-id pub-id-type="doi">10.1007/s00702-008-0108-6</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nikolla</surname> <given-names>D.</given-names></name> <name><surname>Edgar</surname> <given-names>G.</given-names></name> <name><surname>Catherwood</surname> <given-names>D.</given-names></name> <name><surname>Matthews</surname> <given-names>T.</given-names></name></person-group> (<year>2017</year>). <article-title>Can bottom-up processes of attention be a source of &#x2018;interference&#x2019; in situations where top-down control of attention is crucial?</article-title> <source><italic>Br. J. Psychol.</italic></source> <pub-id pub-id-type="doi">10.1111/bjop.12251</pub-id> <comment>[Epub ahead of print]</comment>.</citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peschard</surname> <given-names>V.</given-names></name> <name><surname>Philippot</surname> <given-names>P.</given-names></name> <name><surname>Joassin</surname> <given-names>F.</given-names></name> <name><surname>Rossignol</surname> <given-names>M.</given-names></name></person-group> (<year>2013</year>). <article-title>The impact of the stimulus features and task instructions on facial processing in social anxiety: an ERP investigation.</article-title> <source><italic>Biol. Psychol.</italic></source> <volume>93</volume> <fpage>88</fpage>&#x2013;<lpage>96</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsycho.2013.01.009</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Qiu</surname> <given-names>J.</given-names></name> <name><surname>Li</surname> <given-names>H.</given-names></name> <name><surname>Zhang</surname> <given-names>Q.</given-names></name> <name><surname>Liu</surname> <given-names>Q.</given-names></name> <name><surname>Zhang</surname> <given-names>F.</given-names></name></person-group> (<year>2008</year>). <article-title>The M&#x00FC;ller-Lyer illusion seen by the brain: an event-related brain potentials study.</article-title> <source><italic>Biol. Psychol.</italic></source> <volume>77</volume> <fpage>150</fpage>&#x2013;<lpage>158</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsycho.2007.10.002</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ran</surname> <given-names>G.</given-names></name> <name><surname>Chen</surname> <given-names>X.</given-names></name> <name><surname>Cao</surname> <given-names>X.</given-names></name> <name><surname>Zhang</surname> <given-names>Q.</given-names></name></person-group> (<year>2016a</year>). <article-title>Prediction and unconscious attention operate synergistically to facilitate stimulus processing: an fMRI study.</article-title> <source><italic>Conscious. Cogn.</italic></source> <volume>44</volume> <fpage>41</fpage>&#x2013;<lpage>50</lpage>. <pub-id pub-id-type="doi">10.1016/j.concog.2016.06.016</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ran</surname> <given-names>G.</given-names></name> <name><surname>Chen</surname> <given-names>X.</given-names></name> <name><surname>Zhang</surname> <given-names>X.</given-names></name> <name><surname>Ma</surname> <given-names>Y.</given-names></name></person-group> (<year>2016b</year>). <article-title>The neural mechanism for the superiority effect of social prediction.</article-title> <source><italic>Adv. Psychol. Sci.</italic></source> <volume>24</volume> <fpage>684</fpage>&#x2013;<lpage>691</lpage>. <pub-id pub-id-type="doi">10.3724/SP.J.1042.2016.00684</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ran</surname> <given-names>G.</given-names></name> <name><surname>Chen</surname> <given-names>X.</given-names></name> <name><surname>Zhang</surname> <given-names>Q.</given-names></name> <name><surname>Ma</surname> <given-names>Y.</given-names></name> <name><surname>Zhang</surname> <given-names>X.</given-names></name></person-group> (<year>2016c</year>). <article-title>Attention modulates neural responses to unpredictable emotional faces in dorsolateral prefrontal cortex.</article-title> <source><italic>Front. Hum. Neurosci.</italic></source> <volume>10</volume>:<issue>332</issue>. <pub-id pub-id-type="doi">10.3389/fnhum.2016.00332</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ran</surname> <given-names>G.</given-names></name> <name><surname>Chen</surname> <given-names>X.</given-names></name> <name><surname>Pan</surname> <given-names>Y.</given-names></name></person-group> (<year>2014a</year>). <article-title>Human sex differences in emotional processing of own-race and other-race faces.</article-title> <source><italic>Neuroreport</italic></source> <volume>25</volume> <fpage>683</fpage>&#x2013;<lpage>687</lpage>. <pub-id pub-id-type="doi">10.1097/WNR.0000000000000158</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ran</surname> <given-names>G.</given-names></name> <name><surname>Zhang</surname> <given-names>Q.</given-names></name> <name><surname>Chen</surname> <given-names>X.</given-names></name> <name><surname>Pan</surname> <given-names>Y.</given-names></name></person-group> (<year>2014b</year>). <article-title>The effects of prediction on the perception for own-race and other-race faces.</article-title> <source><italic>PLoS ONE</italic></source> <volume>9</volume>:<issue>e114011</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0114011</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ran</surname> <given-names>G. M.</given-names></name> <name><surname>Chen</surname> <given-names>X.</given-names></name> <name><surname>Pan</surname> <given-names>Y. G.</given-names></name> <name><surname>Hu</surname> <given-names>T. Q.</given-names></name> <name><surname>Ma</surname> <given-names>J.</given-names></name></person-group> (<year>2014c</year>). <article-title>Effects of anticipation on perception of facial expressions.</article-title> <source><italic>Percept. Mot. Skills</italic></source> <volume>118</volume> <fpage>195</fpage>&#x2013;<lpage>209</lpage>. <pub-id pub-id-type="doi">10.2466/24.PMS.118k13w4</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rossignol</surname> <given-names>M.</given-names></name> <name><surname>Philippot</surname> <given-names>P.</given-names></name> <name><surname>Bissot</surname> <given-names>C.</given-names></name> <name><surname>Rigoulot</surname> <given-names>S.</given-names></name> <name><surname>Campanella</surname> <given-names>S.</given-names></name></person-group> (<year>2012</year>). <article-title>Electrophysiological correlates of enhanced perceptual processes and attentional capture by emotional faces in social anxiety.</article-title> <source><italic>Brain Res.</italic></source> <volume>1460</volume> <fpage>50</fpage>&#x2013;<lpage>62</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2012.04.034</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Delvenne</surname> <given-names>J. F.</given-names></name> <name><surname>Debatisse</surname> <given-names>D.</given-names></name> <name><surname>Goffaux</surname> <given-names>V.</given-names></name> <name><surname>Bruyer</surname> <given-names>R.</given-names></name> <name><surname>Crommelinck</surname> <given-names>M.</given-names></name><etal/></person-group> (<year>1999</year>). <article-title>Spatio-temporal localization of the face inversion effect: an event-related potentials study.</article-title> <source><italic>Biol. Psychol.</italic></source> <volume>50</volume> <fpage>173</fpage>&#x2013;<lpage>189</lpage>. <pub-id pub-id-type="doi">10.1016/S0301-0511(99)00013-7</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ruscio</surname> <given-names>A. M.</given-names></name> <name><surname>Brown</surname> <given-names>T. A.</given-names></name> <name><surname>Chiu</surname> <given-names>W. T.</given-names></name> <name><surname>Sareen</surname> <given-names>J.</given-names></name> <name><surname>Stein</surname> <given-names>M. B.</given-names></name> <name><surname>Kessler</surname> <given-names>R. C.</given-names></name></person-group> (<year>2008</year>). <article-title>Social fears and social phobia in the USA: results from the national comorbidity survey replication.</article-title> <source><italic>Psychol. Med.</italic></source> <volume>38</volume> <fpage>15</fpage>&#x2013;<lpage>28</lpage>. <pub-id pub-id-type="doi">10.1017/S0033291707001699</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rytwinski</surname> <given-names>N. K.</given-names></name> <name><surname>Fresco</surname> <given-names>D. M.</given-names></name> <name><surname>Heimberg</surname> <given-names>R. G.</given-names></name> <name><surname>Coles</surname> <given-names>M. E.</given-names></name> <name><surname>Liebowitz</surname> <given-names>M. R.</given-names></name> <name><surname>Cissell</surname> <given-names>S.</given-names></name><etal/></person-group> (<year>2009</year>). <article-title>Screening for social anxiety disorder with the self-report version of the liebowitz social anxiety scale.</article-title> <source><italic>Depress. Anxiety</italic></source> <volume>26</volume> <fpage>34</fpage>&#x2013;<lpage>38</lpage>. <pub-id pub-id-type="doi">10.1002/da.20503</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schendan</surname> <given-names>H. E.</given-names></name> <name><surname>Ganis</surname> <given-names>G.</given-names></name> <name><surname>Kutas</surname> <given-names>M.</given-names></name></person-group> (<year>1998</year>). <article-title>Neurophysiological evidence for visual perceptual categorization of words and faces within 150 ms.</article-title> <source><italic>Psychophysiology</italic></source> <volume>35</volume> <fpage>240</fpage>&#x2013;<lpage>251</lpage>. <pub-id pub-id-type="doi">10.1017/S004857729897010X</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schmitz</surname> <given-names>J.</given-names></name> <name><surname>Scheel</surname> <given-names>C. N.</given-names></name> <name><surname>Rigon</surname> <given-names>A.</given-names></name> <name><surname>Gross</surname> <given-names>J. J.</given-names></name> <name><surname>Blechert</surname> <given-names>J.</given-names></name></person-group> (<year>2012</year>). <article-title>You don&#x2019;t like me, do you? Enhanced ERP responses to averted eye gaze in social anxiety.</article-title> <source><italic>Biol. Psychol.</italic></source> <volume>91</volume> <fpage>263</fpage>&#x2013;<lpage>269</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsycho.2012.08.004</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schupp</surname> <given-names>H. T.</given-names></name> <name><surname>&#x00D6;hman</surname> <given-names>A.</given-names></name> <name><surname>Jungh&#x00F6;fer</surname> <given-names>M.</given-names></name> <name><surname>Weike</surname> <given-names>A. I.</given-names></name> <name><surname>Stockburger</surname> <given-names>J.</given-names></name> <name><surname>Hamm</surname> <given-names>A. O.</given-names></name></person-group> (<year>2004</year>). <article-title>The facilitated processing of threatening faces: an ERP analysis.</article-title> <source><italic>Emotion</italic></source> <volume>4</volume> <fpage>189</fpage>&#x2013;<lpage>200</lpage>. <pub-id pub-id-type="doi">10.1037/1528-3542.4.2.189</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Short</surname> <given-names>L. A.</given-names></name> <name><surname>Mondloch</surname> <given-names>C. J.</given-names></name></person-group> (<year>2013</year>). <article-title>Aging faces and aging perceivers: young and older adults are less sensitive to deviations from normality in older than in young adult faces.</article-title> <source><italic>Perception</italic></source> <volume>42</volume> <fpage>795</fpage>&#x2013;<lpage>812</lpage>. <pub-id pub-id-type="doi">10.1068/p7380</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Spielberger</surname> <given-names>C. D.</given-names></name> <name><surname>Gorsuch</surname> <given-names>R. L.</given-names></name> <name><surname>Lushene</surname> <given-names>R.</given-names></name> <name><surname>Vagg</surname> <given-names>P. R.</given-names></name> <name><surname>Jacobs</surname> <given-names>G. A.</given-names></name></person-group> (<year>1983</year>). <source><italic>Manual for the State-Trait Anxiety Inventory.</italic></source> <publisher-loc>Palo Alto, CA</publisher-loc>: <publisher-name>Consulting Psychologists Press</publisher-name>.</citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stahl</surname> <given-names>J.</given-names></name> <name><surname>Wiese</surname> <given-names>H.</given-names></name> <name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name></person-group> (<year>2008</year>). <article-title>Expertise and own-race bias in face processing: an event-related potential study.</article-title> <source><italic>Neuroreport</italic></source> <volume>19</volume> <fpage>583</fpage>&#x2013;<lpage>587</lpage>. <pub-id pub-id-type="doi">10.1097/WNR.0b013e3282f97b4d</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Staugaard</surname> <given-names>S. R.</given-names></name></person-group> (<year>2010</year>). <article-title>Threatening faces and social anxiety: a literature review.</article-title> <source><italic>Clin. Psychol. Rev.</italic></source> <volume>30</volume> <fpage>669</fpage>&#x2013;<lpage>690</lpage>. <pub-id pub-id-type="doi">10.1016/j.cpr.2010.05.001</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stevens</surname> <given-names>S.</given-names></name> <name><surname>Rist</surname> <given-names>F.</given-names></name> <name><surname>Gerlach</surname> <given-names>A. L.</given-names></name></person-group> (<year>2009</year>). <article-title>Influence of alcohol on the processing of emotional facial expressions in individuals with social phobia.</article-title> <source><italic>Br. J. Clin. Psychol.</italic></source> <volume>48</volume> <fpage>125</fpage>&#x2013;<lpage>140</lpage>. <pub-id pub-id-type="doi">10.1348/014466508X368856</pub-id></citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Taylor</surname> <given-names>M.</given-names></name> <name><surname>Khan</surname> <given-names>S.</given-names></name></person-group> (<year>2000</year>). <article-title>Top-down modulation of early selective attention processes in children.</article-title> <source><italic>Int. J. Psychophysiol.</italic></source> <volume>37</volume> <fpage>135</fpage>&#x2013;<lpage>147</lpage>. <pub-id pub-id-type="doi">10.1016/S0167-8760(00)00084-2</pub-id></citation></ref>
<ref id="B51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tortosa</surname> <given-names>M. I.</given-names></name> <name><surname>Lupi&#x00E1;&#x00F1;ez</surname> <given-names>J.</given-names></name> <name><surname>Ruz</surname> <given-names>M.</given-names></name></person-group> (<year>2013</year>). <article-title>Race, emotion and trust: an ERP study.</article-title> <source><italic>Brain Res.</italic></source> <volume>1494</volume> <fpage>44</fpage>&#x2013;<lpage>55</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2012.11.037</pub-id></citation></ref>
<ref id="B52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>van Peer</surname> <given-names>J. M.</given-names></name> <name><surname>Spinhoven</surname> <given-names>P.</given-names></name> <name><surname>Roelofs</surname> <given-names>K.</given-names></name></person-group> (<year>2010</year>). <article-title>Psychophysiological evidence for cortisol-induced reduction in early bias for implicit social threat in social phobia.</article-title> <source><italic>Psychoneuroendocrinology</italic></source> <volume>35</volume> <fpage>21</fpage>&#x2013;<lpage>32</lpage>. <pub-id pub-id-type="doi">10.1016/j.psyneuen.2009.09.012</pub-id></citation></ref>
<ref id="B53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>Y.</given-names></name> <name><surname>Luo</surname> <given-names>Y. J.</given-names></name></person-group> (<year>2005</year>). <article-title>Standardization and assessment of college students&#x2019; facial expression of emotion.</article-title> <source><italic>Chin. J. Clin. Psychol.</italic></source> <volume>13</volume> <fpage>396</fpage>&#x2013;<lpage>398</lpage>. <pub-id pub-id-type="doi">10.3969/j.issn.1005-3611.2005.04.006</pub-id></citation></ref>
<ref id="B54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wangelin</surname> <given-names>B. C.</given-names></name> <name><surname>Bradley</surname> <given-names>M. M.</given-names></name> <name><surname>Kastner</surname> <given-names>A.</given-names></name> <name><surname>Lang</surname> <given-names>P. J.</given-names></name></person-group> (<year>2012</year>). <article-title>Affective engagement for facial expressions and emotional scenes: the influence of social anxiety.</article-title> <source><italic>Biol. Psychol.</italic></source> <volume>91</volume> <fpage>103</fpage>&#x2013;<lpage>110</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsycho.2012.05.002</pub-id></citation></ref>
<ref id="B55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wiese</surname> <given-names>H.</given-names></name> <name><surname>Kaufmann</surname> <given-names>J. M.</given-names></name> <name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name></person-group> (<year>2014</year>). <article-title>The neural signature of the own-race bias: evidence from event-related potentials.</article-title> <source><italic>Cereb. Cortex</italic></source> <volume>24</volume> <fpage>826</fpage>&#x2013;<lpage>835</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhs369</pub-id></citation></ref>
<ref id="B56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wieser</surname> <given-names>M. J.</given-names></name> <name><surname>Pauli</surname> <given-names>P.</given-names></name> <name><surname>Reicherts</surname> <given-names>P.</given-names></name> <name><surname>M&#x00FC;hlberger</surname> <given-names>A.</given-names></name></person-group> (<year>2010</year>). <article-title>Don&#x2019;t look at me in anger! Enhanced processing of angry faces in anticipation of public speaking.</article-title> <source><italic>Psychophysiology</italic></source> <volume>47</volume> <fpage>271</fpage>&#x2013;<lpage>280</lpage>. <pub-id pub-id-type="doi">10.1111/j.1469-8986.2009.00938.x</pub-id></citation></ref>
</ref-list>
</back>
</article>