<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2016.01359</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Facial Cosmetics Exert a Greater Influence on Processing of the Mouth Relative to the Eyes: Evidence from the N170 Event-Related Potential Component</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Tanaka</surname> <given-names>Hideaki</given-names></name>
<xref ref-type="author-notes" rid="fn001"><sup>&#x002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/324557/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><institution>Department of Psychology, Faculty of Psychology, Otemon Gakuin University</institution> <country>Ibaraki, Japan</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: <italic>Jim Grange, Keele University, UK</italic></p></fn>
<fn fn-type="edited-by"><p>Reviewed by: <italic>Alex L. Jones, Swansea University, UK; Lindsey A. Short, Redeemer University College, Canada</italic></p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x002A;Correspondence: <italic>Hideaki Tanaka, <email>tanahide@otemon.ac.jp</email></italic></p></fn>
<fn fn-type="other" id="fn002"><p>This article was submitted to Cognition, a section of the journal Frontiers in Psychology</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>05</day>
<month>09</month>
<year>2016</year>
</pub-date>
<pub-date pub-type="collection">
<year>2016</year>
</pub-date>
<volume>07</volume>
<elocation-id>1359</elocation-id>
<history>
<date date-type="received">
<day>16</day>
<month>06</month>
<year>2016</year>
</date>
<date date-type="accepted">
<day>25</day>
<month>08</month>
<year>2016</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2016 Tanaka.</copyright-statement>
<copyright-year>2016</copyright-year>
<copyright-holder>Tanaka</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>Cosmetic makeup significantly influences facial perception. Because faces consist of similar physical structures, cosmetic makeup is typically used to highlight individual features, particularly those of the eyes (i.e., eye shadow) and mouth (i.e., lipstick). Though event-related potentials (ERPs) have been utilized to study various aspects of facial processing, the influence of cosmetics on specific ERP components remains unclear. The present study aimed to investigate the relationship between the application of cosmetic makeup and the amplitudes of the P1 and N170 ERP components during facial perception tasks. N170 amplitude was evaluated under three makeup conditions: Eye Shadow, Lipstick, and No Makeup. Electroencephalography was used to monitor 17 participants who were exposed to visual stimuli under each of these three makeup conditions. The results demonstrated that the Lipstick condition elicited a significantly greater N170 amplitude than the No Makeup condition, while P1 amplitude was unaffected by condition. These findings indicate that the application of cosmetic makeup alters general facial perception but exerts no influence on the perception of low-level visual features. Collectively, these results support the notion that the application of makeup induces subtle alterations in the processing of facial stimuli, with a particular effect on the processing of specific facial components (i.e., the mouth), as reflected by changes in N170 amplitude.</p>
</abstract>
<kwd-group>
<kwd>N170</kwd>
<kwd>event-related potential</kwd>
<kwd>cosmetic makeup</kwd>
<kwd>eyes</kwd>
<kwd>mouth</kwd>
<kwd>face perception</kwd>
</kwd-group>
<counts>
<fig-count count="3"/>
<table-count count="2"/>
<equation-count count="0"/>
<ref-count count="73"/>
<page-count count="9"/>
<word-count count="0"/>
</counts>
</article-meta>
</front>
<body>
<sec><title>Introduction</title>
<p>Women in a number of societies throughout the world have traditionally used cosmetic makeup to modify the visual perception of their facial beauty (<xref ref-type="bibr" rid="B36">Jones and Kramer, 2015</xref>), a process that elicits enhanced ratings of physical attractiveness from both men and women (<xref ref-type="bibr" rid="B25">Graham and Jouhar, 1981</xref>; <xref ref-type="bibr" rid="B67">Ueda and Koyama, 2011</xref>). Such results indicate that cosmetic makeup significantly influences the visual perception of facial stimuli.</p>
<p>While faces consist of similar physical structures, cosmetic makeup is typically used to highlight or emphasize individual features (i.e., eyes, nose, and mouth). Indeed, research has indicated that such enhancements are particularly relevant for the eyes (i.e., eye shadow) and the mouth (i.e., lipstick) (<xref ref-type="bibr" rid="B25">Graham and Jouhar, 1981</xref>; <xref ref-type="bibr" rid="B67">Ueda and Koyama, 2011</xref>; <xref ref-type="bibr" rid="B36">Jones and Kramer, 2015</xref>). The application of cosmetics to female faces has been found to increase facial contrast (<xref ref-type="bibr" rid="B58">Russell, 2009</xref>). According to <xref ref-type="bibr" rid="B57">Russell (2003)</xref>, the &#x201C;consistent luminance difference between the darker regions of the eyes and mouth and the lighter regions of the skin that surround them forms a pattern unique to faces.&#x201D; <xref ref-type="bibr" rid="B59">Russell et al. (2016)</xref> also reported that female faces were rated as healthier and more attractive when this luminance difference was increased than when it was decreased, though the opposite was observed for male faces (<xref ref-type="bibr" rid="B57">Russell, 2003</xref>). In addition, previous studies have indicated that increasing facial contrast via the use of cosmetics plays a role in age perception, such that female faces with greater facial contrast appear younger (<xref ref-type="bibr" rid="B51">Porcheron et al., 2013</xref>; <xref ref-type="bibr" rid="B37">Jones et al., 2015</xref>). Furthermore, female faces to which cosmetics have been applied are considered more feminine and attractive than the same faces without cosmetics (<xref ref-type="bibr" rid="B58">Russell, 2009</xref>; <xref ref-type="bibr" rid="B63">Stephen and McKeegan, 2010</xref>; <xref ref-type="bibr" rid="B37">Jones et al., 2015</xref>). Because increased luminance contrast enhances femininity and attractiveness in female faces but reduces masculinity and attractiveness in male faces (<xref ref-type="bibr" rid="B57">Russell, 2003</xref>, <xref ref-type="bibr" rid="B58">2009</xref>; <xref ref-type="bibr" rid="B63">Stephen and McKeegan, 2010</xref>), only female faces were utilized in the present study.</p>
<p>Changes in facial perception can be detected via the recording of event-related brain potentials (ERPs), from which studies have identified the face-sensitive P1 and N170 components. As the P1 and N170 components are typically regarded as markers of face processing, they are useful for examining the effect of cosmetics on face perception.</p>
<p>P1, an early positive component of the ERP, typically peaks approximately 100 ms after the presentation of facial stimuli. Reported to reflect the processing of low-level visual features such as contrast (<xref ref-type="bibr" rid="B65">Tarkiainen et al., 2002</xref>; <xref ref-type="bibr" rid="B54">Rossion and Caharel, 2011</xref>), P1 amplitude has also been linked to face-specific visual processing (<xref ref-type="bibr" rid="B66">Thierry et al., 2007</xref>; <xref ref-type="bibr" rid="B64">Susac et al., 2009</xref>; <xref ref-type="bibr" rid="B13">Dering et al., 2011</xref>; <xref ref-type="bibr" rid="B42">Luo et al., 2013</xref>). Previous studies indicate that P1 features a medial (O1, O2) or a lateral-occipital scalp distribution, or both (<xref ref-type="bibr" rid="B15">Eimer, 1998</xref>, <xref ref-type="bibr" rid="B16">2000a</xref>; <xref ref-type="bibr" rid="B41">Liu et al., 2002</xref>; <xref ref-type="bibr" rid="B23">Goffaux et al., 2003</xref>; <xref ref-type="bibr" rid="B31">Itier and Taylor, 2004a</xref>,<xref ref-type="bibr" rid="B32">b</xref>; <xref ref-type="bibr" rid="B27">Herrmann et al., 2005</xref>; <xref ref-type="bibr" rid="B47">Okazaki et al., 2008</xref>; <xref ref-type="bibr" rid="B60">Sadeh et al., 2010</xref>; <xref ref-type="bibr" rid="B42">Luo et al., 2013</xref>).</p>
<p>Comparatively, N170 is a negative component evoked at the onset of facial perception that is characterized by a posterior-temporal scalp distribution (P7, PO7, PO8, P8) (<xref ref-type="bibr" rid="B4">Bentin et al., 1996</xref>; <xref ref-type="bibr" rid="B56">Rossion and Jacques, 2008</xref>; <xref ref-type="bibr" rid="B11">Chen et al., 2009</xref>; <xref ref-type="bibr" rid="B49">Peng et al., 2012</xref>; <xref ref-type="bibr" rid="B53">Ran et al., 2014</xref>) and peaks approximately 170 ms after the presentation of facial stimuli. N170 amplitude is significantly greater in response to human faces than other visual images, including cars, hands, houses, furniture, and scrambled faces (<xref ref-type="bibr" rid="B4">Bentin et al., 1996</xref>; <xref ref-type="bibr" rid="B22">George et al., 1996</xref>; <xref ref-type="bibr" rid="B15">Eimer, 1998</xref>, <xref ref-type="bibr" rid="B17">2000b</xref>; <xref ref-type="bibr" rid="B20">Eimer and McCarthy, 1999</xref>; <xref ref-type="bibr" rid="B34">Jemel et al., 1999</xref>; <xref ref-type="bibr" rid="B5">Bentin and Deouell, 2000</xref>). The neural generators of N170 are reported to lie adjacent to the fusiform area, a region previously implicated in facial processing (<xref ref-type="bibr" rid="B4">Bentin et al., 1996</xref>). This is consistent with previous functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), and Brain Electrical Source Analysis (BESA) studies (<xref ref-type="bibr" rid="B52">Puce et al., 1995</xref>; <xref ref-type="bibr" rid="B70">Watanabe et al., 1999a</xref>,<xref ref-type="bibr" rid="B71">b</xref>; <xref ref-type="bibr" rid="B60">Sadeh et al., 2010</xref>; <xref ref-type="bibr" rid="B42">Luo et al., 2013</xref>).</p>
<p>While the P1 component of the ERP appears to reflect a response to lower-level visual features, the N170 component appears to be driven by wide-scale facial perception (<xref ref-type="bibr" rid="B54">Rossion and Caharel, 2011</xref>). In addition, <xref ref-type="bibr" rid="B61">Schweinberger (2011)</xref> suggested that the P1 component reflects the pictorial encoding of faces, while the N170 component reflects the structural encoding of faces. Pictorial encoding is defined as the early top&#x2013;down attentional processing of faces, while structural encoding precedes the processes involved in the identification of faces. N170 is therefore considered to be related to both domain-specific and domain-general processing of facial information (<xref ref-type="bibr" rid="B18">Eimer, 2011</xref>). However, the ability of P1 or N170 amplitudes to detect alterations in facial perception induced by the application of cosmetics remains unclear. In order to clarify the influence of cosmetic makeup on the neural representation of facial stimuli, the present study used an ERP adaptation paradigm, in which the stimulus presented was preceded by a stimulus of another category (e.g., a face presented within a different format) (<xref ref-type="bibr" rid="B39">Kov&#x00E1;cs et al., 2006</xref>; <xref ref-type="bibr" rid="B19">Eimer et al., 2010</xref>; <xref ref-type="bibr" rid="B73">Zimmer and Kov&#x00E1;cs, 2011</xref>; <xref ref-type="bibr" rid="B8">Caharel et al., 2015</xref>). Since previous studies of cosmetic makeup have utilized comparisons between faces with and without makeup (<xref ref-type="bibr" rid="B25">Graham and Jouhar, 1981</xref>; <xref ref-type="bibr" rid="B36">Jones and Kramer, 2015</xref>), the present study adopted a similar paradigm.</p>
<p>The current literature indicates that the eyes play a central role in facial perception and representation. In particular, an MEG study demonstrated that participants require significantly more time to perceive eyes presented in isolation than eyes presented as a facial component (<xref ref-type="bibr" rid="B71">Watanabe et al., 1999b</xref>). Eye-tracking studies have demonstrated that participants tend to fixate close to or directly on the eyes during facial perception (<xref ref-type="bibr" rid="B33">Janik et al., 1978</xref>; <xref ref-type="bibr" rid="B2">Barton et al., 2006</xref>; <xref ref-type="bibr" rid="B1">Arizpe et al., 2012</xref>), and ERP studies have shown that N170 amplitude is significantly greater in response to eyes presented in isolation than to the whole face, nose, or mouth (<xref ref-type="bibr" rid="B4">Bentin et al., 1996</xref>; <xref ref-type="bibr" rid="B5">Bentin and Deouell, 2000</xref>; <xref ref-type="bibr" rid="B29">Itier et al., 2006</xref>; <xref ref-type="bibr" rid="B28">Itier et al., 2007</xref>; <xref ref-type="bibr" rid="B46">Nemrodov and Itier, 2011</xref>). However, conflicting reports exist with regard to this matter. While several studies have indicated that eyeless faces elicit N170 amplitudes similar to those of normal or inverted faces (<xref ref-type="bibr" rid="B15">Eimer, 1998</xref>; <xref ref-type="bibr" rid="B43">Magnuski and Gola, 2013</xref>), a similar experiment demonstrated greater N170 amplitudes for normal faces relative to eyeless faces (<xref ref-type="bibr" rid="B45">Nemrodov et al., 2014</xref>). Therefore, the relationship between N170 amplitude and the role of the eyes in facial perception remains uncertain.</p>
<p>In addition, <xref ref-type="bibr" rid="B12">daSilva et al. (2016)</xref> recently demonstrated the significance of the mouth in facial processing by presenting mouth images depicting grimaces, smiles, and open mouth expressions to participants. Expressions featuring teeth elicited significantly larger N170 amplitudes compared to expressions without teeth (<xref ref-type="bibr" rid="B12">daSilva et al., 2016</xref>). However, <xref ref-type="bibr" rid="B12">daSilva et al. (2016)</xref> did not examine N170 amplitude in relation to processing of both the eyes and mouth. Accordingly, few studies have evaluated N170 amplitude responsivity to the eyes and mouth in the context of facial processing. <xref ref-type="bibr" rid="B50">Pesciarelli et al. (2016)</xref> reported that processing of the eyes in inverted faces elicited significantly larger N170 amplitudes than in upright faces; however, this was not true for the mouth. In addition, <xref ref-type="bibr" rid="B50">Pesciarelli et al. (2016)</xref> identified significantly larger N170 amplitudes for the mouth relative to the eyes in upright faces.</p>
<p>Therefore, it remains unclear whether the eyes or mouth exert a greater influence on N170 amplitude. As previously mentioned, cosmetic makeup is most frequently applied to the eyes (i.e., eye shadow) and the mouth (i.e., lipstick) (<xref ref-type="bibr" rid="B25">Graham and Jouhar, 1981</xref>; <xref ref-type="bibr" rid="B67">Ueda and Koyama, 2011</xref>; <xref ref-type="bibr" rid="B36">Jones and Kramer, 2015</xref>). <xref ref-type="bibr" rid="B44">Mulhern et al. (2003)</xref> examined the relative contribution of cosmetic application under five cosmetic conditions: no make-up, foundation only, eye make-up only, lip make-up only, and full-facial make-up. Women judged eye make-up as contributing most to attractiveness, while men rated both eye make-up and foundation as having a significant impact on attractiveness in the context of a full-facial makeover. However, lipstick did not appear to contribute to attractiveness independently (<xref ref-type="bibr" rid="B44">Mulhern et al., 2003</xref>). On the other hand, <xref ref-type="bibr" rid="B63">Stephen and McKeegan (2010)</xref> allowed participants to manipulate the color of the lips in color-calibrated face photographs along the red&#x2013;green and blue&#x2013;yellow axes. Participants increased redness contrast to enhance femininity and attractiveness in female faces, but reduced redness contrast to enhance masculinity in male faces (<xref ref-type="bibr" rid="B63">Stephen and McKeegan, 2010</xref>). In order to clarify whether the application of cosmetics to the eyes or mouth elicits a greater effect on N170 amplitude, the present study compared P1 and N170 amplitudes during facial perception under three makeup conditions: <italic>Eye shadow</italic>, <italic>Lipstick</italic>, and <italic>No Makeup</italic>.</p>
<p>The present study aimed to investigate the influence of cosmetic makeup on the perception of facial stimuli via the evaluation of N170 and P1 amplitudes during the recording of ERP. If N170 amplitude is more reflective of the processing of the eyes than the mouth following the application of cosmetics, a significant difference in N170 amplitude would be expected between the <italic>Eye shadow</italic> and <italic>Lipstick/No Makeup</italic> conditions. Alternatively, if N170 amplitude is more reflective of the processing of the mouth than the eyes following the application of cosmetics, a significant difference would be expected between the <italic>Lipstick</italic> and <italic>Eye shadow/No Makeup</italic> conditions. Moreover, because the application of cosmetics to female faces has been observed to increase facial contrast (<xref ref-type="bibr" rid="B58">Russell, 2009</xref>), the present study also aimed to investigate whether application of cosmetics to the eyes or mouth exerts a greater influence on P1 amplitude, which reflects the processing of lower-level visual features (<xref ref-type="bibr" rid="B54">Rossion and Caharel, 2011</xref>).</p>
</sec>
<sec id="s1" sec-type="materials|methods">
<title>Materials and Methods</title>
<sec><title>Participants</title>
<p>Seventeen healthy, right-handed, Japanese participants (5 men; 12 women; aged 18&#x2013;24 years; mean age: 21.3 years) were selected for the present study. All participants exhibited normal or corrected-to-normal vision and had no history of psychiatric or neurological disorders.</p>
<p>All participants provided written informed consent prior to participation in the study, in accordance with the Declaration of Helsinki. The ethics committee of Otemon Gakuin University formally approved this experiment and the recruitment of participants from the Otemon Gakuin University student population.</p>
</sec>
<sec><title>Stimuli</title>
<p>The images selected as visual stimuli included color pictures of the faces of 10 young, adult Japanese women. The stimulus faces were unfamiliar to all participants in the study. The pictures were obtained from various websites<sup><xref ref-type="fn" rid="fn01">1</xref></sup><sup>,</sup><sup><xref ref-type="fn" rid="fn02">2</xref></sup> and included front-facing views of almost identical luminance. All images depicted neutral expressions. In total, 30 visual stimuli were used for the present study (each of the 10 model faces was presented in three conditions: <italic>Lipstick</italic> [wearing red lipstick only], <italic>Eye Shadow</italic> [wearing blue eye shadow only], and <italic>No Makeup</italic> [no makeup applied]). Each image was digitally edited to include cosmetics of the same color (the same red lipstick and the same blue eye shadow) and reconstructed from the original using an iPad application (YouCan Makeup)<sup><xref ref-type="fn" rid="fn03">3</xref></sup> (<bold>Figure <xref ref-type="fig" rid="F1">1A</xref></bold>). The <italic>No Makeup</italic> image was also used as the adapting image for the experiment and was presented prior to each stimulus for comparison. In addition, each image was edited to feature the same hairstyle and hair color (black). All stimuli were airbrushed using Adobe Photoshop 12 to remove any outstanding features or blemishes and were subsequently processed to ensure background consistency. All stimuli were presented in the same orientation, in a front-facing view, on a white background, and were equated for mean luminance (8.3 cd/m<sup>2</sup>) and size using Adobe Photoshop 12 (<bold>Figure <xref ref-type="fig" rid="F1">1A</xref></bold>). Faces occupied a visual angle of 3.4&#x00B0; (horizontal) &#x00D7; 4.0&#x00B0; (vertical). All faces were presented in the center of a 22-inch cathode ray tube monitor (Mitsubishi Diamondtron M2 RDF223G, Chiyoda, Tokyo, Japan) placed 100 cm in front of the participants. The screen resolution was 1280 &#x00D7; 1024, with a refresh rate of 100 Hz.</p>
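As a side note on the stimulus geometry above, the relationship between on-screen size, viewing distance, and the reported 3.4&#x00B0; &#x00D7; 4.0&#x00B0; visual angle can be checked with the standard visual-angle formula. This is an illustrative sketch, not part of the authors' stimulus-preparation pipeline.

```python
import math

def visual_angle_deg(size_cm: float, distance_cm: float) -> float:
    """Visual angle subtended by a stimulus of a given size at a given viewing distance."""
    return math.degrees(2 * math.atan(size_cm / (2 * distance_cm)))

def stimulus_size_cm(angle_deg: float, distance_cm: float) -> float:
    """Inverse: on-screen size needed to subtend a given visual angle."""
    return 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)

# At the 100 cm viewing distance used here, a 3.4 x 4.0 degree face corresponds
# to roughly 5.94 cm x 6.98 cm on screen.
width_cm = stimulus_size_cm(3.4, 100)
height_cm = stimulus_size_cm(4.0, 100)
```

For small angles the linear approximation (size &#x2248; distance &#x00D7; angle in radians) gives nearly the same result, which is why the exact arctangent form matters mainly at short viewing distances.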
<fig id="F1" position="float">
<label>FIGURE 1</label>
<caption><p><bold>(A)</bold> Examples of adapting facial stimuli (<italic>No Makeup</italic>) and three target faces (<italic>Eye shadow, Lipstick, No Makeup</italic>). <bold>(B)</bold> Timeline of the single trial.</p></caption>
<graphic xlink:href="fpsyg-07-01359-g001.tif"/>
</fig>
<p>A previous study (<xref ref-type="bibr" rid="B24">Golby et al., 2001</xref>) reported differential activity in the fusiform region in response to same-race and alternate-race faces. Comparatively, several reports indicate that N170 amplitude is unaffected by the effects of race (<xref ref-type="bibr" rid="B68">Vizioli et al., 2010a</xref>,<xref ref-type="bibr" rid="B69">b</xref>), while additional studies report greater N170 amplitudes in response to other-race facial stimuli relative to own-race stimuli (<xref ref-type="bibr" rid="B62">Stahl et al., 2010</xref>; <xref ref-type="bibr" rid="B72">Wiese et al., 2012</xref>). For this reason, the images used as visual stimuli in the present study were of Japanese women only, and all participants were Japanese. In addition, since N170 amplitude varies depending on the viewpoint of the face (<xref ref-type="bibr" rid="B17">Eimer, 2000b</xref>; <xref ref-type="bibr" rid="B8">Caharel et al., 2015</xref>), all faces were presented in a front-facing view. Furthermore, in several studies, N170 has been shown to be sensitive to emotional expression (<xref ref-type="bibr" rid="B3">Batty and Taylor, 2003</xref>; <xref ref-type="bibr" rid="B14">Eger et al., 2003</xref>; <xref ref-type="bibr" rid="B9">Caharel et al., 2005</xref>; <xref ref-type="bibr" rid="B7">Blau et al., 2007</xref>; <xref ref-type="bibr" rid="B40">Lepp&#x00E4;nen et al., 2007</xref>). Therefore, the faces used for visual stimuli in the present study featured neutral expressions.</p>
</sec>
<sec><title>Procedure</title>
<p>Participants were seated comfortably 100 cm in front of a 22-inch cathode ray tube monitor on which stimuli were presented using a Multi Trigger System (Medical Try System, Kodaira, Tokyo, Japan). Each trial was completed as follows: (I) a fixation mark (+) was presented for 500 ms, followed by an inter-stimulus interval of 1000 ms; (II) an adapting facial stimulus (No Makeup) was presented for 500 ms, followed by an inter-stimulus interval of 1000 ms; (III) a target face stimulus (<italic>Lipstick</italic>, <italic>Eye Shadow</italic>, or <italic>No Makeup</italic>) was presented for 500 ms, followed by an inter-stimulus interval of 500 ms; and (IV) a judgment screen was presented for 1000 ms (<bold>Figure <xref ref-type="fig" rid="F1">1B</xref></bold>). The inter-trial interval varied randomly between 500 and 1500 ms.</p>
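For illustration, the trial timeline above can be written down as a small script. The event names and structure here are hypothetical sketches, not taken from the authors' Multi Trigger System configuration; the durations are those stated in the text.

```python
import random

# Fixed events of one trial (durations in ms), following the timeline in the text:
# fixation -> ISI -> adapting face -> ISI -> target face -> ISI -> judgment screen.
TRIAL_EVENTS = [
    ("fixation", 500), ("ISI", 1000),
    ("adapting_face", 500), ("ISI", 1000),
    ("target_face", 500), ("ISI", 500),
    ("judgment_screen", 1000),
]

def trial_duration_ms(iti=None, rng=random):
    """Total duration of one trial, including the random 500-1500 ms inter-trial interval."""
    iti = rng.randint(500, 1500) if iti is None else iti
    return sum(duration for _, duration in TRIAL_EVENTS) + iti
```

The fixed portion of each trial thus lasts 5000 ms, so a full trial varies between 5500 and 6500 ms depending on the inter-trial interval.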
<p>On the judgment screen, the three target faces were assigned a number: 1 = <italic>Lipstick</italic>; 2 = <italic>Eye Shadow</italic>; 3 = <italic>No Makeup</italic>. Each participant was instructed to compare the adapting face (No Makeup) with a target face and to identify the type of target face as quickly and accurately as possible. Participants were required to respond by pressing one of three buttons that corresponded to 1, 2, and 3 with their right index finger to indicate whether the facial stimulus belonged to the Lipstick, Eye Shadow, or No Makeup condition, respectively. Reaction time was measured using a digital timer accurate to 1 ms, beginning with the onset of stimulus presentation and finishing once participants had responded to the stimulus. Participants performed 10 practice trials, followed by three blocks of 80 trials (240 trials total). The order of conditions was randomized within each block. In total, 30 stimuli were presented in random order, with equal probability. In each trial, the adapting face and target face were obtained from the same person.</p>
</sec>
<sec><title>Recording and Analysis</title>
<p>Electroencephalography (EEG) and electrooculography (EOG) data were acquired using a 128-channel Sensor Net (Electrical Geodesics, Inc., Eugene, OR, USA) and recorded via the standard EGI Net Station 5.2.01 package. EEG and EOG signals were recorded using Ag/AgCl electrodes positioned according to the 10-5 system (<xref ref-type="bibr" rid="B48">Oostenveld and Praamstra, 2001</xref>; <xref ref-type="bibr" rid="B38">Jurcak et al., 2007</xref>), with each electrode referenced to the vertex (Cz) during recording and re-referenced offline to the common average. Vertical and horizontal eye movements were recorded using EOG electrodes placed above, below, and at the outer canthi of both eyes to detect movement artifacts. EEG and EOG were sampled at 500 Hz and band-pass filtered at 0.01&#x2013;30 Hz. Electrode impedance was maintained below 50 k&#x03A9;. For artifact rejection, all trials in which both the vertical and horizontal EOG voltages exceeded 140 &#x03BC;V during the recording epoch were excluded from further analysis.</p>
<p>Stimulus-locked ERPs were derived separately for each of the three target faces (<italic>Eye shadow</italic>, <italic>Lipstick</italic>, and <italic>No Makeup</italic>) from 200 ms before to 1000 ms after stimulus presentation, and were baseline corrected using the 200 ms pre-stimulus window. Based on previous studies (<xref ref-type="bibr" rid="B42">Luo et al., 2013</xref>; <xref ref-type="bibr" rid="B53">Ran et al., 2014</xref>), the P1 component was analyzed via the following four electrode sites: O1/O2 and PO3/PO4. The amplitude of the positive peak of the EEG signal was quantified 50&#x2013;110 ms after stimulus presentation. Similarly, according to previous studies (<xref ref-type="bibr" rid="B53">Ran et al., 2014</xref>; <xref ref-type="bibr" rid="B8">Caharel et al., 2015</xref>), the N170 component was analyzed via the following 12 electrode sites: P5/P6, P7/P8, PO7/PO8, PO9/PO10, POO9h/POO10h, and PPO9h/PPO10h (10-5 system) (<xref ref-type="bibr" rid="B48">Oostenveld and Praamstra, 2001</xref>; <xref ref-type="bibr" rid="B38">Jurcak et al., 2007</xref>). The amplitude of the negative peak of the EEG signal was quantified 120&#x2013;180 ms after stimulus presentation. The mean reaction time and ERP amplitude were then calculated for each participant in response to the three target faces.</p>
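The epoching, baseline correction, and peak quantification described above can be sketched in plain NumPy. This is a minimal illustration of the general procedure under the stated parameters (500 Hz sampling, -200 to 1000 ms epochs, 200 ms baseline, 120&#x2013;180 ms negative-peak window), not the authors' Net Station analysis code.

```python
import numpy as np

FS = 500  # sampling rate (Hz), as in the recording

def epoch_and_baseline(eeg, onsets, fs=FS, tmin=-0.2, tmax=1.0):
    """Cut stimulus-locked epochs and subtract the pre-stimulus baseline.

    eeg: (n_channels, n_samples) array; onsets: stimulus-onset sample indices.
    Returns a (n_trials, n_channels, n_epoch_samples) array."""
    n_pre, n_post = int(-tmin * fs), int(tmax * fs)
    epochs = np.stack([eeg[:, o - n_pre:o + n_post] for o in onsets])
    baseline = epochs[:, :, :n_pre].mean(axis=2, keepdims=True)
    return epochs - baseline

def peak_amplitude(erp, fs=FS, tmin=-0.2, window=(0.12, 0.18), polarity=-1):
    """Peak amplitude of a single-channel averaged ERP within a latency window,
    e.g. the negative peak 120-180 ms after onset for N170."""
    i0 = int((window[0] - tmin) * fs)
    i1 = int((window[1] - tmin) * fs)
    segment = erp[i0:i1]
    return segment.min() if polarity < 0 else segment.max()
```

In practice the epochs would first be averaged per condition and electrode site, and the P1 window (50&#x2013;110 ms, `polarity=+1`) handled the same way.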
</sec>
<sec><title>Statistical Analysis</title>
<p>Reaction time was analyzed using a one-way repeated-measures analysis of variance (ANOVA) with condition (<italic>Eye shadow</italic>, <italic>Lipstick</italic>, <italic>No Makeup</italic>) as the within-subject factor. P1 amplitude was analyzed using a three-way (3 &#x00D7; 2 &#x00D7; 2) repeated-measures ANOVA with the factors condition (<italic>Eye shadow</italic>, <italic>Lipstick</italic>, <italic>No Makeup</italic>), hemisphere (left, right), and electrode placement (O1 vs. PO3, O2 vs. PO4); <italic>post hoc</italic> comparisons were performed using the Bonferroni test. Similarly, N170 amplitude was analyzed using a three-way (3 &#x00D7; 2 &#x00D7; 6) repeated-measures ANOVA with the factors condition (<italic>Eye shadow</italic>, <italic>Lipstick</italic>, <italic>No Makeup</italic>), hemisphere (left, right), and electrode placement (P5 vs. P7 vs. PO7 vs. PO9 vs. POO9h vs. PPO9h, P6 vs. P8 vs. PO8 vs. PO10 vs. POO10h vs. PPO10h); <italic>post hoc</italic> comparisons were performed using the Bonferroni test. Greenhouse&#x2013;Geisser corrections were applied to the <italic>p</italic>-values of repeated-measures comparisons involving more than one degree of freedom.</p>
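The one-way repeated-measures ANOVA described above can be sketched in plain NumPy as a sum-of-squares partition into condition, subject, and residual terms. This is a minimal illustration of the statistic, not the software the author used, and it omits the Greenhouse&#x2013;Geisser correction step.

```python
import numpy as np

def rm_anova_1way(data):
    """One-way repeated-measures ANOVA on a (n_subjects, n_conditions) array.

    Removes between-subject variability from the error term and
    returns (F, df_condition, df_error)."""
    n, k = data.shape
    grand = data.mean()
    ss_cond = n * ((data.mean(axis=0) - grand) ** 2).sum()   # between conditions
    ss_subj = k * ((data.mean(axis=1) - grand) ** 2).sum()   # between subjects
    ss_total = ((data - grand) ** 2).sum()
    ss_error = ss_total - ss_cond - ss_subj                  # condition x subject residual
    df_cond, df_error = k - 1, (n - 1) * (k - 1)
    F = (ss_cond / df_cond) / (ss_error / df_error)
    return F, df_cond, df_error
```

With 17 participants and three conditions this yields the df = (2, 32) reported for the reaction-time and ERP analyses; a <italic>p</italic>-value would then be obtained from the F distribution, with sphericity-adjusted degrees of freedom where appropriate.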
</sec>
</sec>
<sec><title>Results</title>
<sec><title>Effect of Condition on Reaction Time</title>
<p>No main effect was detected for <italic>condition</italic> on reaction time: <italic>Eye shadow</italic>, 408 &#x00B1; 81 ms (mean &#x00B1; SD); <italic>Lipstick</italic>, 412 &#x00B1; 98 ms; <italic>No Makeup</italic>, 391 &#x00B1; 73 ms [<italic>F</italic>(2,32) = 1.70, <italic>p</italic> = 0.20].</p>
</sec>
<sec><title>Effect of Condition on P1 Amplitude</title>
<p><bold>Figure <xref ref-type="fig" rid="F2">2</xref></bold> displays the grand-averaged EEG waveforms for all conditions (<italic>Eye shadow</italic>, <italic>Lipstick</italic>, <italic>No Makeup</italic>) at two electrode sites (O1/O2). For each condition, an enhanced positive ERP was identified 50&#x2013;110 ms after exposure to the target face. This positive ERP was identified as P1. <bold>Table <xref ref-type="table" rid="T1">1</xref></bold> displays the mean P1 amplitude for all conditions at four electrode sites (O1/O2 and PO3/PO4). No significant main effect was detected with regard to <italic>condition</italic> [<italic>F</italic>(2,32) = 0.49, <italic>p</italic> = 0.77, <inline-formula><mml:math id="M1"><mml:mrow><mml:msubsup><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>&#x03b7;</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>p</mml:mi></mml:mrow><mml:mrow><mml:mn mathcolor='black' mathvariant='normal'>2</mml:mn></mml:mrow></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.02], or <italic>hemisphere</italic> on P1 amplitude [<italic>F</italic>(1,16) = 0.42, <italic>p</italic> = 0.53, <inline-formula><mml:math id="M2"><mml:mrow><mml:msubsup><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>&#x03b7;</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>p</mml:mi></mml:mrow><mml:mrow><mml:mn mathcolor='black' mathvariant='normal'>2</mml:mn></mml:mrow></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.03].
However, <italic>electrode placement</italic> produced a significant effect on P1 amplitude [<italic>F</italic>(1,16) = 11.95, <italic>p</italic> = 0.003, <inline-formula><mml:math id="M3"><mml:mrow><mml:msubsup><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>&#x03b7;</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>p</mml:mi></mml:mrow><mml:mrow><mml:mn mathcolor='black' mathvariant='normal'>2</mml:mn></mml:mrow></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.43], with a greater P1 amplitude at the O1 and O2 sites than the PO3 and PO4 sites (<italic>p</italic> &#x003C; 0.05). No significant interactions were observed between the variables (all <italic>p</italic> > 0.05).</p>
<fig id="F2" position="float">
<label>FIGURE 2</label>
<caption><p><bold>Stimulus-locked average ERP waveforms at O1 and O2 for each target face: <italic>Eye shadow</italic>, <italic>Lipstick</italic>, and <italic>No Makeup</italic></bold>.</p></caption>
<graphic xlink:href="fpsyg-07-01359-g002.tif"/>
</fig>
<table-wrap position="float" id="T1">
<label>TABLE 1</label>
<caption><p>Mean P1 amplitude (&#x03BC;V) for all conditions at four electrode sites.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<th valign="top" align="left">Electrode</th>
<th valign="top" align="center" colspan="2">Eye shadow<hr/></th>
<th valign="top" align="center" colspan="2">Lipstick<hr/></th>
<th valign="top" align="center" colspan="2">No Makeup<hr/></th>
</tr>
<tr>
<td valign="top" align="left"></td>
<th valign="top" align="center">Mean</th>
<th valign="top" align="center"><italic>SD</italic></th>
<th valign="top" align="center">Mean</th>
<th valign="top" align="center"><italic>SD</italic></th>
<th valign="top" align="center">Mean</th>
<th valign="top" align="center"><italic>SD</italic></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left"><bold>Left hemisphere</bold></td>
<td valign="top" align="center" colspan="6"></td>
</tr>
<tr>
<td valign="top" align="left">PO3</td>
<td valign="top" align="center">1.53</td>
<td valign="top" align="center">1.06</td>
<td valign="top" align="center">1.22</td>
<td valign="top" align="center">0.95</td>
<td valign="top" align="center">1.22</td>
<td valign="top" align="center">1.07</td></tr>
<tr>
<td valign="top" align="left">O1</td>
<td valign="top" align="center">2.17</td>
<td valign="top" align="center">1.56</td>
<td valign="top" align="center">2.15</td>
<td valign="top" align="center">1.88</td>
<td valign="top" align="center">2.21</td>
<td valign="top" align="center">1.98</td>
</tr>
<tr>
<td valign="top" align="left"><bold>Right hemisphere</bold></td>
<td valign="top" align="center" colspan="6"></td>
</tr>
<tr>
<td valign="top" align="left">PO4</td>
<td valign="top" align="center">1.45</td>
<td valign="top" align="center">1.69</td>
<td valign="top" align="center">1.34</td>
<td valign="top" align="center">1.26</td>
<td valign="top" align="center">1.51</td>
<td valign="top" align="center">1.44</td></tr>
<tr>
<td valign="top" align="left">O2</td>
<td valign="top" align="center">2.40</td>
<td valign="top" align="center">1.33</td>
<td valign="top" align="center">2.43</td>
<td valign="top" align="center">1.26</td>
<td valign="top" align="center">2.69</td>
<td valign="top" align="center">1.67</td></tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec><title>Effect of Condition on N170 Amplitude</title>
<p><bold>Figure <xref ref-type="fig" rid="F3">3</xref></bold> displays the grand-averaged ERP waveforms for all conditions (<italic>Eye shadow</italic>, <italic>Lipstick</italic>, <italic>No Makeup</italic>) at four electrode sites (P7/P8 and PO7/PO8). In each condition, an enhanced negative ERP component, identified as N170, was observed 120&#x2013;180 ms after exposure to the target face. <bold>Table <xref ref-type="table" rid="T2">2</xref></bold> displays the mean N170 amplitude for all conditions at 12 electrode sites (P5/P6, P7/P8, PO7/PO8, PO9/PO10, POO9h/POO10h, and PPO9h/PPO10h). Significant main effects on N170 amplitude were detected for <italic>condition</italic> [<italic>F</italic>(2,32) = 3.39, <italic>p</italic> = 0.05, <inline-formula><mml:math id="M4"><mml:mrow><mml:msubsup><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>&#x03b7;</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>p</mml:mi></mml:mrow><mml:mrow><mml:mn mathcolor='black' mathvariant='normal'>2</mml:mn></mml:mrow></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.18] and <italic>electrode placement</italic> [<italic>F</italic>(5,80) = 7.07, <italic>p</italic> = 0.002, <inline-formula><mml:math id="M5"><mml:mrow><mml:msubsup><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>&#x03b7;</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>p</mml:mi></mml:mrow><mml:mrow><mml:mn mathcolor='black' mathvariant='normal'>2</mml:mn></mml:mrow></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.31]. The N170 amplitude for the <italic>Lipstick</italic> condition was significantly greater than that for the <italic>No Makeup</italic> condition (<italic>p</italic> &#x003C; 0.05). 
No significant main effect of <italic>hemisphere</italic> on N170 amplitude was detected [<italic>F</italic>(1,16) = 0.29, <italic>p</italic> = 0.60, <inline-formula><mml:math id="M6"><mml:mrow><mml:msubsup><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>&#x03b7;</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>p</mml:mi></mml:mrow><mml:mrow><mml:mn mathcolor='black' mathvariant='normal'>2</mml:mn></mml:mrow></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.02]. In addition, no significant two-way interactions were detected (<italic>condition</italic> &#x00D7; <italic>hemisphere</italic>, <italic>condition</italic> &#x00D7; <italic>electrode</italic>, and <italic>hemisphere</italic> &#x00D7; <italic>electrode</italic>; all <italic>p</italic> > 0.05); however, a significant three-way interaction was detected (<italic>condition</italic> &#x00D7; <italic>hemisphere</italic> &#x00D7; <italic>electrode</italic>) [<italic>F</italic>(10,160) = 2.88, <italic>p</italic> = 0.04, <inline-formula><mml:math id="M7"><mml:mrow><mml:msubsup><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>&#x03b7;</mml:mi></mml:mrow><mml:mrow><mml:mi mathcolor='black' mathvariant='normal'>p</mml:mi></mml:mrow><mml:mrow><mml:mn mathcolor='black' mathvariant='normal'>2</mml:mn></mml:mrow></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.15]. Simple effect analyses indicated that the N170 amplitude for the <italic>Lipstick</italic> condition was significantly greater than that for the <italic>No Makeup</italic> condition at PO7 in the left hemisphere and at PO10 in the right hemisphere (<italic>p</italic> &#x003C; 0.05). 
Furthermore, simple effect analyses demonstrated that, for all conditions (<italic>Eye shadow</italic>, <italic>Lipstick</italic>, <italic>No Makeup</italic>), N170 amplitude in the right hemisphere was significantly greater at the P8 and PPO10h sites than at the PO8 site (<italic>p</italic> &#x003C; 0.05). Moreover, in the left hemisphere during the <italic>No Makeup</italic> condition, N170 amplitude was significantly greater at P5 than at P7 (<italic>p</italic> &#x003C; 0.05).</p>
<fig id="F3" position="float">
<label>FIGURE 3</label>
<caption><p><bold>Stimulus-locked average ERP waveforms at P7, P8, PO7, and PO8 for each target face: <italic>Eye shadow</italic>, <italic>Lipstick</italic>, and <italic>No Makeup</italic></bold>.</p></caption>
<graphic xlink:href="fpsyg-07-01359-g003.tif"/>
</fig>
<table-wrap position="float" id="T2">
<label>TABLE 2</label>
<caption><p>Mean N170 amplitude (&#x03BC;V) for all conditions at twelve electrode sites.</p></caption>
<table cellspacing="5" cellpadding="5" frame="hsides" rules="groups">
<thead>
<tr>
<th valign="top" align="left">Electrode</th>
<th valign="top" align="center" colspan="2">Eye shadow<hr/></th>
<th valign="top" align="center" colspan="2">Lipstick<hr/></th>
<th valign="top" align="center" colspan="2">No Makeup<hr/></th>
</tr>
<tr>
<td valign="top" align="left"></td>
<th valign="top" align="center">Mean</th>
<th valign="top" align="center"><italic>SD</italic></th>
<th valign="top" align="center">Mean</th>
<th valign="top" align="center"><italic>SD</italic></th>
<th valign="top" align="center">Mean</th>
<th valign="top" align="center"><italic>SD</italic></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left"><bold>Left hemisphere</bold></td>
<td valign="top" align="center" colspan="6"></td>
</tr>
<tr>
<td valign="top" align="left">P5</td>
<td valign="top" align="center">0.30</td>
<td valign="top" align="center">2.13</td>
<td valign="top" align="center">0.34</td>
<td valign="top" align="center">1.73</td>
<td valign="top" align="center">0.33</td>
<td valign="top" align="center">1.57</td></tr>
<tr>
<td valign="top" align="left">P7</td>
<td valign="top" align="center">1.04</td>
<td valign="top" align="center">3.56</td>
<td valign="top" align="center">1.48</td>
<td valign="top" align="center">2.60</td>
<td valign="top" align="center">1.87</td>
<td valign="top" align="center">2.53</td>
</tr>
<tr>
<td valign="top" align="left">PO7</td>
<td valign="top" align="center">2.52</td>
<td valign="top" align="center">4.32</td>
<td valign="top" align="center">2.17</td>
<td valign="top" align="center">3.54</td>
<td valign="top" align="center">3.14</td>
<td valign="top" align="center">4.14</td></tr>
<tr>
<td valign="top" align="left">PO9</td>
<td valign="top" align="center">0.73</td>
<td valign="top" align="center">2.68</td>
<td valign="top" align="center">0.66</td>
<td valign="top" align="center">2.33</td>
<td valign="top" align="center">1.11</td>
<td valign="top" align="center">2.24</td>
</tr>
<tr>
<td valign="top" align="left">POO9h</td>
<td valign="top" align="center">1.78</td>
<td valign="top" align="center">3.47</td>
<td valign="top" align="center">1.81</td>
<td valign="top" align="center">3.03</td>
<td valign="top" align="center">2.07</td>
<td valign="top" align="center">3.41</td></tr>
<tr>
<td valign="top" align="left">PPO9h</td>
<td valign="top" align="center">1.76</td>
<td valign="top" align="center">3.68</td>
<td valign="top" align="center">1.10</td>
<td valign="top" align="center">2.89</td>
<td valign="top" align="center">1.51</td>
<td valign="top" align="center">2.85</td>
</tr>
<tr>
<td valign="top" align="left"><bold>Right hemisphere</bold></td>
<td valign="top" align="center" colspan="6"></td>
</tr>
<tr>
<td valign="top" align="left">P6</td>
<td valign="top" align="center">1.50</td>
<td valign="top" align="center">2.83</td>
<td valign="top" align="center">1.04</td>
<td valign="top" align="center">2.84</td>
<td valign="top" align="center">1.50</td>
<td valign="top" align="center">3.20</td></tr>
<tr>
<td valign="top" align="left">P8</td>
<td valign="top" align="center">1.24</td>
<td valign="top" align="center">4.23</td>
<td valign="top" align="center">1.85</td>
<td valign="top" align="center">3.95</td>
<td valign="top" align="center">2.32</td>
<td valign="top" align="center">4.31</td>
</tr>
<tr>
<td valign="top" align="left">PO8</td>
<td valign="top" align="center">3.06</td>
<td valign="top" align="center">4.30</td>
<td valign="top" align="center">2.74</td>
<td valign="top" align="center">3.71</td>
<td valign="top" align="center">3.52</td>
<td valign="top" align="center">4.79</td></tr>
<tr>
<td valign="top" align="left">PO10</td>
<td valign="top" align="center">1.04</td>
<td valign="top" align="center">3.13</td>
<td valign="top" align="center">0.77</td>
<td valign="top" align="center">2.61</td>
<td valign="top" align="center">1.66</td>
<td valign="top" align="center">3.35</td>
</tr>
<tr>
<td valign="top" align="left">POO10h</td>
<td valign="top" align="center">1.84</td>
<td valign="top" align="center">3.60</td>
<td valign="top" align="center">1.90</td>
<td valign="top" align="center">3.43</td>
<td valign="top" align="center">2.86</td>
<td valign="top" align="center">4.72</td></tr>
<tr>
<td valign="top" align="left">PPO10h</td>
<td valign="top" align="center">1.16</td>
<td valign="top" align="center">3.89</td>
<td valign="top" align="center">0.98</td>
<td valign="top" align="center">3.75</td>
<td valign="top" align="center">1.55</td>
<td valign="top" align="center">4.27</td></tr>
</tbody>
</table>
</table-wrap>
</sec>
</sec>
<sec><title>Discussion</title>
<p>The present study aimed to investigate changes in P1 and N170 amplitude in response to the application of cosmetics during a facial perception task. This experiment adopted an ERP adaptation paradigm, in which participants were required to compare a model face with makeup (<italic>Eye Shadow/Lipstick</italic>) to a model face without makeup (<italic>No Makeup</italic>). P1 and N170 amplitudes were then analyzed from EEG recordings (with EOG monitoring) during a facial perception task with three target faces (<italic>Eye shadow</italic>, <italic>Lipstick</italic>, and <italic>No Makeup</italic>), wherein the adapting face presented prior to each target face was always a <italic>No Makeup</italic> face. The results of the present study demonstrated that N170 amplitudes were significantly greater in response to the <italic>Lipstick</italic> condition than to the <italic>No Makeup</italic> condition, while no significant effect of cosmetic makeup was detected on P1 amplitude.</p>
<p>Previously, <xref ref-type="bibr" rid="B54">Rossion and Caharel (2011)</xref> reported that P1 and N170 amplitudes demonstrated functional dissociation with regard to facial sensitivity, wherein P1 was driven by low-level visual features, while N170 reflected facial perception. The present results support these findings, providing direct evidence that the application of facial cosmetics does not influence the perception of low-level visual features, but instead affects overall facial perception. Although the results of previous studies indicated that application of cosmetics to female faces increased facial contrast (<xref ref-type="bibr" rid="B58">Russell, 2009</xref>), no significant effect of cosmetic makeup on P1 amplitude was detected in the present study. As the present study utilized an ERP adaptation paradigm (<xref ref-type="bibr" rid="B39">Kov&#x00E1;cs et al., 2006</xref>; <xref ref-type="bibr" rid="B19">Eimer et al., 2010</xref>; <xref ref-type="bibr" rid="B73">Zimmer and Kov&#x00E1;cs, 2011</xref>; <xref ref-type="bibr" rid="B8">Caharel et al., 2015</xref>), attention was directed toward detecting changes between the faces with and without makeup. Therefore, because attention was drawn away from low-level visual processing of the faces, no differences in P1 amplitude were observed across conditions. On the other hand, because attention was allocated to the perception and structural encoding of the face, the N170 amplitude was significantly greater in the <italic>Lipstick</italic> than in the <italic>No Makeup</italic> condition.</p>
<p>Such findings provide support for the notion that the application of cosmetics significantly influences facial perception (<xref ref-type="bibr" rid="B25">Graham and Jouhar, 1981</xref>; <xref ref-type="bibr" rid="B67">Ueda and Koyama, 2011</xref>; <xref ref-type="bibr" rid="B36">Jones and Kramer, 2015</xref>). In the present study, N170 amplitudes were significantly greater in response to the <italic>Lipstick</italic> condition than to the <italic>No Makeup</italic> condition, though they did not significantly differ between the <italic>No Makeup</italic> and <italic>Eye Shadow</italic> conditions. This pattern supports the hypothesis that N170 amplitude reflects the processing of specific facial features, wherein N170 better represents processing of the mouth than the eyes. Therefore, the results of the present study indicate that the application of cosmetic makeup influences mouth-based processing, as reflected by changes in N170 amplitude. These findings are consistent with those of a previous study (<xref ref-type="bibr" rid="B50">Pesciarelli et al., 2016</xref>), wherein the application of cosmetic makeup (<italic>Lipstick</italic>) drew attention to the mouth during facial perception.</p>
<p>However, because different amounts of makeup affect attractiveness (<xref ref-type="bibr" rid="B35">Jones et al., 2014</xref>), it is possible that the red lipstick was more vivid than the blue eye shadow in the present study. In addition, because longer viewing durations affect judgments of faces with different amounts of makeup in varied ways (<xref ref-type="bibr" rid="B21">Etcoff et al., 2011</xref>), the red lipstick may have simply been more eye-catching under the relatively short duration utilized in the present study. To clarify these issues, future studies should manipulate the amount of makeup and duration of stimulus presentation.</p>
<p>In addition, <xref ref-type="bibr" rid="B37">Jones et al. (2015)</xref> revealed that a typical application of cosmetics increases the luminance contrast of the eyes to a much greater extent than the redness contrast of the mouth. The present results, however, contradicted these findings. As previously mentioned, because the present study did not involve manipulation of the amount of makeup or the duration of stimulus presentation, further research is required in order to examine the relative influence of various amounts of makeup on luminance contrast and ERP components. Furthermore, a previous eye-tracking study (<xref ref-type="bibr" rid="B6">Blais et al., 2008</xref>) revealed that Western Caucasian observers fixate more on the eye region, while East Asian observers fixate more on the central region of the face. It is therefore possible that participants from a primarily Caucasian culture would not exhibit the same pattern of results as the Japanese participants tested in the present study. Some previous studies indicate that N170 amplitude is unaffected by race (<xref ref-type="bibr" rid="B68">Vizioli et al., 2010a</xref>,<xref ref-type="bibr" rid="B69">b</xref>), while additional studies report greater N170 amplitudes in response to other-race facial stimuli relative to own-race stimuli (<xref ref-type="bibr" rid="B62">Stahl et al., 2010</xref>; <xref ref-type="bibr" rid="B72">Wiese et al., 2012</xref>). In order to clarify these issues, future studies should compare N170 amplitude for observers of several races when both own-race and other-race stimuli are presented.</p>
<p>N170 amplitude is typically greater in response to inverted faces (faces presented upside down) than to upright faces (<xref ref-type="bibr" rid="B4">Bentin et al., 1996</xref>; <xref ref-type="bibr" rid="B55">Rossion et al., 1999</xref>; <xref ref-type="bibr" rid="B30">Itier and Taylor, 2002</xref>; <xref ref-type="bibr" rid="B45">Nemrodov et al., 2014</xref>; <xref ref-type="bibr" rid="B50">Pesciarelli et al., 2016</xref>). However, the N170 face inversion effect was strongly attenuated in eyeless faces when fixation was on the eyes, but was normal when fixation was on the mouth (<xref ref-type="bibr" rid="B45">Nemrodov et al., 2014</xref>). Moreover, <xref ref-type="bibr" rid="B50">Pesciarelli et al. (2016)</xref> reported that processing of the eyes in inverted faces elicited significantly larger N170 amplitudes compared to upright faces, though this effect was not observed for processing of the mouth. In addition, <xref ref-type="bibr" rid="B50">Pesciarelli et al. (2016)</xref> reported that processing of the mouth elicited significantly larger N170 amplitudes compared to processing of the eyes, but only in upright faces. Further research is needed to examine the relationship between the N170 face inversion effect and the application of cosmetic makeup, with particular focus on the eye and mouth regions.</p>
<p>While N170 amplitudes did not significantly differ between the <italic>No Makeup</italic> and <italic>Eye Shadow</italic> conditions in the present study, the eyes nonetheless play a significant role in facial perception (<xref ref-type="bibr" rid="B33">Janik et al., 1978</xref>; <xref ref-type="bibr" rid="B4">Bentin et al., 1996</xref>; <xref ref-type="bibr" rid="B5">Bentin and Deouell, 2000</xref>; <xref ref-type="bibr" rid="B2">Barton et al., 2006</xref>; <xref ref-type="bibr" rid="B29">Itier et al., 2006</xref>, <xref ref-type="bibr" rid="B28">2007</xref>; <xref ref-type="bibr" rid="B46">Nemrodov and Itier, 2011</xref>; <xref ref-type="bibr" rid="B1">Arizpe et al., 2012</xref>; <xref ref-type="bibr" rid="B45">Nemrodov et al., 2014</xref>). Accordingly, <xref ref-type="bibr" rid="B26">Haxby et al. (2000)</xref> reported that the superior temporal sulcus processes individual facial features, including the changeable aspects of faces and the perception of eye gaze. Moreover, <xref ref-type="bibr" rid="B10">Cecchini et al. (2013)</xref> demonstrated that the left middle temporal gyrus (BA21) exhibits greater activation in response to eyes in an intact face than to eyes in a scrambled face. The results of the present study, in conjunction with the aforementioned findings, indicate that the role of the eyes in facial perception should be investigated with regard to other ERP components or using additional neuroimaging techniques.</p>
</sec>
<sec><title>Conclusion</title>
<p>The present study found that N170 amplitude was significantly greater in response to the application of cosmetic makeup to the mouth (<italic>Lipstick</italic>) than in the <italic>No Makeup</italic> condition, whereas no significant main effect of condition was identified for P1 amplitude. Therefore, the present results support the notion that the application of cosmetic makeup alters facial perception rather than low-level visual processing, influencing the processing of specific facial features, particularly the mouth, as reflected by changes in N170 amplitude.</p>
</sec>
<sec><title>Author Contributions</title>
<p>HT designed and performed the experiments, recorded the EEG data, performed the EEG and statistical analyses, and wrote the manuscript.</p>
</sec>
<sec><title>Conflict of Interest Statement</title>
<p>The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Arizpe</surname> <given-names>J.</given-names></name> <name><surname>Kravitz</surname> <given-names>D. J.</given-names></name> <name><surname>Yovel</surname> <given-names>G.</given-names></name> <name><surname>Baker</surname> <given-names>C. I.</given-names></name></person-group> (<year>2012</year>). <article-title>Start position strongly influences fixation patterns during face processing: difficulties with eye movements as a measure of information use.</article-title> <source><italic>PLoS ONE</italic></source> <volume>7</volume>:<issue>e31106</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0031106</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Barton</surname> <given-names>J. J.</given-names></name> <name><surname>Radcliffe</surname> <given-names>N.</given-names></name> <name><surname>Cherkasova</surname> <given-names>M. V.</given-names></name> <name><surname>Edelman</surname> <given-names>J.</given-names></name> <name><surname>Intriligator</surname> <given-names>J. M.</given-names></name></person-group> (<year>2006</year>). <article-title>Information processing during face recognition: the effects of familiarity, inversion, and morphing on scanning fixations.</article-title> <source><italic>Perception</italic></source> <volume>35</volume> <fpage>1089</fpage>&#x2013;<lpage>1105</lpage>. <pub-id pub-id-type="doi">10.1068/p5547</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Batty</surname> <given-names>M.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2003</year>). <article-title>Early processing of the six basic facial emotional expressions.</article-title> <source><italic>Cogn. Brain Res.</italic></source> <volume>17</volume> <fpage>613</fpage>&#x2013;<lpage>620</lpage>. <pub-id pub-id-type="doi">10.1016/S0926-6410(03)00174-5</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bentin</surname> <given-names>S.</given-names></name> <name><surname>Allison</surname> <given-names>T.</given-names></name> <name><surname>Puce</surname> <given-names>A.</given-names></name> <name><surname>Perez</surname> <given-names>E.</given-names></name> <name><surname>McCarthy</surname> <given-names>G.</given-names></name></person-group> (<year>1996</year>). <article-title>Electrophysiological studies of face perception in humans.</article-title> <source><italic>J. Cogn. Neurosci.</italic></source> <volume>8</volume> <fpage>551</fpage>&#x2013;<lpage>565</lpage>. <pub-id pub-id-type="doi">10.1162/jocn.1996.8.6.551</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bentin</surname> <given-names>S.</given-names></name> <name><surname>Deouell</surname> <given-names>L.</given-names></name></person-group> (<year>2000</year>). <article-title>Structural encoding and identification in face processing: ERP evidence for separate mechanisms.</article-title> <source><italic>Cogn. Neuropsychol.</italic></source> <volume>17</volume> <fpage>35</fpage>&#x2013;<lpage>55</lpage>. <pub-id pub-id-type="doi">10.1080/026432900380472</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Blais</surname> <given-names>C.</given-names></name> <name><surname>Jack</surname> <given-names>R. E.</given-names></name> <name><surname>Scheepers</surname> <given-names>C.</given-names></name> <name><surname>Fiset</surname> <given-names>D.</given-names></name> <name><surname>Caldara</surname> <given-names>R.</given-names></name></person-group> (<year>2008</year>). <article-title>Culture shapes how we look at faces.</article-title> <source><italic>PLoS ONE</italic></source> <volume>3</volume>:<issue>e3022</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0003022</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Blau</surname> <given-names>V. C.</given-names></name> <name><surname>Maurer</surname> <given-names>U.</given-names></name> <name><surname>Tottenham</surname> <given-names>N.</given-names></name> <name><surname>McCandliss</surname> <given-names>B. D.</given-names></name></person-group> (<year>2007</year>). <article-title>The face-specific N170 component is modulated by emotional facial expression.</article-title> <source><italic>Behav. Brain Funct.</italic></source> <volume>3</volume> <issue>7</issue>. <pub-id pub-id-type="doi">10.1186/1744-9081-3-7</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Caharel</surname> <given-names>S.</given-names></name> <name><surname>Collet</surname> <given-names>K.</given-names></name> <name><surname>Rossion</surname> <given-names>B.</given-names></name></person-group> (<year>2015</year>). <article-title>The early visual encoding of a face (N170) is viewpoint-dependent: a parametric ERP-adaptation study.</article-title> <source><italic>Biol. Psychol.</italic></source> <volume>106</volume> <fpage>18</fpage>&#x2013;<lpage>27</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsycho.2015.01.010</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Caharel</surname> <given-names>S.</given-names></name> <name><surname>Courtay</surname> <given-names>N.</given-names></name> <name><surname>Bernard</surname> <given-names>C.</given-names></name> <name><surname>Lalonde</surname> <given-names>R.</given-names></name> <name><surname>Rebai</surname> <given-names>M.</given-names></name></person-group> (<year>2005</year>). <article-title>Familiarity and emotional expression influence an early stage of face processing: an electrophysiological study.</article-title> <source><italic>Brain Cogn.</italic></source> <volume>59</volume> <fpage>96</fpage>&#x2013;<lpage>100</lpage>. <pub-id pub-id-type="doi">10.1016/j.bandc.2005.05.005</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cecchini</surname> <given-names>M.</given-names></name> <name><surname>Aceto</surname> <given-names>P.</given-names></name> <name><surname>Altavilla</surname> <given-names>D.</given-names></name> <name><surname>Palumbo</surname> <given-names>L.</given-names></name> <name><surname>Lai</surname> <given-names>C.</given-names></name></person-group> (<year>2013</year>). <article-title>The role of the eyes in processing an intact face and its scrambled image: a dense array ERP and low-resolution electromagnetic tomography (sLORETA) study.</article-title> <source><italic>Soc. Neurosci.</italic></source> <volume>8</volume> <fpage>314</fpage>&#x2013;<lpage>325</lpage>. <pub-id pub-id-type="doi">10.1080/17470919.2013.797020</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>J.</given-names></name> <name><surname>Liu</surname> <given-names>B.</given-names></name> <name><surname>Chen</surname> <given-names>B.</given-names></name> <name><surname>Fang</surname> <given-names>F.</given-names></name></person-group> (<year>2009</year>). <article-title>Time course of amodal completion in face perception.</article-title> <source><italic>Vision Res.</italic></source> <volume>49</volume> <fpage>752</fpage>&#x2013;<lpage>758</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2009.02.005</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>daSilva</surname> <given-names>E. B.</given-names></name> <name><surname>Crager</surname> <given-names>K.</given-names></name> <name><surname>Geisler</surname> <given-names>D.</given-names></name> <name><surname>Newbern</surname> <given-names>P.</given-names></name> <name><surname>Orem</surname> <given-names>B.</given-names></name> <name><surname>Puce</surname> <given-names>A.</given-names></name></person-group> (<year>2016</year>). <article-title>Something to sink your teeth into: the presence of teeth augments ERPs to mouth expressions.</article-title> <source><italic>Neuroimage</italic></source> <volume>127</volume> <fpage>227</fpage>&#x2013;<lpage>241</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2015.12.020</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dering</surname> <given-names>B.</given-names></name> <name><surname>Martin</surname> <given-names>C. D.</given-names></name> <name><surname>Moro</surname> <given-names>S.</given-names></name> <name><surname>Pegna</surname> <given-names>A. J.</given-names></name> <name><surname>Thierry</surname> <given-names>G.</given-names></name></person-group> (<year>2011</year>). <article-title>Face-sensitive processes one hundred milliseconds after picture onset.</article-title> <source><italic>Front. Hum. Neurosci.</italic></source> <volume>5</volume>:<issue>93</issue>. <pub-id pub-id-type="doi">10.3389/fnhum.2011.00093</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eger</surname> <given-names>E.</given-names></name> <name><surname>Jedynak</surname> <given-names>A.</given-names></name> <name><surname>Iwaki</surname> <given-names>T.</given-names></name> <name><surname>Skrandies</surname> <given-names>W.</given-names></name></person-group> (<year>2003</year>). <article-title>Rapid extraction of emotional expression: evidence from evoked potential fields during brief presentation of face stimuli.</article-title> <source><italic>Neuropsychologia</italic></source> <volume>41</volume> <fpage>808</fpage>&#x2013;<lpage>817</lpage>. <pub-id pub-id-type="doi">10.1016/S0028-3932(02)00287-7</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name></person-group> (<year>1998</year>). <article-title>Does the face-specific N170 component reflect the activity of a specialized eye processor?</article-title> <source><italic>Neuroreport</italic></source> <volume>9</volume> <fpage>2945</fpage>&#x2013;<lpage>2948</lpage>. <pub-id pub-id-type="doi">10.1097/00001756-199809140-00005</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name></person-group> (<year>2000a</year>). <article-title>Effects of face inversion on the structural encoding and recognition of faces: evidence from event-related brain potentials.</article-title> <source><italic>Cogn. Brain Res.</italic></source> <volume>10</volume> <fpage>145</fpage>&#x2013;<lpage>158</lpage>. <pub-id pub-id-type="doi">10.1016/S0926-6410(00)00038-0</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name></person-group> (<year>2000b</year>). <article-title>The face-specific N170 component reflects late stages in the structural encoding of faces.</article-title> <source><italic>Neuroreport</italic></source> <volume>11</volume> <fpage>2319</fpage>&#x2013;<lpage>2324</lpage>. <pub-id pub-id-type="doi">10.1097/00001756-200007140-00050</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name></person-group> (<year>2011</year>). &#x201C;<article-title>The face-sensitive N170 component of the event-related brain potential</article-title>,&#x201D; in <source><italic>The Oxford Handbook of Face Perception</italic>,</source> <role>eds</role> <person-group person-group-type="editor"><name><surname>Calder</surname> <given-names>A.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Johnson</surname> <given-names>M. H.</given-names></name> <name><surname>Haxby</surname> <given-names>J. V.</given-names></name></person-group> (<publisher-loc>Oxford</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>), <fpage>329</fpage>&#x2013;<lpage>344</lpage>.</citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name> <name><surname>Kiss</surname> <given-names>M.</given-names></name> <name><surname>Nicholas</surname> <given-names>S.</given-names></name></person-group> (<year>2010</year>). <article-title>Response profile of the face-sensitive N170 component: a rapid adaptation study.</article-title> <source><italic>Cereb. Cortex</italic></source> <volume>20</volume> <fpage>2442</fpage>&#x2013;<lpage>2452</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhp312</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name> <name><surname>McCarthy</surname> <given-names>R. A.</given-names></name></person-group> (<year>1999</year>). <article-title>Prosopagnosia and structural encoding of faces: evidence from event-related potentials.</article-title> <source><italic>Neuroreport</italic></source> <volume>10</volume> <fpage>255</fpage>&#x2013;<lpage>259</lpage>. <pub-id pub-id-type="doi">10.1097/00001756-199902050-00010</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Etcoff</surname> <given-names>N. L.</given-names></name> <name><surname>Stock</surname> <given-names>S.</given-names></name> <name><surname>Haley</surname> <given-names>L. E.</given-names></name> <name><surname>Vickery</surname> <given-names>S. A.</given-names></name> <name><surname>House</surname> <given-names>D. M.</given-names></name></person-group> (<year>2011</year>). <article-title>Cosmetics as a feature of the extended human phenotype: modulation of the perception of biologically important facial signals.</article-title> <source><italic>PLoS ONE</italic></source> <volume>6</volume>:<issue>e25656</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0025656</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>George</surname> <given-names>N.</given-names></name> <name><surname>Evans</surname> <given-names>J.</given-names></name> <name><surname>Fiori</surname> <given-names>N.</given-names></name> <name><surname>Davidoff</surname> <given-names>J.</given-names></name> <name><surname>Renault</surname> <given-names>B.</given-names></name></person-group> (<year>1996</year>). <article-title>Brain events related to normal and moderately scrambled faces.</article-title> <source><italic>Cogn. Brain Res.</italic></source> <volume>4</volume> <fpage>65</fpage>&#x2013;<lpage>76</lpage>. <pub-id pub-id-type="doi">10.1016/0926-6410(95)00045-3</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Goffaux</surname> <given-names>V.</given-names></name> <name><surname>Gauthier</surname> <given-names>I.</given-names></name> <name><surname>Rossion</surname> <given-names>B.</given-names></name></person-group> (<year>2003</year>). <article-title>Spatial scale contribution to early visual differences between face and object processing.</article-title> <source><italic>Cogn. Brain Res.</italic></source> <volume>16</volume> <fpage>416</fpage>&#x2013;<lpage>424</lpage>. <pub-id pub-id-type="doi">10.1016/S0926-6410(03)00056-9</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Golby</surname> <given-names>A. J.</given-names></name> <name><surname>Gabrieli</surname> <given-names>J. D.</given-names></name> <name><surname>Chiao</surname> <given-names>J. Y.</given-names></name> <name><surname>Eberhardt</surname> <given-names>J. L.</given-names></name></person-group> (<year>2001</year>). <article-title>Differential responses in the fusiform region to same-race and other-race faces.</article-title> <source><italic>Nat. Neurosci.</italic></source> <volume>4</volume> <fpage>845</fpage>&#x2013;<lpage>850</lpage>. <pub-id pub-id-type="doi">10.1038/90565</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Graham</surname> <given-names>J. A.</given-names></name> <name><surname>Jouhar</surname> <given-names>A. J.</given-names></name></person-group> (<year>1981</year>). <article-title>The effects of cosmetics on person perception.</article-title> <source><italic>Int. J. Cosmet. Sci.</italic></source> <volume>3</volume> <fpage>199</fpage>&#x2013;<lpage>210</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-2494.1981.tb00283.x</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haxby</surname> <given-names>J. V.</given-names></name> <name><surname>Hoffman</surname> <given-names>E. A.</given-names></name> <name><surname>Gobbini</surname> <given-names>M. I.</given-names></name></person-group> (<year>2000</year>). <article-title>The distributed human neural system for face perception.</article-title> <source><italic>Trends Cogn. Sci.</italic></source> <volume>4</volume> <fpage>223</fpage>&#x2013;<lpage>233</lpage>. <pub-id pub-id-type="doi">10.1016/S1364-6613(00)01482-0</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Herrmann</surname> <given-names>M. J.</given-names></name> <name><surname>Ehlis</surname> <given-names>A. C.</given-names></name> <name><surname>Muehlberger</surname> <given-names>A.</given-names></name> <name><surname>Fallgatter</surname> <given-names>A. J.</given-names></name></person-group> (<year>2005</year>). <article-title>Source localization of early stages of face processing.</article-title> <source><italic>Brain Topogr.</italic></source> <volume>18</volume> <fpage>77</fpage>&#x2013;<lpage>85</lpage>. <pub-id pub-id-type="doi">10.1007/s10548-005-0277-7</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Alain</surname> <given-names>C.</given-names></name> <name><surname>Sedore</surname> <given-names>K.</given-names></name> <name><surname>McIntosh</surname> <given-names>A. R.</given-names></name></person-group> (<year>2007</year>). <article-title>Early face processing specificity: it&#x2019;s in the eyes!</article-title> <source><italic>J. Cogn. Neurosci.</italic></source> <volume>19</volume> <fpage>1815</fpage>&#x2013;<lpage>1826</lpage>. <pub-id pub-id-type="doi">10.1162/jocn.2007.19.11.1815</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Latinus</surname> <given-names>M.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2006</year>). <article-title>Face, eye and object early processing: what is the face specificity?</article-title> <source><italic>Neuroimage</italic></source> <volume>29</volume> <fpage>667</fpage>&#x2013;<lpage>676</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2005.07.041</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2002</year>). <article-title>Inversion and contrast polarity reversal affect both encoding and recognition processes of unfamiliar faces: a repetition study using ERPs.</article-title> <source><italic>Neuroimage</italic></source> <volume>15</volume> <fpage>353</fpage>&#x2013;<lpage>372</lpage>. <pub-id pub-id-type="doi">10.1006/nimg.2001.0982</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2004a</year>). <article-title>N170 or N1? Spatiotemporal differences between object and face processing using ERPs.</article-title> <source><italic>Cereb. Cortex</italic></source> <volume>14</volume> <fpage>132</fpage>&#x2013;<lpage>142</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhg111</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Itier</surname> <given-names>R. J.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2004b</year>). <article-title>Source analysis of the N170 to faces and objects.</article-title> <source><italic>Neuroreport</italic></source> <volume>15</volume> <fpage>1261</fpage>&#x2013;<lpage>1265</lpage>. <pub-id pub-id-type="doi">10.1097/01.wnr.0000127827.73576.d8</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Janik</surname> <given-names>S. W.</given-names></name> <name><surname>Wellens</surname> <given-names>A. R.</given-names></name> <name><surname>Goldberg</surname> <given-names>M. L.</given-names></name> <name><surname>Dell&#x2019;Osso</surname> <given-names>L. F.</given-names></name></person-group> (<year>1978</year>). <article-title>Eyes as the center of focus in the visual examination of human faces.</article-title> <source><italic>Percept. Mot. Skills</italic></source> <volume>47</volume> <fpage>857</fpage>&#x2013;<lpage>858</lpage>. <pub-id pub-id-type="doi">10.2466/pms.1978.47.3.857</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jemel</surname> <given-names>B.</given-names></name> <name><surname>George</surname> <given-names>N.</given-names></name> <name><surname>Chaby</surname> <given-names>L.</given-names></name> <name><surname>Fiori</surname> <given-names>N.</given-names></name> <name><surname>Renault</surname> <given-names>B.</given-names></name></person-group> (<year>1999</year>). <article-title>Differential processing of part-to-whole and part-to-part face priming: an ERP study.</article-title> <source><italic>Neuroreport</italic></source> <volume>10</volume> <fpage>1069</fpage>&#x2013;<lpage>1075</lpage>. <pub-id pub-id-type="doi">10.1097/00001756-199904060-00031</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jones</surname> <given-names>A. L.</given-names></name> <name><surname>Kramer</surname> <given-names>R. S.</given-names></name> <name><surname>Ward</surname> <given-names>R.</given-names></name></person-group> (<year>2014</year>). <article-title>Miscalibrations in judgements of attractiveness with cosmetics.</article-title> <source><italic>Q. J. Exp. Psychol.</italic></source> <volume>67</volume> <fpage>2060</fpage>&#x2013;<lpage>2068</lpage>. <pub-id pub-id-type="doi">10.1080/17470218.2014.908932</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jones</surname> <given-names>A. L.</given-names></name> <name><surname>Kramer</surname> <given-names>R. S. S.</given-names></name></person-group> (<year>2015</year>). <article-title>Facial cosmetics have little effect on attractiveness judgments compared with identity.</article-title> <source><italic>Perception</italic></source> <volume>44</volume> <fpage>79</fpage>&#x2013;<lpage>86</lpage>. <pub-id pub-id-type="doi">10.1068/p7904</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jones</surname> <given-names>A. L.</given-names></name> <name><surname>Russell</surname> <given-names>R.</given-names></name> <name><surname>Ward</surname> <given-names>R.</given-names></name></person-group> (<year>2015</year>). <article-title>Cosmetics alter biologically based factors of beauty: evidence from facial contrast.</article-title> <source><italic>Evol. Psychol.</italic></source> <volume>13</volume> <fpage>210</fpage>&#x2013;<lpage>229</lpage>. <pub-id pub-id-type="doi">10.1177/147470491501300113</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jurcak</surname> <given-names>V.</given-names></name> <name><surname>Tsuzuki</surname> <given-names>D.</given-names></name> <name><surname>Dan</surname> <given-names>I.</given-names></name></person-group> (<year>2007</year>). <article-title>10/20, 10/10, and 10/5 systems revisited: their validity as relative head-surface-based positioning systems.</article-title> <source><italic>Neuroimage</italic></source> <volume>34</volume> <fpage>1600</fpage>&#x2013;<lpage>1611</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2006.09.024</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kov&#x00E1;cs</surname> <given-names>G.</given-names></name> <name><surname>Zimmer</surname> <given-names>M.</given-names></name> <name><surname>Bank&#x00F3;</surname> <given-names>E.</given-names></name> <name><surname>Harza</surname> <given-names>I.</given-names></name> <name><surname>Antal</surname> <given-names>A.</given-names></name> <name><surname>Vidny&#x00E1;nszky</surname> <given-names>Z.</given-names></name></person-group> (<year>2006</year>). <article-title>Electrophysiological correlates of visual adaptation to faces and body parts in humans.</article-title> <source><italic>Cereb. Cortex</italic></source> <volume>16</volume> <fpage>742</fpage>&#x2013;<lpage>753</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhj020</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lepp&#x00E4;nen</surname> <given-names>J. M.</given-names></name> <name><surname>Kauppinen</surname> <given-names>P.</given-names></name> <name><surname>Peltola</surname> <given-names>M. J.</given-names></name> <name><surname>Hietanen</surname> <given-names>J. K.</given-names></name></person-group> (<year>2007</year>). <article-title>Differential electrocortical responses to increasing intensities of fearful and happy emotional expressions.</article-title> <source><italic>Brain Res.</italic></source> <volume>1166</volume> <fpage>103</fpage>&#x2013;<lpage>109</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2007.06.060</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Liu</surname> <given-names>J.</given-names></name> <name><surname>Harris</surname> <given-names>A.</given-names></name> <name><surname>Kanwisher</surname> <given-names>N.</given-names></name></person-group> (<year>2002</year>). <article-title>Stages of processing in face perception: an MEG study.</article-title> <source><italic>Nat. Neurosci.</italic></source> <volume>5</volume> <fpage>910</fpage>&#x2013;<lpage>916</lpage>. <pub-id pub-id-type="doi">10.1038/nn909</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Luo</surname> <given-names>S.</given-names></name> <name><surname>Luo</surname> <given-names>W.</given-names></name> <name><surname>He</surname> <given-names>W.</given-names></name> <name><surname>Chen</surname> <given-names>X.</given-names></name> <name><surname>Luo</surname> <given-names>Y.</given-names></name></person-group> (<year>2013</year>). <article-title>P1 and N170 components distinguish human-like and animal-like makeup stimuli.</article-title> <source><italic>Neuroreport</italic></source> <volume>24</volume> <fpage>482</fpage>&#x2013;<lpage>486</lpage>. <pub-id pub-id-type="doi">10.1097/WNR.0b013e328361cf08</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Magnuski</surname> <given-names>M.</given-names></name> <name><surname>Gola</surname> <given-names>M.</given-names></name></person-group> (<year>2013</year>). <article-title>It&#x2019;s not only in the eyes: nonlinear relationship between face orientation and N170 amplitude irrespective of eye presence.</article-title> <source><italic>Int. J. Psychophysiol.</italic></source> <volume>89</volume> <fpage>358</fpage>&#x2013;<lpage>365</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2013.04.016</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mulhern</surname> <given-names>R.</given-names></name> <name><surname>Fieldman</surname> <given-names>G.</given-names></name> <name><surname>Hussey</surname> <given-names>T.</given-names></name> <name><surname>L&#x00E9;v&#x00EA;que</surname> <given-names>J. L.</given-names></name> <name><surname>Pineau</surname> <given-names>P.</given-names></name></person-group> (<year>2003</year>). <article-title>Do cosmetics enhance female Caucasian facial attractiveness?</article-title> <source><italic>Int. J. Cosmet. Sci.</italic></source> <volume>25</volume> <fpage>199</fpage>&#x2013;<lpage>205</lpage>. <pub-id pub-id-type="doi">10.1046/j.1467-2494.2003.00188.x</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nemrodov</surname> <given-names>D.</given-names></name> <name><surname>Anderson</surname> <given-names>T.</given-names></name> <name><surname>Preston</surname> <given-names>F. F.</given-names></name> <name><surname>Itier</surname> <given-names>R. J.</given-names></name></person-group> (<year>2014</year>). <article-title>Early sensitivity for eyes within faces: a new neuronal account of holistic and featural processing.</article-title> <source><italic>Neuroimage</italic></source> <volume>97</volume> <fpage>81</fpage>&#x2013;<lpage>94</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2014.04.042</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nemrodov</surname> <given-names>D.</given-names></name> <name><surname>Itier</surname> <given-names>R. J.</given-names></name></person-group> (<year>2011</year>). <article-title>The role of eyes in early face processing: a rapid adaptation study of the inversion effect.</article-title> <source><italic>Br. J. Psychol.</italic></source> <volume>102</volume> <fpage>783</fpage>&#x2013;<lpage>798</lpage>. <pub-id pub-id-type="doi">10.1111/j.2044-8295.2011.02033.x</pub-id></citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Okazaki</surname> <given-names>Y.</given-names></name> <name><surname>Abrahamyan</surname> <given-names>A.</given-names></name> <name><surname>Stevens</surname> <given-names>C. J.</given-names></name> <name><surname>Ioannides</surname> <given-names>A. A.</given-names></name></person-group> (<year>2008</year>). <article-title>The timing of face selectivity and attentional modulation in visual processing.</article-title> <source><italic>Neuroscience</italic></source> <volume>152</volume> <fpage>1130</fpage>&#x2013;<lpage>1144</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroscience.2008.01.056</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Oostenveld</surname> <given-names>R.</given-names></name> <name><surname>Praamstra</surname> <given-names>P.</given-names></name></person-group> (<year>2001</year>). <article-title>The five percent electrode system for high-resolution EEG and ERP measurements.</article-title> <source><italic>Clin. Neurophysiol.</italic></source> <volume>112</volume> <fpage>713</fpage>&#x2013;<lpage>719</lpage>. <pub-id pub-id-type="doi">10.1016/S1388-2457(00)00527-7</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peng</surname> <given-names>M.</given-names></name> <name><surname>De Beuckelaer</surname> <given-names>A.</given-names></name> <name><surname>Yuan</surname> <given-names>L.</given-names></name> <name><surname>Zhou</surname> <given-names>R.</given-names></name></person-group> (<year>2012</year>). <article-title>The processing of anticipated and unanticipated fearful faces: an ERP study.</article-title> <source><italic>Neurosci. Lett.</italic></source> <volume>526</volume> <fpage>85</fpage>&#x2013;<lpage>90</lpage>. <pub-id pub-id-type="doi">10.1016/j.neulet.2012.08.009</pub-id></citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pesciarelli</surname> <given-names>F.</given-names></name> <name><surname>Leo</surname> <given-names>I.</given-names></name> <name><surname>Sarlo</surname> <given-names>M.</given-names></name></person-group> (<year>2016</year>). <article-title>Implicit processing of the eyes and mouth: evidence from human electrophysiology.</article-title> <source><italic>PLoS ONE</italic></source> <volume>11</volume>:<issue>e0147415</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0147415</pub-id></citation></ref>
<ref id="B51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Porcheron</surname> <given-names>A.</given-names></name> <name><surname>Mauger</surname> <given-names>E.</given-names></name> <name><surname>Russell</surname> <given-names>R.</given-names></name></person-group> (<year>2013</year>). <article-title>Aspects of facial contrast decrease with age and are cues for age perception.</article-title> <source><italic>PLoS ONE</italic></source> <volume>8</volume>:<issue>e57985</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0057985</pub-id></citation></ref>
<ref id="B52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Puce</surname> <given-names>A.</given-names></name> <name><surname>Allison</surname> <given-names>T.</given-names></name> <name><surname>Gore</surname> <given-names>J. C.</given-names></name> <name><surname>McCarthy</surname> <given-names>G.</given-names></name></person-group> (<year>1995</year>). <article-title>Face-sensitive regions in human extrastriate cortex studied by functional MRI.</article-title> <source><italic>J. Neurophysiol.</italic></source> <volume>74</volume> <fpage>1192</fpage>&#x2013;<lpage>1199</lpage>.</citation></ref>
<ref id="B53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ran</surname> <given-names>G.</given-names></name> <name><surname>Zhang</surname> <given-names>Q.</given-names></name> <name><surname>Chen</surname> <given-names>X.</given-names></name> <name><surname>Pan</surname> <given-names>Y.</given-names></name></person-group> (<year>2014</year>). <article-title>The effects of prediction on the perception for own-race and other-race faces.</article-title> <source><italic>PLoS ONE</italic></source> <volume>9</volume>:<issue>e114011</issue>. <pub-id pub-id-type="doi">10.1371/journal.pone.0114011</pub-id></citation></ref>
<ref id="B54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Caharel</surname> <given-names>S.</given-names></name></person-group> (<year>2011</year>). <article-title>ERP evidence for the speed of face categorization in the human brain: disentangling the contribution of low-level visual cues from face perception.</article-title> <source><italic>Vision Res.</italic></source> <volume>51</volume> <fpage>1297</fpage>&#x2013;<lpage>1311</lpage>. <pub-id pub-id-type="doi">10.1016/j.visres.2011.04.003</pub-id></citation></ref>
<ref id="B55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Delvenne</surname> <given-names>J. F.</given-names></name> <name><surname>Debatisse</surname> <given-names>D.</given-names></name> <name><surname>Goffaux</surname> <given-names>V.</given-names></name> <name><surname>Bruyer</surname> <given-names>R.</given-names></name> <name><surname>Crommelinck</surname> <given-names>M.</given-names></name><etal/></person-group> (<year>1999</year>). <article-title>Spatio-temporal localization of the face inversion effect: an event-related potentials study.</article-title> <source><italic>Biol. Psychol.</italic></source> <volume>50</volume> <fpage>173</fpage>&#x2013;<lpage>189</lpage>. <pub-id pub-id-type="doi">10.1016/S0301-0511(99)00013-7</pub-id></citation></ref>
<ref id="B56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Jacques</surname> <given-names>C.</given-names></name></person-group> (<year>2008</year>). <article-title>Does physical interstimulus variance account for early electrophysiological face sensitive responses in the human brain? Ten lessons on the N170.</article-title> <source><italic>Neuroimage</italic></source> <volume>39</volume> <fpage>1959</fpage>&#x2013;<lpage>1979</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2007.10.011</pub-id></citation></ref>
<ref id="B57"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Russell</surname> <given-names>R.</given-names></name></person-group> (<year>2003</year>). <article-title>Sex, beauty, and the relative luminance of facial features.</article-title> <source><italic>Perception</italic></source> <volume>32</volume> <fpage>1093</fpage>&#x2013;<lpage>1107</lpage>. <pub-id pub-id-type="doi">10.1068/p5101</pub-id></citation></ref>
<ref id="B58"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Russell</surname> <given-names>R.</given-names></name></person-group> (<year>2009</year>). <article-title>A sex difference in facial pigmentation and its exaggeration by cosmetics.</article-title> <source><italic>Perception</italic></source> <volume>38</volume> <fpage>1211</fpage>&#x2013;<lpage>1219</lpage>. <pub-id pub-id-type="doi">10.1068/p6331</pub-id></citation></ref>
<ref id="B59"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Russell</surname> <given-names>R.</given-names></name> <name><surname>Porcheron</surname> <given-names>A.</given-names></name> <name><surname>Sweda</surname> <given-names>J. R.</given-names></name> <name><surname>Jones</surname> <given-names>A. L.</given-names></name> <name><surname>Mauger</surname> <given-names>E.</given-names></name> <name><surname>Morizot</surname> <given-names>F.</given-names></name></person-group> (<year>2016</year>). <article-title>Facial contrast is a cue for perceiving health from the face.</article-title> <source><italic>J. Exp. Psychol. Hum. Percept. Perform.</italic></source> <pub-id pub-id-type="doi">10.1037/xhp0000219</pub-id> <comment>[Epub ahead of print].</comment></citation></ref>
<ref id="B60"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sadeh</surname> <given-names>B.</given-names></name> <name><surname>Podlipsky</surname> <given-names>I.</given-names></name> <name><surname>Zhdanov</surname> <given-names>A.</given-names></name> <name><surname>Yovel</surname> <given-names>G.</given-names></name></person-group> (<year>2010</year>). <article-title>Event-related potential and functional MRI measures of face-selectivity are highly correlated: a simultaneous ERP-fMRI investigation.</article-title> <source><italic>Hum. Brain Mapp.</italic></source> <volume>31</volume> <fpage>1490</fpage>&#x2013;<lpage>1501</lpage>. <pub-id pub-id-type="doi">10.1002/hbm.20952</pub-id></citation></ref>
<ref id="B61"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name></person-group> (<year>2011</year>). &#x201C;<article-title>Neurophysiological correlates of face recognition</article-title>,&#x201D; in <source><italic>The Oxford Handbook of Face Perception</italic>,</source> <role>eds</role> <person-group person-group-type="editor"><name><surname>Calder</surname> <given-names>A.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name> <name><surname>Johnson</surname> <given-names>M. H.</given-names></name> <name><surname>Haxby</surname> <given-names>J. V.</given-names></name></person-group> (<publisher-loc>Oxford</publisher-loc>: <publisher-name>Oxford University Press</publisher-name>), <fpage>345</fpage>&#x2013;<lpage>366</lpage>.</citation></ref>
<ref id="B62"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stahl</surname> <given-names>J.</given-names></name> <name><surname>Wiese</surname> <given-names>H.</given-names></name> <name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name></person-group> (<year>2010</year>). <article-title>Learning task affects ERP-correlates of the own-race bias, but not recognition memory performance.</article-title> <source><italic>Neuropsychologia</italic></source> <volume>48</volume> <fpage>2027</fpage>&#x2013;<lpage>2040</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2010.03.024</pub-id></citation></ref>
<ref id="B63"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stephen</surname> <given-names>I. D.</given-names></name> <name><surname>McKeegan</surname> <given-names>A. M.</given-names></name></person-group> (<year>2010</year>). <article-title>Lip colour affects perceived sex typicality and attractiveness of human faces.</article-title> <source><italic>Perception</italic></source> <volume>39</volume> <fpage>1104</fpage>&#x2013;<lpage>1110</lpage>. <pub-id pub-id-type="doi">10.1068/p6730</pub-id></citation></ref>
<ref id="B64"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Susac</surname> <given-names>A.</given-names></name> <name><surname>Ilmoniemi</surname> <given-names>R. J.</given-names></name> <name><surname>Pihko</surname> <given-names>E.</given-names></name> <name><surname>Nurminen</surname> <given-names>J.</given-names></name> <name><surname>Supek</surname> <given-names>S.</given-names></name></person-group> (<year>2009</year>). <article-title>Early dissociation of face and object processing: a magnetoencephalographic study.</article-title> <source><italic>Hum. Brain Mapp.</italic></source> <volume>30</volume> <fpage>917</fpage>&#x2013;<lpage>927</lpage>. <pub-id pub-id-type="doi">10.1002/hbm.20557</pub-id></citation></ref>
<ref id="B65"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tarkiainen</surname> <given-names>A.</given-names></name> <name><surname>Cornelissen</surname> <given-names>P. L.</given-names></name> <name><surname>Salmelin</surname> <given-names>R.</given-names></name></person-group> (<year>2002</year>). <article-title>Dynamics of visual feature analysis and object-level processing in face versus letter-string perception.</article-title> <source><italic>Brain</italic></source> <volume>125</volume> <fpage>1125</fpage>&#x2013;<lpage>1136</lpage>. <pub-id pub-id-type="doi">10.1093/brain/awf112</pub-id></citation></ref>
<ref id="B66"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Thierry</surname> <given-names>G.</given-names></name> <name><surname>Martin</surname> <given-names>C. D.</given-names></name> <name><surname>Downing</surname> <given-names>P.</given-names></name> <name><surname>Pegna</surname> <given-names>A. J.</given-names></name></person-group> (<year>2007</year>). <article-title>Controlling for interstimulus perceptual variance abolishes N170 face selectivity.</article-title> <source><italic>Nat. Neurosci.</italic></source> <volume>10</volume> <fpage>505</fpage>&#x2013;<lpage>511</lpage>. <pub-id pub-id-type="doi">10.1038/nn1864</pub-id></citation></ref>
<ref id="B67"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ueda</surname> <given-names>S.</given-names></name> <name><surname>Koyama</surname> <given-names>T.</given-names></name></person-group> (<year>2011</year>). <article-title>Influence of eye make-up on the perception of gaze direction.</article-title> <source><italic>Int. J. Cosmet. Sci.</italic></source> <volume>33</volume> <fpage>514</fpage>&#x2013;<lpage>518</lpage>. <pub-id pub-id-type="doi">10.1111/j.1468-2494.2011.00664.x</pub-id></citation></ref>
<ref id="B68"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vizioli</surname> <given-names>L.</given-names></name> <name><surname>Foreman</surname> <given-names>K.</given-names></name> <name><surname>Rousselet</surname> <given-names>G. A.</given-names></name> <name><surname>Caldara</surname> <given-names>R.</given-names></name></person-group> (<year>2010a</year>). <article-title>Inverting faces elicits sensitivity to race on the N170 component: a cross-cultural study.</article-title> <source><italic>J. Vis.</italic></source> <volume>10</volume> <fpage>1</fpage>&#x2013;<lpage>23</lpage>. <pub-id pub-id-type="doi">10.1167/10.1.15</pub-id></citation></ref>
<ref id="B69"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vizioli</surname> <given-names>L.</given-names></name> <name><surname>Rousselet</surname> <given-names>G. A.</given-names></name> <name><surname>Caldara</surname> <given-names>R.</given-names></name></person-group> (<year>2010b</year>). <article-title>Neural repetition suppression to identity is abolished by other-race faces.</article-title> <source><italic>Proc. Natl. Acad. Sci. U.S.A.</italic></source> <volume>107</volume> <fpage>20081</fpage>&#x2013;<lpage>20086</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1005751107</pub-id></citation></ref>
<ref id="B70"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Watanabe</surname> <given-names>S.</given-names></name> <name><surname>Kakigi</surname> <given-names>R.</given-names></name> <name><surname>Koyama</surname> <given-names>S.</given-names></name> <name><surname>Kirino</surname> <given-names>E.</given-names></name></person-group> (<year>1999a</year>). <article-title>Human face perception traced by magneto- and electro-encephalography.</article-title> <source><italic>Cogn. Brain Res.</italic></source> <volume>8</volume> <fpage>125</fpage>&#x2013;<lpage>142</lpage>. <pub-id pub-id-type="doi">10.1016/S0926-6410(99)00013-0</pub-id></citation></ref>
<ref id="B71"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Watanabe</surname> <given-names>S.</given-names></name> <name><surname>Kakigi</surname> <given-names>R.</given-names></name> <name><surname>Koyama</surname> <given-names>S.</given-names></name> <name><surname>Kirino</surname> <given-names>E.</given-names></name></person-group> (<year>1999b</year>). <article-title>It takes longer to recognize the eyes than the whole face in humans.</article-title> <source><italic>Neuroreport</italic></source> <volume>10</volume> <fpage>2193</fpage>&#x2013;<lpage>2198</lpage>. <pub-id pub-id-type="doi">10.1097/00001756-199907130-00035</pub-id></citation></ref>
<ref id="B72"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wiese</surname> <given-names>H.</given-names></name> <name><surname>Kaufmann</surname> <given-names>J. M.</given-names></name> <name><surname>Schweinberger</surname> <given-names>S. R.</given-names></name></person-group> (<year>2012</year>). <article-title>The neural signature of the own-race bias: evidence from event-related potentials.</article-title> <source><italic>Cereb. Cortex</italic></source> <volume>24</volume> <fpage>826</fpage>&#x2013;<lpage>835</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhs369</pub-id></citation></ref>
<ref id="B73"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zimmer</surname> <given-names>M.</given-names></name> <name><surname>Kov&#x00E1;cs</surname> <given-names>G.</given-names></name></person-group> (<year>2011</year>). <article-title>Electrophysiological correlates of face distortion after-effects.</article-title> <source><italic>Q. J. Exp. Psychol.</italic></source> <volume>64</volume> <fpage>533</fpage>&#x2013;<lpage>544</lpage>. <pub-id pub-id-type="doi">10.1080/17470218.2010.501964</pub-id></citation></ref>
</ref-list>
<fn-group>
<fn id="fn01"><label>1</label><p><ext-link ext-link-type="uri" xlink:href="http://www.air-lights.com/recruit.html">http://www.air-lights.com/recruit.html</ext-link></p></fn>
<fn id="fn02"><label>2</label><p><ext-link ext-link-type="uri" xlink:href="http://ameblo.jp/studioaquarius/entry-11473277532.html">http://ameblo.jp/studioaquarius/entry-11473277532.html</ext-link></p></fn>
<fn id="fn03"><label>3</label><p><ext-link ext-link-type="uri" xlink:href="http://jp.perfectcorp.com/#ymk">http://jp.perfectcorp.com/#ymk</ext-link></p></fn>
</fn-group>
</back>
</article>