<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychology</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychology</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Research Foundation</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2010.00169</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Neural Markers of Opposite-Sex Bias in Face Processing</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Proverbio</surname> <given-names>Alice Mado</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="author-notes" rid="fn001">&#x0002A;</xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Riva</surname> <given-names>Federica</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Martin</surname> <given-names>Eleonora</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Zani</surname> <given-names>Alberto</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Psychology, University of Milano-Bicocca</institution> <country>Milan, Italy</country></aff>
<aff id="aff2"><sup>2</sup><institution>Institute of Bioimaging and Molecular Physiology, National Research Council</institution> <country>Segrate, Milan, Italy</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Guillaume A. Rousselet, University of Glasgow, UK</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Pejman Sehatpour, Nathan Kline Institute, USA; Lisa R. Betts, McMaster University, Canada</p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x0002A;Correspondence: Alice Mado Proverbio, Department of Psychology, University of Milano-Bicocca, Via dell&#x00027;Innovazione 10, 20126 Milan, Italy. e-mail: <email>mado.proverbio&#x00040;unimib.it</email></p></fn>
<fn fn-type="other" id="fn002"><p>This article was submitted to Frontiers in Perception Science, a specialty of Frontiers in Psychology.</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>18</day>
<month>10</month>
<year>2010</year>
</pub-date>
<pub-date pub-type="collection">
<year>2010</year>
</pub-date>
<volume>1</volume>
<elocation-id>169</elocation-id>
<history>
<date date-type="received">
<day>20</day>
<month>04</month>
<year>2010</year>
</date>
<date date-type="accepted">
<day>24</day>
<month>09</month>
<year>2010</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2010 Proverbio, Riva, Martin and Zani.</copyright-statement>
<copyright-year>2010</copyright-year>
<license license-type="open-access" xlink:href="http://www.frontiersin.org/licenseagreement"><p>This is an open-access article subject to an exclusive license agreement between the authors and the Frontiers Research Foundation, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited.</p></license>
</permissions>
<abstract>
<p>Some behavioral and neuroimaging studies suggest that adults prefer to view attractive faces of the opposite sex more than attractive faces of the same sex. However, unlike the other-race face effect (Caldara et al., <xref ref-type="bibr" rid="B4">2004</xref>), little is known regarding the existence of an opposite-/same-sex bias in face processing. In this study, the faces of 130 attractive male and female adults were foveally presented to 40 heterosexual university students (20 men and 20 women) who were engaged in a secondary perceptual task (landscape detection). The automatic processing of face gender was investigated by recording ERPs from 128 scalp sites. Neural markers of opposite- vs. same-sex bias in face processing included larger and earlier centro&#x02013;parietal N400s in response to faces of the opposite sex and a larger late positivity (LP) to same-sex faces. Analysis of intra-cortical neural generators (swLORETA) showed that face-processing-related (FG, BA37, BA20/21) and emotion-related brain areas (the right parahippocampal gyrus, BA35; uncus, BA36/38; and the cingulate gyrus, BA24) showed higher activation in response to opposite- than same-sex faces. The results of this analysis, along with data obtained from ERP recordings, support the hypothesis that both genders process opposite-sex faces differently from same-sex faces. The data also suggest a hemispheric asymmetry in the processing of opposite-/same-sex faces, with the right hemisphere involved in processing same-sex faces and the left hemisphere involved in processing faces of the opposite sex. These findings are consistent with previous literature suggesting a right lateralization for the representation of self-image and body awareness.</p>
</abstract>
<kwd-group>
<kwd>ERPs</kwd>
<kwd>face coding</kwd>
<kwd>social cognition</kwd>
<kwd>sex differences</kwd>
<kwd>visual perception</kwd>
<kwd>body awareness</kwd>
<kwd>self-representation</kwd>
<kwd>hemispheric asymmetry</kwd>
</kwd-group>
<counts>
<fig-count count="9"/>
<table-count count="3"/>
<equation-count count="0"/>
<ref-count count="51"/>
<page-count count="12"/>
<word-count count="7810"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="introduction">
<title>Introduction</title>
<p>Humans quickly process faces to identify conspecific features and adjust their behavior based on this identification, reacting differently to faces of the same/opposite sex, competitors/friends, and elderly/youngsters. Studies based on psychological ratings (Perrett et al., <xref ref-type="bibr" rid="B37">1998</xref>; Cornwell et al., <xref ref-type="bibr" rid="B6">2004</xref>) and brain activation (Kranz and Ishai, <xref ref-type="bibr" rid="B27">2006</xref>) have shown that individuals prefer viewing attractive faces of the opposite sex over those of the same sex (Penton-Voak et al., <xref ref-type="bibr" rid="B36">2001</xref>; Little et al., <xref ref-type="bibr" rid="B31">2002</xref>; Rhodes, <xref ref-type="bibr" rid="B41">2006</xref>). This bias may result from the fact that sexually dimorphic facial characteristics convey information about the quality of potential mates. Kranz and Ishai (<xref ref-type="bibr" rid="B27">2006</xref>) used fMRI to scan a group of female and male subjects looking at male and female faces; the study took into account both the gender of the faces and the sexual preference of the viewers. In an attractiveness rating task, heterosexual women and homosexual men exhibited a significantly greater response in the thalamus and the orbitofrontal cortex when viewing male rather than female faces, whereas heterosexual men and homosexual women responded significantly more strongly to female than male faces. In a related study, Conway et al. (<xref ref-type="bibr" rid="B5">2008</xref>) found that subjects preferred a direct (vs. averted) gaze when judging the attractiveness of happy faces relative to unhappy faces and that this preference for direct gaze was particularly pronounced in judgments of opposite-sex faces. 
Several fMRI studies have investigated the same-/opposite-sex effect using different face-processing paradigms (Fischer et al., <xref ref-type="bibr" rid="B16">2004a</xref>,<xref ref-type="bibr" rid="B17">b</xref>; Turk et al., <xref ref-type="bibr" rid="B46">2004</xref>). For instance, Turk et al. (<xref ref-type="bibr" rid="B46">2004</xref>) asked subjects to view pictures of men or women to be selected for a dinner date. They observed a greater blood oxygen level-dependent (BOLD) response for consequential decisions (opposite-sex dates) compared to inconsequential decisions (same-sex dates) in the dorsal anterior cingulate cortex (ACC), the medial surface of the superior frontal gyrus or pre-SMA (BA 8), and two areas of the right ventral temporal cortex. These ventral temporal regions, which are located near the putative fusiform face area (Kanwisher et al., <xref ref-type="bibr" rid="B22">1997</xref>), may be associated with attentive face processing in consequential decision making (i.e., selecting a dinner date). A bias toward opposite-sex over same-sex individuals has been observed in the auditory modality as well. For example, some studies (e.g., Jones et al., <xref ref-type="bibr" rid="B21">2010</xref>) have shown that women prefer &#x0201C;masculine&#x0201D; voices, and men prefer &#x0201C;feminine&#x0201D; voices (Welling et al., <xref ref-type="bibr" rid="B50">2008</xref>), preferences that are thought to assist in the identification of high-quality (e.g., healthy) mates. Consistent with these findings, studies manipulating the pitch of voice recordings have found that raising the pitch of women&#x00027;s voices (Feinberg et al., <xref ref-type="bibr" rid="B13">2008</xref>) and lowering the pitch of men&#x00027;s voices (Feinberg et al., <xref ref-type="bibr" rid="B14">2005</xref>; Vukovic et al., <xref ref-type="bibr" rid="B48">2008</xref>) increased vocal attractiveness.</p>
<p>It should be noted that a preference for the opposite sex has not been consistently observed in face processing because this processing also depends on the emotional content of faces. Fischer et al. (<xref ref-type="bibr" rid="B16">2004a</xref>) used fMRI to record the BOLD signal in 24 men and women while the subjects viewed angry, fearful, or neutral male and female faces. In the men, activity in the occipital and the anterior cingulate cortices increased when confronted with angry male faces relative to angry female faces, thus suggesting that men react more emotionally when confronted with angry male faces. Therefore, this study did not observe an opposite-sex bias. However, in a second study (Fischer et al., <xref ref-type="bibr" rid="B17">2004b</xref>), the same group passively exposed viewers to neutral male and female faces and found that, during exposure to faces of the opposite vs. the same sex, men displayed increased activation in the left amygdala and adjacent anterior temporal regions. Therefore, it can be concluded that the perception of emotional faces, and particularly of angry expressions, does not result in a bias toward opposite- vs. same-sex faces because there is an interaction between face gender and affective valence, with males responding more strongly to aggressive males than females.</p>
<p>Both the neuroimaging evidence and the results of behavioral studies are complex and conflicting, and little is known regarding the electrophysiological indices of opposite-sex bias in face processing. Studies using EEGs or ERPs to measure opposite-sex bias are rare and inconsistent (Oliver-Rodr&#x000ED;guez et al., <xref ref-type="bibr" rid="B32">1999</xref>; Langeslag et al., <xref ref-type="bibr" rid="B30">2007</xref>; Suyama et al., <xref ref-type="bibr" rid="B45">2008</xref>; Sun et al., <xref ref-type="bibr" rid="B44">2010</xref>). For example, a recent study (Sun et al., <xref ref-type="bibr" rid="B44">2010</xref>) using ERPs to investigate face-processing mechanisms related to gender and sexual orientation provided interesting data on sex differences in face coding but did not examine whether the viewer&#x00027;s sex affected the processing of male and female faces (same-/opposite-sex effect).</p>
<p>An ERP study by Suyama et al. (<xref ref-type="bibr" rid="B45">2008</xref>) employed a gender discrimination task and found that men exhibited a larger P2 component to female faces compared to male faces at about 220&#x02009;ms over left temporal sites. A similar increase was observed in women processing male faces, but it peaked at approximately 170&#x02009;ms over central sites. This study thus observed an opposite-sex bias but found that its latency and topography differed between men and women. In an interesting passive-viewing study (Oliver-Rodr&#x000ED;guez et al., <xref ref-type="bibr" rid="B32">1999</xref>), ERPs were recorded in male and female participants in response to faces of both genders. After ERP recording, the viewers were asked to rate each face on a five-point attractiveness scale. In male viewers as well as in preovulatory and postovulatory female viewers, a positive correlation was observed between the ratings and the P300 amplitudes in response to opposite-sex faces. The modulation in P300 amplitude was thought to be related to the emotional value of the stimulus. A similar interpretation has been used to explain the finding that male and female subjects display larger P300 amplitudes (Langeslag et al., <xref ref-type="bibr" rid="B29">2008</xref>) and late positive potentials (LP) (Langeslag et al., <xref ref-type="bibr" rid="B30">2007</xref>) in response to photographs of romantic partners relative to photographs of opposite-sex friends. Early posterior negativity (EPN, &#x0223C;250&#x02009;ms) and increased LP amplitude were observed in response to attractive faces compared to non-attractive faces in a study that did not consider the sex of either the viewers or the photographed faces (Werheid et al., <xref ref-type="bibr" rid="B51">2007</xref>). Therefore, both studies found an enlarged LP for attractive (vs. unattractive) or beloved (vs. familiar) faces, suggesting a possible effect of emotional arousal. 
These findings agree with the previously discussed fMRI results and support the hypothesis that the processing of non-emotional opposite-sex faces is more effective than the processing of same-sex faces (e.g., Fischer et al., <xref ref-type="bibr" rid="B17">2004b</xref>; Kranz and Ishai, <xref ref-type="bibr" rid="B27">2006</xref>). However, in the first study (Langeslag et al., <xref ref-type="bibr" rid="B30">2007</xref>), the opposite-sex effect was confounded by the effect of romantic involvement, whereas in the second study (Werheid et al., <xref ref-type="bibr" rid="B51">2007</xref>), neither the gender of the viewers nor the sex of the faces was considered.</p>
<p>The present study aimed to determine how early in the visual processing stream an opposite-sex bias exists while controlling for the emotional content and the attractiveness of the faces. Pictures of attractive men and women were presented to casually recruited heterosexual men and women while they were engaged in a secondary target-detection task. The ERPs associated with faces of the opposite sex were averaged across the sexes to identify neural markers for opposite-sex bias in face processing while controlling for sex differences in face processing (e.g., sex differences in social responsiveness, Proverbio et al., <xref ref-type="bibr" rid="B40">2008</xref>; or face decoding, Proverbio et al., <xref ref-type="bibr" rid="B38">2006</xref>).</p>
<p>The current literature suggests that the processing of opposite-sex faces would occur earlier than that of same-sex faces and that the expedited processing would result in earlier peaks of ERP components. Furthermore, in light of the neurometabolic studies (e.g., Fischer et al., <xref ref-type="bibr" rid="B17">2004b</xref>; Kranz and Ishai, <xref ref-type="bibr" rid="B27">2006</xref>) that provide evidence that opposite-sex processing is associated with an increase in brain activity, which reflects attentive/effective processing, we expected an enhancement in the mean amplitude of some ERP components in response to opposite-sex faces compared to same-sex faces. We assumed that the increases in bio-electrical potentials in response to opposite-sex faces compared to same-sex faces reflected the processing of other people&#x00027;s faces. In contrast, the larger potentials in response to same-sex faces reflected self-sex, self-representation, and body awareness processes linked to the ability to distinguish between self and others, which are thought to be right-lateralized brain functions (Keenan et al., <xref ref-type="bibr" rid="B24">2000</xref>, <xref ref-type="bibr" rid="B25">2001</xref>, <xref ref-type="bibr" rid="B26">2003</xref>).</p>
</sec>
<sec sec-type="materials|methods">
<title>Materials and Methods</title>
<sec>
<title>Participants</title>
<p>A total of 40 university students (20 women and 20 men) ranging in age from 20 to 30 years (mean age&#x02009;&#x0003D;&#x02009;22.3 years, SD&#x02009;&#x0003D;&#x02009;2.7; women&#x02009;&#x0003D;&#x02009;21.8, men&#x02009;&#x0003D;&#x02009;22.8) voluntarily participated in the study. All participants had normal or corrected-to-normal vision with right-eye dominance. All participants were right-handed as assessed by the Edinburgh Inventory, and no participants had any left-handed relatives. All participants provided written informed consent. All experiments were conducted according to the ethical recommendations of the Declaration of Helsinki, were approved by the Ethical Committee of the Italian National Research Council (CNR) and were in compliance with the APA standards for the treatment of human volunteers (1992, American Psychological Association). The participants earned academic credits for their participation. The data from one male and one female subject were discarded because of excessive eye movement; therefore, equal numbers of male and female subjects were preserved.</p>
</sec>
<sec>
<title>Stimuli and procedures</title>
<p>The participants were seated in a dimly lit, electrically shielded cubicle and asked to focus both eyes on a fixation point in the center of a visual display positioned 114&#x02009;cm away. The participants were instructed to avoid eye or body movements. The faces of 130 attractive, adult males and females (ranging from 18 to 50 years of age) were used as stimuli. Face attractiveness was established by four independent judges but without a specific rating procedure. The faces were presented for 800&#x02009;ms at a screen contrast of 40%. All faces had the same average luminance of 16.4&#x02009;cd/m<sup>2</sup>, and the eyes of the presented faces were aligned to the fixation point. All faces were smiling or showing a positive facial expression (see Figure <xref ref-type="fig" rid="F1">1</xref>). The faces were presented randomly mixed with equiluminant, infrequent targets (3&#x02013;7 per run) depicting landscapes. The stimulus size was 7&#x000B0; 9&#x02032; 56&#x02033; &#x000D7;&#x02009;8&#x000B0; 23&#x02032; 1&#x02033;. The ISI ranged from 1300 to 1500&#x02009;ms. The outer background was dark gray. The task consisted of detecting landscape images; the viewing of male and female faces was passive. Participants pressed a response key to targets with the index finger of the left or right hand. The two hands were used alternately during the recording session, and the order of response hand was counterbalanced across subjects.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p><bold>Timeline of stimulus presentation</bold>. Each picture was displayed for 800&#x02009;ms and followed by a random ISI. The task consisted of responding to landscapes as quickly and accurately as possible.</p></caption>
<graphic xlink:href="fpsyg-01-00169-g001.tif"/>
</fig>
</sec>
<sec>
<title>EEG recording and analysis</title>
<p>Electroencephalography was continuously recorded from 128 sites at a sampling rate of 512&#x02009;Hz. Vertical eye movements were recorded by two electrodes placed below and above the right eye, whereas horizontal movements were recorded from electrodes placed at the outer canthi of the eyes. Linked ears served as the reference lead. EEG and electro-oculogram (EOG) were amplified with a half-amplitude band pass of 0.016&#x02013;100&#x02009;Hz. Electrode impedance was maintained below 5&#x02009;k&#x003A9;. EEG epochs were synchronized with the onset of stimulus presentation and analyzed by ANT-EEProbe software. Computerized artifact rejection was performed before averaging in order to discard epochs in which eye movements, blinks, excessive muscle potentials, or amplifier blocking occurred. EEG epochs associated with incorrect behavioral responses were also excluded. The artifact rejection criterion was a peak-to-peak amplitude exceeding 70&#x02009;&#x003BC;V, and the rejection rate was 5% (min 2%, max 7%).</p>
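The peak-to-peak rejection criterion described above can be expressed as a minimal sketch (illustrative only; the actual analysis used ANT-EEProbe software, and the function name here is hypothetical):

```python
import numpy as np

def reject_artifacts(epochs, threshold_uv=70.0):
    """Discard epochs whose peak-to-peak amplitude exceeds the threshold
    (70 microvolts in this study) on any channel.

    epochs : array of shape (n_epochs, n_channels, n_samples), in microvolts.
    Returns the retained epochs and the fraction of epochs rejected.
    """
    ptp = epochs.max(axis=2) - epochs.min(axis=2)  # peak-to-peak per epoch/channel
    bad = (ptp > threshold_uv).any(axis=1)         # reject if any channel exceeds it
    return epochs[~bad], float(bad.mean())
```

Applied per subject, the returned fraction would correspond to the rejection rates reported above (5% on average, ranging from 2% to 7%).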
</sec>
<sec>
<title>Data analysis</title>
<p>ERPs were averaged offline from 200&#x02009;ms before to 800&#x02009;ms after stimulus onset and were low-pass filtered at 50&#x02009;Hz. ERP components were identified and measured, with reference to the average baseline voltage over the interval from &#x02212;100 to 0&#x02009;ms, at the sites and latencies where they reached their maximum amplitude.</p>
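The averaging and baseline-correction step can be sketched as follows (a minimal NumPy illustration under the parameters stated in the text: 512 Hz sampling, epochs starting 200 ms before stimulus onset, baseline from -100 to 0 ms; the function name is hypothetical and this is not the software actually used):

```python
import numpy as np

def average_erp(epochs, sfreq=512, tmin=-0.2, baseline=(-0.1, 0.0)):
    """Average single-trial epochs into an ERP and subtract, per channel,
    the mean voltage of the -100 to 0 ms pre-stimulus baseline.

    epochs : array of shape (n_epochs, n_channels, n_samples); each epoch
    starts at `tmin` seconds relative to stimulus onset.
    """
    erp = epochs.mean(axis=0)                        # average across trials
    times = tmin + np.arange(erp.shape[1]) / sfreq   # sample times in seconds
    mask = (times >= baseline[0]) & (times < baseline[1])
    return erp - erp[:, mask].mean(axis=1, keepdims=True)
```

Component peaks would then be read from the baseline-corrected waveforms within the measurement windows given below.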
<p>Earlier posterior P1 and N1 components were not quantified because the waveforms were virtually identical across stimulus conditions. The mean amplitude of the anterior N2 was measured at the Fp1, Fp2, F1, and F2 sites in the 200&#x02013;300&#x02009;ms time window. The peak amplitude and latency of the centro&#x02013;parietal N400 component were measured at the CCP1h and CCP2h sites during the 350&#x02013;500&#x02009;ms time window. Multiple comparisons of means were performed using <italic>post hoc</italic> Tukey tests. The amplitude of the LP was quantified at occipito&#x02013;parietal sites (PPO1, POz, and PPO2) during the 690&#x02013;720&#x02009;ms time window. ERP data were subjected to multifactorial repeated-measures ANOVAs with one between-subjects factor (participant&#x00027;s sex: male, female) and two within-subjects factors: face gender (same, opposite) and laterality. Laterality had two levels (left, right) for the N400 ANOVA and three levels (left, midline, and right) for the LP ANOVA.</p>
<p>A low resolution electromagnetic tomography (LORETA; Pascual-Marqui et al., <xref ref-type="bibr" rid="B35">1994</xref>) inverse solution was applied to ERP difference waves at various latencies. Specifically, it was applied to the difference wave obtained by subtracting ERPs to same-sex faces from ERPs to opposite-sex faces in the 400&#x02013;500&#x02009;ms time window (N400 range) and to the difference wave obtained by subtracting the ERPs to opposite-sex faces from those to same-sex faces in the 590&#x02013;720&#x02009;ms time window (LP range). The standardized LORETA (sLORETA) method employs statistical parametric maps related to the reliability of the estimated current source density distribution. In this work, we used the swLORETA method (Palmero-Soler et al., <xref ref-type="bibr" rid="B33">2007</xref>), a variation of sLORETA that incorporates a singular value decomposition-based lead field weighting to compensate for the varying sensitivity of the sensors to current sources at different depths. The swLORETA solution was computed using a regular 3D grid of voxels representing the possible sources of the EEG signals. The solution was restricted to the gray and white matter obtained from the segmentation of the Collins 27 MRI produced by the Montreal Neurological Institute (Evans and Collins, <xref ref-type="bibr" rid="B12">1993</xref>). The boundary element model (BEM) was used to solve the forward problem (Geselowitz, <xref ref-type="bibr" rid="B18">1967</xref>); it consisted of one homogeneous compartment composed of 3446 vertices and 6888 triangles. The swLORETA analysis was complemented by equivalent dipole modeling. The electromagnetic dipoles are represented as arrows and indicate the position, orientation, and magnitude of the dipole-modeling solution that was applied to the ERP difference wave during the specific time window. 
Grid positions that exhibited magnitudes greater than their 16 nearest neighbors are shown with an arrow that points in the direction of the equivalent current dipole. The following source space properties were used: grid spacing&#x02009;&#x0003D;&#x02009;5&#x02009;mm; estimated SNR &#x0003D;&#x02009;3.</p>
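The dipole-display rule above (a grid position is shown only if its magnitude exceeds that of its 16 nearest neighbors) can be illustrated with a small sketch; the function name and brute-force neighbor search are illustrative assumptions, not part of the swLORETA implementation:

```python
import numpy as np

def displayed_dipoles(grid_xyz, magnitudes, k=16):
    """Return indices of grid positions whose source magnitude exceeds that
    of all k nearest neighboring grid points (k = 16 in the text); these are
    the positions that would be drawn as equivalent current dipole arrows.

    grid_xyz   : (n_points, 3) array of source-grid coordinates.
    magnitudes : (n_points,) array of estimated source magnitudes.
    """
    peaks = []
    for i, point in enumerate(grid_xyz):
        dist = np.linalg.norm(grid_xyz - point, axis=1)
        neighbors = np.argsort(dist)[1:k + 1]   # skip the point itself
        if magnitudes[i] > magnitudes[neighbors].max():
            peaks.append(i)
    return peaks
```

On the 5-mm grid described above, this picks out local maxima of the inverse solution, i.e., the candidate dipole locations.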
</sec>
</sec>
<sec>
<title>Results</title>
<p>Figure <xref ref-type="fig" rid="F2">2</xref> shows grand-average ERP waveforms recorded in women and men in response to faces of female and male individuals at the centro&#x02013;parietal and occipito&#x02013;parietal sites where the N400 and LP reached their maximum amplitude and showed face-gender effects. Earlier ERP responses (P1 and N1) were indistinguishable across stimulus conditions in all subjects, so these effects were not quantified. The anterior N2 (200&#x02013;300&#x02009;ms) was not significantly affected by face gender. For both men and women (no effect of sex was found), the ANOVA of the mean N400 latency data showed expedited processing of opposite-sex faces (F1, 36&#x02009;&#x0003D;&#x02009;12.518; <italic>p</italic>&#x02009;&#x0003D;&#x02009;0.0011). Participants displayed an earlier N400 in response to faces of the opposite sex (see Table <xref ref-type="table" rid="T1">1</xref> for mean values) relative to faces of the same sex. These results are shown in the grand-average waveforms of Figure <xref ref-type="fig" rid="F3">3</xref> and in the graphs of Figure <xref ref-type="fig" rid="F4">4</xref>, which display the N400 latencies recorded in men (Figure <xref ref-type="fig" rid="F4">4</xref>A) and women (Figure <xref ref-type="fig" rid="F4">4</xref>B) as a function of face gender and recording cerebral hemisphere. The N400 was also earlier over the right (436&#x02009;ms, SE&#x02009;&#x0003D;&#x02009;5.1) centro&#x02013;parietal sites relative to the left (438&#x02009;ms, SE&#x02009;&#x0003D;&#x02009;5.2), as confirmed by ANOVA (F1, 36&#x02009;&#x0003D;&#x02009;4.396; <italic>p</italic>&#x02009;&#x0003D;&#x02009;0.0431). Individual scores of N400 latencies across genders and recording hemispheres are shown in Figures <xref ref-type="fig" rid="F4">4</xref>C,D. The difference between the opposite- and the same-sex condition was computed to determine the advantage of opposite- vs. 
same-sex face processing and displayed on a linear scale as a function of N400 latency. Data were highly consistent across sexes and hemispheres.</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p><bold>ERP waveforms recorded at left, right and midline centro&#x02013;parietal and occipito&#x02013;parietal sites in women (<italic>N</italic>&#x02009;&#x0003D;&#x02009;19) and men (<italic>N</italic>&#x02009;&#x0003D;&#x02009;19) in response to faces of female and male individuals</bold>. Both genders exhibited an enlarged N400 to opposite-sex faces and a larger late positivity (LP) to same-sex faces.</p></caption>
<graphic xlink:href="fpsyg-01-00169-g002.tif"/>
</fig>
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p><bold>Mean values of N400 peak amplitude and latency, as well as LP mean area recorded as a function of face gender, along with standard errors, and confidence intervals</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Face gender</th>
<th align="left">Mean</th>
<th align="left">SE</th>
<th align="left">&#x0002D;95%</th>
<th align="left">&#x0002B;95%</th>
<th align="right">Ss</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" colspan="6"><bold>N400 LATENCY (MS)</bold></td>
</tr>
<tr>
<td align="left">Opposite</td>
<td align="char" char=".">428.066</td>
<td align="left">5.607</td>
<td align="char" char=".">416.695</td>
<td align="char" char=".">439.437</td>
<td align="right">38</td>
</tr>
<tr>
<td align="left">Same</td>
<td align="char" char=".">445.382</td>
<td align="left">5.765</td>
<td align="char" char=".">433.689</td>
<td align="char" char=".">457.074</td>
<td align="right">38</td>
</tr>
<tr>
<td align="left" colspan="6"><bold>N400 AMPLITUDE (&#x003BC;V)</bold></td>
</tr>
<tr>
<td align="left">Opposite</td>
<td align="char" char=".">&#x02212;3.329</td>
<td align="left">0.566</td>
<td align="char" char=".">&#x02212;4.478</td>
<td align="char" char=".">&#x02212;2.181</td>
<td align="right">38</td>
</tr>
<tr>
<td align="left">Same</td>
<td align="char" char=".">&#x02212;2.723</td>
<td align="left">0.578</td>
<td align="char" char=".">&#x02212;3.896</td>
<td align="char" char=".">&#x02212;1.550</td>
<td align="right">38</td>
</tr>
<tr>
<td align="left" colspan="6"><bold>LATE POSITIVITY (AMPLITUDE &#x003BC;V)</bold></td>
</tr>
<tr>
<td align="left">Opposite</td>
<td align="char" char=".">1.179</td>
<td align="left">0.265</td>
<td align="char" char=".">0.641</td>
<td align="char" char=".">1.718</td>
<td align="right">38</td>
</tr>
<tr>
<td align="left">Same</td>
<td align="char" char=".">1.643</td>
<td align="left">0.269</td>
<td align="char" char=".">1.099</td>
<td align="char" char=".">2.188</td>
<td align="right">38</td>
</tr>
</tbody>
</table>
</table-wrap>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p><bold>Grand-average ERP waveforms (<italic>N</italic>&#x02009;&#x0003D;&#x02009;38) recorded at centro&#x02013;parietal and occipito&#x02013;parietal sites (corresponding to the sites included in the N400 and LP ANOVAs) as a function of face gender (opposite vs. same), along with the difference wave obtained by subtracting ERPs to same-sex faces from ERPs to opposite-sex faces</bold>.</p></caption>
<graphic xlink:href="fpsyg-01-00169-g003.tif"/>
</fig>
<fig id="F4" position="float">
<label>Figure 4</label>
<caption><p><bold>(A,B)</bold> N400 peak latency values (in ms) with within-subjects standard errors recorded in men <bold>(A)</bold> and women <bold>(B)</bold> as a function of face gender and cerebral hemisphere. No effect of the sex of the viewers was found, but a significant opposite-sex bias was observed. <bold>(C,D)</bold> The N400 latency difference between the opposite and same conditions for men and women recorded at left and right centro&#x02013;parietal sites. Negative values indicate expedited opposite-sex processing. Results were consistent across subjects and sex groups.</p></caption>
<graphic xlink:href="fpsyg-01-00169-g004.tif"/>
</fig>
<p>The ANOVA on mean peak amplitude values revealed significantly larger N400 amplitudes (F1, 36&#x02009;&#x0003D;&#x02009;4.98; <italic>p</italic>&#x02009;&#x0003D;&#x02009;0.031) in response to faces of the opposite sex relative to faces of the same sex for all subjects (see means reported in Table <xref ref-type="table" rid="T1">1</xref> and Figures <xref ref-type="fig" rid="F5">5</xref>A,B for a detailed analysis of same-/opposite-sex bias in male and female participants). The N400 amplitudes were larger over the left centro&#x02013;parietal area (&#x02212;3.56&#x02009;&#x003BC;V, SE&#x02009;&#x0003D;&#x02009;0.56) relative to the right (&#x02212;2.5&#x02009;&#x003BC;V, SE&#x02009;&#x0003D;&#x02009;0.57), as confirmed by ANOVA (F1, 36&#x02009;&#x0003D;&#x02009;28.5; <italic>p</italic>&#x02009;&#x0003D;&#x02009;0.000005) and represented in the topographic maps in Figure <xref ref-type="fig" rid="F6">6</xref>A. However, hemispheric asymmetry was observed only in the male brain (see Figure <xref ref-type="fig" rid="F5">5</xref>A), as shown by the significant sex of viewers&#x02009;&#x000D7; laterality interaction (F1, 26&#x02009;&#x0003D;&#x02009;32.34; <italic>p</italic>&#x02009;&#x0003D;&#x02009;0.000002) and the corresponding <italic>post hoc</italic> comparisons (Women: RH&#x02009;&#x0003D; &#x02212;3.55, LH&#x02009;&#x0003D; &#x02212;3.48&#x02009;&#x003BC;V, N.S.; Men: RH&#x02009;&#x0003D;&#x02009;&#x02212;1.44, LH&#x02009;&#x0003D;&#x02009;&#x02212;3.63&#x02009;&#x003BC;V, <italic>p</italic>&#x02009;&#x0003C;&#x02009;0.0001). Individual scores of N400 amplitude across genders and recording hemispheres are presented in Figures <xref ref-type="fig" rid="F5">5</xref>C,D. The difference between the opposite- and the same-sex condition was computed to determine the enhanced response elicited by opposite- vs. same-sex faces and displayed on a linear scale as a function of the response amplitude.</p>
<fig id="F5" position="float">
<label>Figure 5</label>
<caption><p><bold>(A,B)</bold> N400 peak amplitude values (in &#x003BC;V) with within-subjects standard errors recorded in men <bold>(A)</bold> and women <bold>(B)</bold> as a function of face gender and cerebral hemisphere. A strong hemispheric asymmetry is visible in men <bold>(A)</bold> but not in women <bold>(B)</bold>. <bold>(C,D)</bold> Scatter plots displaying individual N400 amplitude values (in &#x003BC;V) obtained by subtracting same-sex from opposite-sex responses, recorded in men <bold>(C)</bold> and women <bold>(D)</bold> at left and right centro&#x02013;parietal sites. Negative values indicate enhanced processing of opposite-sex faces.</p></caption>
<graphic xlink:href="fpsyg-01-00169-g005.tif"/>
</fig>
<fig id="F6" position="float">
<label>Figure 6</label>
<caption><p><bold>(A)</bold> Isocontour voltage topographical maps (top, front, and side views) of the face/gender effect obtained by subtracting ERPs associated with same-sex faces from ERPs associated with opposite-sex faces during the 400&#x02013;450&#x02009;ms time window (N400). <bold>(B)</bold> Topographical maps obtained by subtracting ERPs associated with opposite-sex faces from ERPs associated with same-sex faces during the 590&#x02013;720&#x02009;ms time window (LP).</p></caption>
<graphic xlink:href="fpsyg-01-00169-g006.tif"/>
</fig>
<p>To localize the neural generators of the opposite-sex bias in face processing, standardized weighted low-resolution electromagnetic tomography (swLORETA) was applied to the difference wave obtained by subtracting ERPs associated with same-sex faces from ERPs associated with opposite-sex faces within the latency range of 400&#x02013;500&#x02009;ms, which corresponds to the peak of the evoked N400 response. The inverse solution (shown in Figure <xref ref-type="fig" rid="F7">7</xref>; Table <xref ref-type="table" rid="T2">2</xref> lists the active sources for this solution) showed that the processing of opposite-sex faces was associated with a much stronger focus of activity in the bilateral limbic areas (parahippocampal gyri, BA28/35), the left and right uncus (BA38), the cingulate cortex (BA24), and the left and right fusiform gyri of the temporal lobe (BA37, BA20/21); this activity also displayed a strong left hemispheric asymmetry.</p>
<fig id="F7" position="float">
<label>Figure 7</label>
<caption><p><bold>The swLORETA inverse solution applied to the opposite-/same-sex difference wave during the 400&#x02013;500&#x02009;ms time window, which corresponds to the peak of the N400 response</bold>. The electromagnetic dipoles are shown as yellow arrows indicating the position, orientation, and magnitude of the dipole-modeling solution applied to the ERP difference wave in this time window.</p></caption>
<graphic xlink:href="fpsyg-01-00169-g007.tif"/>
</fig>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p><bold>Talairach coordinates corresponding to the intracranial generators explaining the difference-voltage relative to the opposite minus same sex contrast within the 400&#x02013;500&#x02009;ms time window according to swLORETA (ASA); grid spacing&#x02009;&#x0003D;&#x02009;5&#x02009;mm; estimated SNR&#x02009;&#x0003D;&#x02009;3; unit&#x02009;&#x0003D;&#x02009;nAm</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Magn.</th>
<th align="left"><italic>T&#x02009;&#x0002D;&#x02009;x</italic> (mm)</th>
<th align="left"><italic>T&#x02009;&#x0002D;&#x02009;y</italic> (mm)</th>
<th align="left"><italic>T&#x02009;&#x0002D;&#x02009;z</italic> (mm)</th>
<th align="left">HEM</th>
<th align="left">Lobe</th>
<th align="left">Gyrus</th>
<th align="left">BA</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">6.77</td>
<td align="char" char=".">&#x02212;48.5</td>
<td align="char" char=".">&#x02212;33.7</td>
<td align="char" char=".">&#x02212;23.6</td>
<td align="left">L</td>
<td align="left">T</td>
<td align="left">Fusiform</td>
<td align="left">20</td>
</tr>
<tr>
<td align="left">6.74</td>
<td align="char" char=".">&#x02212;48.5</td>
<td align="char" char=".">&#x02212;55</td>
<td align="char" char=".">&#x02212;17.6</td>
<td align="left">L</td>
<td align="left">T</td>
<td align="left">Fusiform</td>
<td align="left">37</td>
</tr>
<tr>
<td align="left">5.57</td>
<td align="char" char=".">50.8</td>
<td align="char" char=".">&#x02212;33.7</td>
<td align="char" char=".">&#x02212;23.6</td>
<td align="left">R</td>
<td align="left">T</td>
<td align="left">Fusiform</td>
<td align="left">20</td>
</tr>
<tr>
<td align="left">5.35</td>
<td align="char" char=".">50.8</td>
<td align="char" char=".">&#x02212;16.1</td>
<td align="char" char=".">&#x02212;22.2</td>
<td align="left">R</td>
<td align="left">T</td>
<td align="left">Fusiform</td>
<td align="left">20</td>
</tr>
<tr>
<td align="left">6.3</td>
<td align="char" char=".">&#x02212;18.5</td>
<td align="char" char=".">&#x02212;8</td>
<td align="char" char=".">&#x02212;28.9</td>
<td align="left">L</td>
<td align="left">Limbic</td>
<td align="left">Uncus</td>
<td align="left">36</td>
</tr>
<tr>
<td align="left">5.70</td>
<td align="char" char=".">21.2</td>
<td align="char" char=".">9.1</td>
<td align="char" char=".">&#x02212;27.5</td>
<td align="left">R</td>
<td align="left">Limbic</td>
<td align="left">Uncus</td>
<td align="left">38</td>
</tr>
<tr>
<td align="left">5.56</td>
<td align="char" char=".">21.2</td>
<td align="char" char=".">&#x02212;24.5</td>
<td align="char" char=".">&#x02212;15.5</td>
<td align="left">R</td>
<td align="left">Limbic</td>
<td align="left">Parahippocampal</td>
<td align="left">35</td>
</tr>
<tr>
<td align="left">6.03</td>
<td align="char" char=".">&#x02212;38.5</td>
<td align="char" char=".">&#x02212;15.3</td>
<td align="char" char=".">&#x02212;29.6</td>
<td align="left">L</td>
<td align="left">T</td>
<td align="left">Inferior temporal</td>
<td align="left">20</td>
</tr>
<tr>
<td align="left">5.30</td>
<td align="char" char=".">50.8</td>
<td align="char" char=".">&#x02212;0.6</td>
<td align="char" char=".">&#x02212;28.2</td>
<td align="left">R</td>
<td align="left">T</td>
<td align="left">Middle temporal</td>
<td align="left">21</td>
</tr>
<tr>
<td align="left">5.19</td>
<td align="char" char=".">&#x02212;58.5</td>
<td align="char" char=".">&#x02212;8.7</td>
<td align="char" char=".">&#x02212;21.5</td>
<td align="left">L</td>
<td align="left">T</td>
<td align="left">Inferior temporal</td>
<td align="left">20</td>
</tr>
<tr>
<td align="left">2.79</td>
<td align="char" char=".">&#x02212;48.5</td>
<td align="char" char=".">33.4</td>
<td align="char" char=".">23.1</td>
<td align="left">L</td>
<td align="left">F</td>
<td align="left">Middle frontal</td>
<td align="left">46</td>
</tr>
<tr>
<td align="left">1.73</td>
<td align="char" char=".">&#x02212;8.5</td>
<td align="char" char=".">12.4</td>
<td align="char" char=".">30.3</td>
<td align="left">L</td>
<td align="left">Limbic</td>
<td align="left">Cingulate</td>
<td align="left">24</td>
</tr>
<tr>
<td align="left">1.57</td>
<td align="char" char=".">40.9</td>
<td align="char" char=".">&#x02212;30.4</td>
<td align="char" char=".">34.9</td>
<td align="left">R</td>
<td align="left">P</td>
<td align="left">Inferior parietal lobule</td>
<td align="left">40</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>The ANOVA on LP mean-amplitude values revealed a significant effect of face gender (F1, 36 &#x0003D;&#x02009;4.24, <italic>p</italic>&#x02009;&#x0003D;&#x02009;0.046), indicating larger LPs in response to faces of the same gender (see Table <xref ref-type="table" rid="T1">1</xref> for mean values and Figure <xref ref-type="fig" rid="F6">6</xref>B for topographical mapping of the face-gender effect) relative to faces of the opposite gender, as shown in the waveforms presented in Figure <xref ref-type="fig" rid="F3">3</xref> and the graphs displayed in Figure <xref ref-type="fig" rid="F8">8</xref>. The face gender &#x000D7;&#x02009;laterality &#x000D7;&#x02009;sex interaction approached significance (<italic>p</italic>&#x02009;&#x0003D;&#x02009;0.07), suggesting a larger coding effect in women (Figure <xref ref-type="fig" rid="F8">8</xref>B) than in men (Figure <xref ref-type="fig" rid="F8">8</xref>A). This tendency can be appreciated by examining individual same&#x02013;opposite difference scores, computed by subtracting LP responses to opposite-sex faces from those to same-sex faces, in men (Figure <xref ref-type="fig" rid="F8">8</xref>C) and women (Figure <xref ref-type="fig" rid="F8">8</xref>D). There was some inter-subject variability in both genders, but overall women tended to show larger differential responses than men, as indicated by the greater frequency of positive values in their scatter plot. SwLORETA analysis of the LP response, obtained by subtracting ERPs to opposite-sex faces from ERPs to same-sex faces in the LP latency range (590&#x02013;720&#x02009;ms), indicated a series of significant generators explaining the surface difference-voltage displayed in Figure <xref ref-type="fig" rid="F9">9</xref>. The processing of same-sex faces was associated with the activation of a neural circuit including both posterior and anterior neural structures (listed in Table <xref ref-type="table" rid="T3">3</xref>), among which the five strongest sources of activity were located in the right hemisphere: the parahippocampal gyrus (BA35), occipital fusiform gyrus (BA37), temporal fusiform gyrus (BA20), middle temporal gyrus (BA21), and superior temporal gyrus (BA38).</p>
<fig id="F8" position="float">
<label>Figure 8</label>
<caption><p><bold>(A,B)</bold> LP mean-amplitude values (in &#x003BC;V) with within-subjects standard errors recorded in men <bold>(A)</bold> and women <bold>(B)</bold> as a function of face gender and recording site. <bold>(C,D)</bold> Scatter plots displaying individual values of the LP component obtained by subtracting opposite-sex from same-sex responses, recorded in men <bold>(C)</bold> and women <bold>(D)</bold> at left and right occipito&#x02013;parietal sites. Positive values indicate larger cerebral responses to same-sex faces.</p></caption>
<graphic xlink:href="fpsyg-01-00169-g008.tif"/>
</fig>
<fig id="F9" position="float">
<label>Figure 9</label>
<caption><p><bold>The swLORETA inverse solution applied to the same-/opposite-sex difference wave during the 590&#x02013;720&#x02009;ms time window, which corresponds to the peak of the late positivity (LP)</bold>. The electromagnetic dipoles are shown as yellow arrows indicating the position, orientation, and magnitude of the dipole-modeling solution applied to the ERP difference wave in this time window.</p></caption>
<graphic xlink:href="fpsyg-01-00169-g009.tif"/>
</fig>
<table-wrap position="float" id="T3">
<label>Table 3</label>
<caption><p><bold>Talairach coordinates corresponding to the intracranial generators explaining the difference-voltage relative to the same minus opposite sex contrast within the 590&#x02013;720&#x02009;ms time window according to swLORETA (ASA); grid spacing&#x02009;&#x0003D;&#x02009;5&#x02009;mm; estimated SNR &#x0003D;&#x02009;3; unit &#x0003D;&#x02009;nAm</bold>.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Magn.</th>
<th align="left"><italic>T&#x02009;&#x0002D;&#x02009;x</italic> (mm)</th>
<th align="left"><italic>T&#x02009;&#x0002D;&#x02009;y</italic> (mm)</th>
<th align="left"><italic>T&#x02009;&#x0002D;&#x02009;z</italic> (mm)</th>
<th align="left">HEM</th>
<th align="left">Lobe</th>
<th align="left">Gyrus</th>
<th align="left">BA</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">5.15</td>
<td align="char" char=".">21.2</td>
<td align="char" char=".">&#x02212;24.5</td>
<td align="char" char=".">&#x02212;15.5</td>
<td align="left">R</td>
<td align="left">Limbic</td>
<td align="left">Parahippocampal</td>
<td align="left">35</td>
</tr>
<tr>
<td align="left">5.14</td>
<td align="char" char=".">40.9</td>
<td align="char" char=".">&#x02212;55</td>
<td align="char" char=".">&#x02212;17.6</td>
<td align="left">R</td>
<td align="left">O</td>
<td align="left">Fusiform</td>
<td align="left">37</td>
</tr>
<tr>
<td align="left">5.14</td>
<td align="char" char=".">50.8</td>
<td align="char" char=".">&#x02212;33.7</td>
<td align="char" char=".">&#x02212;23.6</td>
<td align="left">R</td>
<td align="left">T</td>
<td align="left">Fusiform</td>
<td align="left">20</td>
</tr>
<tr>
<td align="left">4.78</td>
<td align="char" char=".">50.8</td>
<td align="char" char=".">&#x02212;0.6</td>
<td align="char" char=".">&#x02212;28.2</td>
<td align="left">R</td>
<td align="left">T</td>
<td align="left">Middle temporal</td>
<td align="left">21</td>
</tr>
<tr>
<td align="left">4.75</td>
<td align="char" char=".">31</td>
<td align="char" char=".">9.1</td>
<td align="char" char=".">&#x02212;27.5</td>
<td align="left">R</td>
<td align="left">T</td>
<td align="left">Superior temporal</td>
<td align="left">38</td>
</tr>
<tr>
<td align="left">3.48</td>
<td align="char" char=".">50.8</td>
<td align="char" char=".">11.4</td>
<td align="char" char=".">39.2</td>
<td align="left">R</td>
<td align="left">F</td>
<td align="left">Middle frontal</td>
<td align="left">8</td>
</tr>
<tr>
<td align="left">4.25</td>
<td align="char" char=".">&#x02212;18.5</td>
<td align="char" char=".">&#x02212;8</td>
<td align="char" char=".">&#x02212;28.9</td>
<td align="left">L</td>
<td align="left">Limbic</td>
<td align="left">Uncus</td>
<td align="left">36</td>
</tr>
<tr>
<td align="left">4.22</td>
<td align="char" char=".">&#x02212;18.5</td>
<td align="char" char=".">&#x02212;45.8</td>
<td align="char" char=".">&#x02212;9.5</td>
<td align="left">L</td>
<td align="left">Cereb</td>
<td align="left">Fusiform/parahippocampal</td>
<td align="left">19/36</td>
</tr>
<tr>
<td align="left">4.12</td>
<td align="char" char=".">&#x02212;38.5</td>
<td align="char" char=".">&#x02212;55</td>
<td align="char" char=".">&#x02212;17.6</td>
<td align="left">L</td>
<td align="left">T</td>
<td align="left">Fusiform</td>
<td align="left">37</td>
</tr>
<tr>
<td align="left">3.74</td>
<td align="char" char=".">&#x02212;38.5</td>
<td align="char" char=".">43.4</td>
<td align="char" char=".">23.9</td>
<td align="left">L</td>
<td align="left">F</td>
<td align="left">Middle frontal</td>
<td align="left">10</td>
</tr>
<tr>
<td align="left">3.71</td>
<td align="char" char=".">&#x02212;28.5</td>
<td align="char" char=".">56.3</td>
<td align="char" char=".">&#x02212;1.6</td>
<td align="left">L</td>
<td align="left">F</td>
<td align="left">Superior frontal</td>
<td align="left">10</td>
</tr>
<tr>
<td align="left">3.49</td>
<td align="char" char=".">&#x02212;48.5</td>
<td align="char" char=".">8.2</td>
<td align="char" char=".">&#x02212;20</td>
<td align="left">L</td>
<td align="left">T</td>
<td align="left">Superior temporal</td>
<td align="left">38</td>
</tr>
<tr>
<td align="left">3.06</td>
<td align="char" char=".">11.3</td>
<td align="char" char=".">64.4</td>
<td align="char" char=".">16.8</td>
<td align="left">R</td>
<td align="left">F</td>
<td align="left">Superior frontal</td>
<td align="left">10</td>
</tr>
<tr>
<td align="left">3.02</td>
<td align="char" char=".">31</td>
<td align="char" char=".">53.4</td>
<td align="char" char=".">24.8</td>
<td align="left">R</td>
<td align="left">F</td>
<td align="left">Superior frontal</td>
<td align="left">10</td>
</tr>
<tr>
<td align="left">2.98</td>
<td align="char" char=".">&#x02212;8.5</td>
<td align="char" char=".">64.4</td>
<td align="char" char=".">16.8</td>
<td align="left">L</td>
<td align="left">F</td>
<td align="left">Superior frontal</td>
<td align="left">10</td>
</tr>
<tr>
<td align="left">2.67</td>
<td align="char" char=".">1.5</td>
<td align="char" char=".">&#x02212;20.3</td>
<td align="char" char=".">26.8</td>
<td align="left">R</td>
<td align="left">Limbic</td>
<td align="left">Cingulate</td>
<td align="left">23</td>
</tr>
<tr>
<td align="left">2.53</td>
<td align="char" char=".">1.5</td>
<td align="char" char=".">8.5</td>
<td align="char" char=".">65.9</td>
<td align="left">R</td>
<td align="left">F</td>
<td align="left">Superior frontal</td>
<td align="left">6</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec sec-type="discussion">
<title>Discussion</title>
<p>The present data provide evidence for an opposite-sex bias in face processing. Indeed, the inattentive perception of opposite-sex faces (in both genders) was characterized by a larger and earlier centro&#x02013;parietal N400 when compared to same-sex faces. Furthermore, we also observed a greater late positive component (LP) in response to same-sex faces than to opposite-sex faces at occipito&#x02013;parietal sites, suggesting a functional and anatomical dissociation between same-sex and opposite-sex face processing. Indeed, other studies have shown that the identification of opposite-sex faces is performed more quickly than that of same-sex faces, regardless of facial expression (Hofmann et al., <xref ref-type="bibr" rid="B20">2006</xref>). These results are consistent with the evolutionary hypothesis that individuals attend more strongly to opposite-sex faces than to same-sex faces to facilitate the identification of potential mates. To study sex differences in the recognition of human faces with different facial expressions, Hofmann et al. (<xref ref-type="bibr" rid="B20">2006</xref>) taught 65 women and 64 men to associate names with various neutral male and female faces. During the recall phase, the participants were asked to identify the same faces depicting different emotional expressions. They found that women named male faces faster than men did, and men named female faces faster than women did. These results suggest that opposite-sex faces require less processing than same-sex faces, a finding also consistent with this evolutionary perspective. In the present study, the discovery of an opposite-/same-sex effect in the N400 latency range (400&#x02013;450&#x02009;ms) supports the hypothesis that opposite-sex faces are attended to more strongly than same-sex faces.
This result is consistent with the hypothesis that opposite-sex faces are evaluated as potential mates, although this interpretation is rather speculative and needs to be corroborated by further investigation. Indeed, the swLORETA analysis identified a series of source locations in regions devoted to face processing: the left and right fusiform gyri of the temporal lobe, BA37/20 (Kanwisher and Yovel, <xref ref-type="bibr" rid="B23">2006</xref>); regions providing emotional content in face processing, such as the right uncus (BA38) and bilateral limbic areas (parahippocampal gyri), BA28/35 (Vuilleumier et al., <xref ref-type="bibr" rid="B47">2001</xref>); and regions providing emotional valence to affective visual stimuli such as faces, namely the medial and superior frontal gyri, BA10/11 (Dolan et al., <xref ref-type="bibr" rid="B9">1996</xref>; Paradiso et al., <xref ref-type="bibr" rid="B34">1999</xref>). Therefore, on the basis of the source reconstruction data, it can be hypothesized that the processing of opposite-sex faces is more attentive, more effective, or more emotionally valenced than that of same-sex faces, in that it produced stronger electromagnetic signals (deriving from excitatory post-synaptic potentials) in regions devoted to face processing and to providing emotional connotation to sensory information.</p>
<p>The centro&#x02013;parietal N400 is thought to result from incongruence between incoming information and the mental (semantic) representation of words (Kutas and Hillyard, <xref ref-type="bibr" rid="B28">1980</xref>), pictures, and actions (Proverbio and Riva, <xref ref-type="bibr" rid="B39">2009</xref>). Furthermore, the N400 may index difficulty in semantic-integration processes (Brown and Hagoort, <xref ref-type="bibr" rid="B2">1993</xref>) and is elicited in same/different judgment tasks (Simos and Molfese, <xref ref-type="bibr" rid="B43">1997</xref>) by the presence of different or deviant items. Interestingly, Watson et al. (<xref ref-type="bibr" rid="B49">2007</xref>) found that the N400 indexed the extent to which self-relevant information conflicted with an individual&#x00027;s self-concept. In this light, the N400 appears to signal the detection of differences in self-referent information. Similarly, our data suggest that the N400 elicited by viewing unfamiliar, attractive faces differs depending on the gender of the face relative to that of the viewer.</p>
<p>The expedited processing of opposite-sex faces is similar to the other-race effect (ORE) found in several ERP studies (Caldara et al., <xref ref-type="bibr" rid="B4">2004</xref>; Balas and Nelson, <xref ref-type="bibr" rid="B1">2010</xref>). For example, Caldara et al. (<xref ref-type="bibr" rid="B4">2004</xref>) found that, although other-race faces are recognized less accurately than same-race faces, they are classified by race faster. Using ERPs time-locked to face presentation, the authors found a 20&#x02009;ms latency advantage for parietal P3 responses to Asian (other-race) relative to Caucasian (same-race) faces during a race-classification task.</p>
<p>While many studies have shown that same-race faces are recognized more easily than faces of different, unfamiliar races (Byatt and Rhodes, <xref ref-type="bibr" rid="B3">2004</xref>), other studies have shown that other-race faces are classified by race more quickly. An ERP study by Balas and Nelson (<xref ref-type="bibr" rid="B1">2010</xref>) found earlier N170 latencies in response to black vs. white faces (in white participants), suggesting that unfamiliar pigmentation can accelerate configurational analysis in face processing. The ORE and the opposite-sex bias are similar in that both represent neural markers of self/other representation and of the ability to distinguish between self and others, which may be related to self-awareness and body representation (Decety and Sommerville, <xref ref-type="bibr" rid="B8">2003</xref>).</p>
<p>The present study also found a sex-related effect in the LP, which displayed larger modulation in women as a function of face gender (same or opposite). These results are consistent with a similar finding by Sun et al. (<xref ref-type="bibr" rid="B44">2010</xref>), who found increased P300 modulation in a gender-identification task in women. In that study, the data were interpreted to indicate that women conducted a more extensive evaluation process in categorizing male and female faces.</p>
<p>Elevated LP activity has previously been shown in response to attractive faces (Werheid et al., <xref ref-type="bibr" rid="B51">2007</xref>), emotional pictures (Dolcos and Cabeza, <xref ref-type="bibr" rid="B10">2002</xref>), and faces with emotional expressions (Eimer and Holmes, <xref ref-type="bibr" rid="B11">2007</xref>). Because the LP is increased for both unpleasant and pleasant stimuli, it is likely sensitive to arousal rather than emotional valence (Schupp et al., <xref ref-type="bibr" rid="B42">2006</xref>).</p>
<p>In a study measuring the LP response, He et al. (<xref ref-type="bibr" rid="B19">2009</xref>) recorded ERPs in a group of white subjects while they performed a gender-identification task including white, Asian, and black faces. They found an increased late positive complex at approximately 500&#x02009;ms associated with faces of the same race, which they interpreted as indicating extended processing of same-race faces. In contrast to the N400, the LP deflection associated with sex or race might reflect an awareness of similarity rather than of difference. However, because of the uniqueness of these data, further investigation is required to determine whether the N400 and LP responses reflect a form of same/different discrimination at the representational level or whether they are specific to self-representation. Indeed, only two previous studies have investigated the specific effect of face gender on ERP components in both women and men (Oliver-Rodr&#x000ED;guez et al., <xref ref-type="bibr" rid="B32">1999</xref>; Suyama et al., <xref ref-type="bibr" rid="B45">2008</xref>). Although these studies found that men and women differ in their specific responses to male and female faces, they did not explore the opposite-/same-sex bias. In addition, the differences in the timing of sex-dependent modulation between studies might stem from the different experimental paradigms, which included gender discrimination (Suyama et al., <xref ref-type="bibr" rid="B45">2008</xref>), attractiveness rating (Oliver-Rodr&#x000ED;guez et al., <xref ref-type="bibr" rid="B32">1999</xref>), and passive viewing (the present study).</p>
<p>It should be noted that, although N400 responses to opposite-sex faces involved left hemispheric regions to a greater extent than right hemispheric regions, the LP to same-sex faces was strongly lateralized to the right hemisphere, as indicated by swLORETA source reconstruction. These results suggest a right asymmetry in the activation of the parahippocampal gyrus, the fusiform gyrus, the middle and superior temporal gyri, and the middle frontal gyrus. This pattern of results is consistent with many studies suggesting a hemispheric asymmetry in the processing of self vs. other faces. For example, in a study in which a group of patients underwent the intracarotid amobarbital (Wada) test (Keenan et al., <xref ref-type="bibr" rid="B25">2001</xref>), the right hemisphere was shown to be preferentially involved in self-face recognition and the left hemisphere in other-face recognition. In an intriguing study of split-brain patient M.L. (Keenan et al., <xref ref-type="bibr" rid="B26">2003</xref>), who had undergone a total callosotomy, it was found that, when searching for his own face in a series of morphs (composite images made up of his own face and a famous face), the patient&#x00027;s performance was better when he responded with the right hemisphere (i.e., indicated with the left hand). These data suggest that the right hemisphere is preferentially suited for self-face processing, a hypothesis that is also supported by analogous behavioral data (Keenan et al., <xref ref-type="bibr" rid="B24">2000</xref>).</p>
<p>Consistent with this literature, the inverse solution performed in the present study on LP activity in response to same-sex faces demonstrated a strong right hemispheric asymmetry in the activation of posterior brain regions and of the right middle frontal gyrus (BA8). The current literature suggests that the right temporomesial and temporolateral cortices, along with the right posterior cingulate areas, right insula, and right prefrontal areas (Fink et al., <xref ref-type="bibr" rid="B15">1996</xref>), are involved in self-representation (Craik et al., <xref ref-type="bibr" rid="B7">1999</xref>). In addition, there is neurological evidence that right hemispheric lesions can lead to self-representation disorders, lack of body awareness, and deficits in the ability to distinguish between self and others (Decety and Sommerville, <xref ref-type="bibr" rid="B8">2003</xref>). These results are particularly pertinent to our comparison of the processing of same-sex and opposite-sex faces.</p>
<p>In conclusion, the present study found an opposite-/same-sex bias similar to the ORE (Caldara et al., <xref ref-type="bibr" rid="B4">2004</xref>) and identified specific neural markers of sex bias in face processing, including a larger and earlier N400 associated with opposite-sex faces and a larger LP associated with same-sex faces. Both the ERP patterns and our analysis of intra-cortical neural generators (swLORETA) indicated the activation of brain areas related to face processing and emotion, supporting the hypothesis that both genders process opposite-sex faces earlier and more effectively than same-sex faces.</p>
</sec>
<sec>
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<ack>
<p>We are grateful to Roberta Adorni, Nicola Crotti, and Mirella Manfredi for their technical support. The study was funded by the Department of Psychology, by the University of Milano-Bicocca FAR 2008 and by IBFM-CNR grants.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Balas</surname> <given-names>B.</given-names></name> <name><surname>Nelson</surname> <given-names>C. A.</given-names></name></person-group> (<year>2010</year>). <article-title>The role of face shape and pigmentation in other-race face perception: an electrophysiological study</article-title>. <source>Neuropsychologia</source> <volume>48</volume>, <fpage>498</fpage>&#x02013;<lpage>506</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2009.10.007</pub-id><pub-id pub-id-type="pmid">19836406</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brown</surname> <given-names>C.</given-names></name> <name><surname>Hagoort</surname> <given-names>P.</given-names></name></person-group> (<year>1993</year>). <article-title>The processing nature of the N400: evidence from masked priming</article-title>. <source>J. Cogn. Neurosci.</source> <volume>5</volume>, <fpage>34</fpage>&#x02013;<lpage>44</lpage>.<pub-id pub-id-type="doi">10.1162/jocn.1993.5.1.34</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Byatt</surname> <given-names>G.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>2004</year>). <article-title>Identification of own-race and other-race faces: implications for the representation of race in face space</article-title>. <source>Psychon. Bull. Rev.</source> <volume>11</volume>, <fpage>735</fpage>&#x02013;<lpage>741</lpage>.<pub-id pub-id-type="pmid">15581126</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Caldara</surname> <given-names>R.</given-names></name> <name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Bovet</surname> <given-names>P.</given-names></name> <name><surname>Hauert</surname> <given-names>C. A.</given-names></name></person-group> (<year>2004</year>). <article-title>Event-related potentials and time course of the &#x0201C;other-race&#x0201D; face classification advantage</article-title>. <source>Neuroreport</source> <volume>9</volume>, <fpage>905</fpage>&#x02013;<lpage>910</lpage>.<pub-id pub-id-type="doi">10.1097/00001756-200404090-00034</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Conway</surname> <given-names>C.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>DeBruine</surname> <given-names>L. M.</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name></person-group> (<year>2008</year>). <article-title>Evidence for adaptive design in human gaze preference</article-title>. <source>Proc. Biol. Sci.</source> <volume>7</volume>, <fpage>63</fpage>&#x02013;<lpage>69</lpage>.<pub-id pub-id-type="doi">10.1098/rspb.2007.1073</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cornwell</surname> <given-names>R. E.</given-names></name> <name><surname>Boothroyd</surname> <given-names>L.</given-names></name> <name><surname>Burt</surname> <given-names>D. M.</given-names></name> <name><surname>Feinberg</surname> <given-names>D. R.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name> <name><surname>Pitman</surname> <given-names>R.</given-names></name> <name><surname>Whiten</surname> <given-names>S.</given-names></name> <name><surname>Perrett</surname> <given-names>D. I.</given-names></name></person-group> (<year>2004</year>). <article-title>Concordant preferences for opposite-sex signals? Human pheromones and facial characteristics</article-title>. <source>Proc. Biol. Sci.</source> <volume>22</volume>, <fpage>635</fpage>&#x02013;<lpage>640</lpage>.<pub-id pub-id-type="doi">10.1098/rspb.2003.2649</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Craik</surname> <given-names>I. M.</given-names></name> <name><surname>Moroz</surname> <given-names>T.</given-names></name> <name><surname>Moscovitch</surname> <given-names>M.</given-names></name> <name><surname>Stuss</surname> <given-names>D. T.</given-names></name> <name><surname>Wincour</surname> <given-names>G.</given-names></name> <name><surname>Tulving</surname> <given-names>E.</given-names></name> <name><surname>Kapur</surname> <given-names>S.</given-names></name></person-group> (<year>1999</year>). <article-title>In search of the self: a positron emission tomography study</article-title>. <source>Psychol. Sci.</source> <volume>10</volume>, <fpage>26</fpage>&#x02013;<lpage>34</lpage>.<pub-id pub-id-type="doi">10.1111/1467-9280.00102</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Decety</surname> <given-names>J.</given-names></name> <name><surname>Sommerville</surname> <given-names>J. A.</given-names></name></person-group> (<year>2003</year>). <article-title>Shared representations between self and other: a social cognitive neuroscience view</article-title>. <source>Trends Cogn. Sci.</source> <volume>7</volume>, <fpage>527</fpage>&#x02013;<lpage>533</lpage>.<pub-id pub-id-type="doi">10.1016/j.tics.2003.10.004</pub-id><pub-id pub-id-type="pmid">14643368</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dolan</surname> <given-names>R. J.</given-names></name> <name><surname>Fletcher</surname> <given-names>P.</given-names></name> <name><surname>Morris</surname> <given-names>J.</given-names></name> <name><surname>Kapur</surname> <given-names>N.</given-names></name> <name><surname>Deakin</surname> <given-names>J. F. W.</given-names></name> <name><surname>Frith</surname> <given-names>C. D.</given-names></name></person-group> (<year>1996</year>). <article-title>Neural activation during covert processing of positive emotional facial expressions</article-title>. <source>NeuroImage</source> <volume>4</volume>, <fpage>194</fpage>&#x02013;<lpage>200</lpage>.<pub-id pub-id-type="doi">10.1006/nimg.1996.0070</pub-id><pub-id pub-id-type="pmid">9345509</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dolcos</surname> <given-names>F.</given-names></name> <name><surname>Cabeza</surname> <given-names>R.</given-names></name></person-group> (<year>2002</year>). <article-title>Event-related potentials of emotional memory: encoding pleasant, unpleasant, and neutral pictures</article-title>. <source>Cogn. Affect. Behav. Neurosci.</source> <volume>2</volume>, <fpage>252</fpage>&#x02013;<lpage>263</lpage>.<pub-id pub-id-type="doi">10.3758/CABN.2.3.252</pub-id><pub-id pub-id-type="pmid">12775189</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name> <name><surname>Holmes</surname> <given-names>A.</given-names></name></person-group> (<year>2007</year>). <article-title>Event-related brain potential correlates of emotional face processing</article-title>. <source>Neuropsychologia</source> <volume>45</volume>, <fpage>15</fpage>&#x02013;<lpage>31</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2006.04.022</pub-id><pub-id pub-id-type="pmid">16797614</pub-id></citation></ref>
<ref id="B12"><citation citation-type="confproc"><person-group person-group-type="author"><name><surname>Evans</surname> <given-names>A. C.</given-names></name> <name><surname>Collins</surname> <given-names>D. L.</given-names></name></person-group> (<year>1993</year>). <article-title>&#x0201C;3D statistical neuroanatomical models from 305 MRI volumes,&#x0201D;</article-title> in <source>Proceedings of IEEE-Nuclear Science Symposium and Medical Imaging Conference</source>, <fpage>1813</fpage>&#x02013;<lpage>1817</lpage>.</citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Feinberg</surname> <given-names>D. R.</given-names></name> <name><surname>DeBruine</surname> <given-names>L. M.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name></person-group> (<year>2008</year>). <article-title>Correlated preferences for men&#x00027;s facial and vocal masculinity</article-title>. <source>Evol. Hum. Behav.</source> <volume>29</volume>, <fpage>233</fpage>&#x02013;<lpage>241</lpage>.<pub-id pub-id-type="doi">10.1016/j.evolhumbehav.2007.12.008</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Feinberg</surname> <given-names>D. R.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name> <name><surname>Burt</surname> <given-names>D. M.</given-names></name> <name><surname>Perrett</surname> <given-names>D. I.</given-names></name></person-group> (<year>2005</year>). <article-title>Manipulations of fundamental and formant frequencies influence the attractiveness of human male voices</article-title>. <source>Anim. Behav.</source> <volume>69</volume>, <fpage>561</fpage>&#x02013;<lpage>568</lpage>.<pub-id pub-id-type="doi">10.1016/j.anbehav.2004.06.012</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fink</surname> <given-names>G.</given-names></name> <name><surname>Markowitsch</surname> <given-names>H.</given-names></name> <name><surname>Reinkemeier</surname> <given-names>M.</given-names></name> <name><surname>Bruckbauer</surname> <given-names>T.</given-names></name> <name><surname>Kessler</surname> <given-names>J.</given-names></name> <name><surname>Heiss</surname> <given-names>W. D.</given-names></name></person-group> (<year>1996</year>). <article-title>Cerebral representation of one&#x00027;s own past: neural networks involved in autobiographical memory</article-title>. <source>J. Neurosci.</source> <volume>16</volume>, <fpage>4275</fpage>&#x02013;<lpage>4282</lpage>.<pub-id pub-id-type="pmid">8753888</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fischer</surname> <given-names>H.</given-names></name> <name><surname>Fransson</surname> <given-names>P.</given-names></name> <name><surname>Wright</surname> <given-names>C. I.</given-names></name> <name><surname>B&#x000E4;ckman</surname> <given-names>L.</given-names></name></person-group> (<year>2004a</year>). <article-title>Enhanced occipital and anterior cingulate activation in men but not in women during exposure to angry and fearful male faces</article-title>. <source>Cogn. Affect. Behav. Neurosci.</source> <volume>4</volume>, <fpage>326</fpage>&#x02013;<lpage>334</lpage>.<pub-id pub-id-type="doi">10.3758/CABN.4.3.326</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fischer</surname> <given-names>H.</given-names></name> <name><surname>Sandblom</surname> <given-names>J.</given-names></name> <name><surname>Herlitz</surname> <given-names>A.</given-names></name> <name><surname>Fransson</surname> <given-names>P.</given-names></name> <name><surname>Wright</surname> <given-names>C. I.</given-names></name> <name><surname>B&#x000E4;ckman</surname> <given-names>L.</given-names></name></person-group> (<year>2004b</year>). <article-title>Sex-differential brain activation during exposure to female and male faces</article-title>. <source>Neuroreport</source> <volume>9</volume>, <fpage>235</fpage>&#x02013;<lpage>238</lpage>.<pub-id pub-id-type="doi">10.1097/00001756-200402090-00004</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Geselowitz</surname> <given-names>D. B.</given-names></name></person-group> (<year>1967</year>). <article-title>On bioelectric potentials in an inhomogeneous volume conductor</article-title>. <source>Biophys. J.</source> <volume>7</volume>, <fpage>1</fpage>&#x02013;<lpage>11</lpage>.<pub-id pub-id-type="doi">10.1016/S0006-3495(67)86571-8</pub-id><pub-id pub-id-type="pmid">19210978</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>He</surname> <given-names>Y.</given-names></name> <name><surname>Johnson</surname> <given-names>M. K.</given-names></name> <name><surname>Dovidio</surname> <given-names>J. F.</given-names></name> <name><surname>McCarthy</surname> <given-names>G.</given-names></name></person-group> (<year>2009</year>). <article-title>The relation between race-related implicit associations and scalp-recorded neural activity evoked by faces from different races</article-title>. <source>Soc Neurosci.</source> <volume>4</volume>, <fpage>426</fpage>&#x02013;<lpage>442</lpage>.<pub-id pub-id-type="doi">10.1080/17470910902949184</pub-id><pub-id pub-id-type="pmid">19562628</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hofmann</surname> <given-names>S. G.</given-names></name> <name><surname>Suvak</surname> <given-names>M.</given-names></name> <name><surname>Litz</surname> <given-names>B. T.</given-names></name></person-group> (<year>2006</year>). <article-title>Sex differences in face recognition and influence of facial affect</article-title>. <source>Pers. Indiv. Differ.</source> <volume>40</volume>, <fpage>1683</fpage>&#x02013;<lpage>1690</lpage>.<pub-id pub-id-type="doi">10.1016/j.paid.2005.12.014</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>Feinberg</surname> <given-names>D. R.</given-names></name> <name><surname>DeBruine</surname> <given-names>L. M</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name> <name><surname>Vukovic</surname> <given-names>J.</given-names></name></person-group> (<year>2010</year>). <article-title>A domain-specific opposite-sex bias in human preferences for manipulated voice pitch</article-title>. <source>Anim. Behav.</source> <volume>79</volume>, <fpage>57</fpage>&#x02013;<lpage>62</lpage>.<pub-id pub-id-type="doi">10.1016/j.anbehav.2009.10.003</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kanwisher</surname> <given-names>N.</given-names></name> <name><surname>McDermott</surname> <given-names>J.</given-names></name> <name><surname>Chun</surname> <given-names>M. M.</given-names></name></person-group> (<year>1997</year>). <article-title>The fusiform face area: a module in human extrastriate cortex specialized for face perception</article-title>. <source>J. Neurosci.</source> <volume>17</volume>, <fpage>4302</fpage>&#x02013;<lpage>4311</lpage>.<pub-id pub-id-type="pmid">9151747</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kanwisher</surname> <given-names>N.</given-names></name> <name><surname>Yovel</surname> <given-names>G.</given-names></name></person-group> (<year>2006</year>). <article-title>The fusiform face area: a cortical region specialized for the perception of faces</article-title>. <source>Philos. Trans. R Soc. Lond. B Biol. Sci.</source> <volume>361</volume>, <fpage>2109</fpage>&#x02013;<lpage>2128</lpage>.<pub-id pub-id-type="doi">10.1098/rstb.2006.1934</pub-id><pub-id pub-id-type="pmid">17118927</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Keenan</surname> <given-names>J. P.</given-names></name> <name><surname>Freund</surname> <given-names>S.</given-names></name> <name><surname>Hamilton</surname> <given-names>R. H.</given-names></name> <name><surname>Ganis</surname> <given-names>G.</given-names></name> <name><surname>Pascual-Leone</surname> <given-names>A.</given-names></name></person-group> (<year>2000</year>). <article-title>Hand response differences in a self-face identification task</article-title>. <source>Neuropsychologia</source> <volume>38</volume>, <fpage>1047</fpage>&#x02013;<lpage>1053</lpage>.<pub-id pub-id-type="doi">10.1016/S0028-3932(99)00145-1</pub-id><pub-id pub-id-type="pmid">10775715</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Keenan</surname> <given-names>J. P.</given-names></name> <name><surname>Nelson</surname> <given-names>A.</given-names></name> <name><surname>O&#x00027;Connor</surname> <given-names>M.</given-names></name> <name><surname>Pascual-Leone</surname> <given-names>A.</given-names></name></person-group> (<year>2001</year>). <article-title>Neurology: self-recognition and the right hemisphere</article-title>. <source>Nature</source> <volume>409</volume>, <fpage>305</fpage>.<pub-id pub-id-type="doi">10.1038/35053167</pub-id><pub-id pub-id-type="pmid">11201730</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Keenan</surname> <given-names>J. P.</given-names></name> <name><surname>Wheeler</surname> <given-names>M.</given-names></name> <name><surname>Steven</surname> <given-names>M.</given-names></name> <name><surname>Platek</surname> <given-names>S. M.</given-names></name> <name><surname>Lardi</surname> <given-names>G.</given-names></name> <name><surname>Lassonde</surname> <given-names>M.</given-names></name></person-group> (<year>2003</year>). <article-title>Self-face processing in a callosotomy patient</article-title>. <source>Eur. J. Neurosci.</source> <volume>18</volume>, <fpage>2391</fpage>&#x02013;<lpage>2395</lpage>.<pub-id pub-id-type="doi">10.1046/j.1460-9568.2003.02958.x</pub-id><pub-id pub-id-type="pmid">14622201</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kranz</surname> <given-names>F.</given-names></name> <name><surname>Ishai</surname> <given-names>A.</given-names></name></person-group> (<year>2006</year>). <article-title>Face perception is modulated by sexual preference</article-title>. <source>Curr. Biol.</source> <volume>16</volume>, <fpage>63</fpage>&#x02013;<lpage>68</lpage>.<pub-id pub-id-type="doi">10.1016/j.cub.2005.10.070</pub-id><pub-id pub-id-type="pmid">16401423</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kutas</surname> <given-names>M.</given-names></name> <name><surname>Hillyard</surname> <given-names>S. A.</given-names></name></person-group> (<year>1980</year>). <article-title>Reading senseless sentences: brain potentials reflect semantic incongruity</article-title>. <source>Science</source> <volume>207</volume>, <fpage>203</fpage>&#x02013;<lpage>205</lpage>.<pub-id pub-id-type="doi">10.1126/science.7350657</pub-id><pub-id pub-id-type="pmid">7350657</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Langeslag</surname> <given-names>S. J. E.</given-names></name> <name><surname>Franken</surname> <given-names>I. H. A.</given-names></name> <name><surname>Van Strien</surname> <given-names>J. W.</given-names></name></person-group> (<year>2008</year>). <article-title>Dissociating love-related attention from task-related attention: an event-related potential oddball study</article-title>. <source>Neurosci. Lett.</source> <volume>431</volume>, <fpage>236</fpage>.<pub-id pub-id-type="doi">10.1016/j.neulet.2007.11.044</pub-id><pub-id pub-id-type="pmid">18162320</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Langeslag</surname> <given-names>S. J. E.</given-names></name> <name><surname>Jansma</surname> <given-names>B. M.</given-names></name> <name><surname>Franken</surname> <given-names>I. H. A.</given-names></name> <name><surname>Van Strien</surname> <given-names>J. W.</given-names></name></person-group> (<year>2007</year>). <article-title>Event-related potential responses to love-related facial stimuli</article-title>. <source>Biol. Psychol.</source> <volume>76</volume>, <fpage>109</fpage>&#x02013;<lpage>115</lpage>.<pub-id pub-id-type="doi">10.1016/j.biopsycho.2007.06.007</pub-id><pub-id pub-id-type="pmid">17681417</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Little</surname> <given-names>A. C.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>Penton-Voak</surname> <given-names>I. S.</given-names></name> <name><surname>Burt</surname> <given-names>D. M.</given-names></name> <name><surname>Perrett</surname> <given-names>D. I.</given-names></name></person-group> (<year>2002</year>). <article-title>Partnership status and the temporal context of relationships influence human female preferences for sexual dimorphism in male face shape</article-title>. <source>Proc. Roy. Soc. Lond. B: Biol. Sci.</source> <volume>269</volume>, <fpage>1095</fpage>&#x02013;<lpage>1100</lpage>.<pub-id pub-id-type="doi">10.1098/rspb.2002.1984</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Oliver-Rodr&#x000ED;guez</surname> <given-names>J. C.</given-names></name> <name><surname>Guan</surname> <given-names>Z.</given-names></name> <name><surname>Johnston</surname> <given-names>V. S.</given-names></name></person-group> (<year>1999</year>). <article-title>Gender differences in late positive components evoked by human faces</article-title>. <source>Psychophysiology</source> <volume>32</volume>, <fpage>176</fpage>&#x02013;<lpage>185</lpage>.</citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Palmero-Soler</surname> <given-names>E.</given-names></name> <name><surname>Dolan</surname> <given-names>K.</given-names></name> <name><surname>Hadamschek</surname> <given-names>V.</given-names></name> <name><surname>Tass</surname> <given-names>P. A.</given-names></name></person-group> (<year>2007</year>). <article-title>swLORETA: a novel approach to robust source localization and synchronization tomography</article-title>. <source>Phys. Med. Biol.</source> <volume>52</volume>, <fpage>1783</fpage>&#x02013;<lpage>1800</lpage>.<pub-id pub-id-type="doi">10.1088/0031-9155/52/7/002</pub-id><pub-id pub-id-type="pmid">17374911</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Paradiso</surname> <given-names>S.</given-names></name> <name><surname>Johnson</surname> <given-names>D. L.</given-names></name> <name><surname>Andreasen</surname> <given-names>N. C.</given-names></name> <name><surname>O&#x00027;Leary</surname> <given-names>D. S.</given-names></name> <name><surname>Watkins</surname> <given-names>G. L.</given-names></name> <name><surname>Boles Ponto</surname> <given-names>L. L.</given-names></name> <name><surname>Hichwa</surname> <given-names>R. D.</given-names></name></person-group> (<year>1999</year>). <article-title>Cerebral blood flow changes associated with attribution of emotional valence to pleasant, unpleasant, and neutral visual stimuli in a PET study of normal subjects</article-title>. <source>Am. J. Psychiatry</source> <volume>156</volume>, <fpage>1618</fpage>&#x02013;<lpage>1629</lpage>.<pub-id pub-id-type="pmid">10518175</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pasqual-Marqui</surname> <given-names>R. D.</given-names></name> <name><surname>Michel</surname> <given-names>C. M.</given-names></name> <name><surname>Lehmann</surname> <given-names>D.</given-names></name></person-group> (<year>1994</year>). <article-title>Low resolution electromagnetic tomography: a new method for localizing electrical activity in the brain</article-title>. <source>Int. J. Psychophysiol.</source> <volume>18</volume>, <fpage>49</fpage>&#x02013;<lpage>65</lpage>.<pub-id pub-id-type="doi">10.1016/0167-8760(84)90014-X</pub-id><pub-id pub-id-type="pmid">7876038</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Penton-Voak</surname> <given-names>I. S.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name> <name><surname>Baker</surname> <given-names>S.</given-names></name> <name><surname>Tiddeman</surname> <given-names>B. P.</given-names></name> <name><surname>Burt</surname> <given-names>D. M.</given-names></name> <name><surname>Perrett</surname> <given-names>D. I.</given-names></name></person-group> (<year>2001</year>). <article-title>Symmetry and sexual dimorphism in facial proportions and male facial attractiveness</article-title>. <source>Proc. R. Soc. B.</source> <volume>268</volume>, <fpage>1617</fpage>&#x02013;<lpage>1623</lpage>.<pub-id pub-id-type="doi">10.1098/rspb.2001.1703</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perrett</surname> <given-names>D. I.</given-names></name> <name><surname>Lee</surname> <given-names>K. J.</given-names></name> <name><surname>Penton-Voak</surname> <given-names>I.</given-names></name> <name><surname>Rowland</surname> <given-names>D.</given-names></name> <name><surname>Yoshikawa</surname> <given-names>S.</given-names></name> <name><surname>Burt</surname> <given-names>D. M.</given-names></name> <name><surname>Henzi</surname> <given-names>S. P.</given-names></name> <name><surname>Castles</surname> <given-names>D. L.</given-names></name> <name><surname>Akamatsu</surname> <given-names>S.</given-names></name></person-group> (<year>1998</year>). <article-title>Effects of sexual dimorphism on facial attractiveness</article-title>. <source>Nature</source> <volume>394</volume>, <fpage>884</fpage>&#x02013;<lpage>887</lpage>.<pub-id pub-id-type="doi">10.1038/29772</pub-id><pub-id pub-id-type="pmid">9732869</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Proverbio</surname> <given-names>A. M.</given-names></name> <name><surname>Brignone</surname> <given-names>V.</given-names></name> <name><surname>Matarazzo</surname> <given-names>S.</given-names></name> <name><surname>Del Zotto</surname> <given-names>M.</given-names></name> <name><surname>Zani</surname> <given-names>A.</given-names></name></person-group> (<year>2006</year>). <article-title>Gender differences in hemispheric asymmetry for face processing</article-title>. <source>BMC Neurosci.</source> <volume>8</volume>, <fpage>44</fpage>.<pub-id pub-id-type="doi">10.1186/1471-2202-7-44</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Proverbio</surname> <given-names>A. M.</given-names></name> <name><surname>Riva</surname> <given-names>F.</given-names></name></person-group> (<year>2009</year>). <article-title>RP and N400 ERP components reflect semantic violations in visual processing of human actions</article-title>. <source>Neurosci. Lett.</source> <volume>459</volume>, <fpage>142</fpage>&#x02013;<lpage>146</lpage>.<pub-id pub-id-type="doi">10.1016/j.neulet.2009.05.012</pub-id><pub-id pub-id-type="pmid">19427368</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Proverbio</surname> <given-names>A. M.</given-names></name> <name><surname>Zani</surname> <given-names>A.</given-names></name> <name><surname>Adorni</surname> <given-names>R.</given-names></name></person-group> (<year>2008</year>). <article-title>Neural markers of a greater female responsiveness to social stimuli</article-title>. <source>BMC Neurosci.</source> <volume>30</volume>, <fpage>56</fpage>.<pub-id pub-id-type="doi">10.1186/1471-2202-9-56</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>2006</year>). <article-title>The evolutionary psychology of facial beauty</article-title>. <source>Annu. Rev. Psychol.</source> <volume>57</volume>, <fpage>199</fpage>&#x02013;<lpage>226</lpage>.<pub-id pub-id-type="doi">10.1146/annurev.psych.57.102904.190208</pub-id><pub-id pub-id-type="pmid">16318594</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schupp</surname> <given-names>H. T.</given-names></name> <name><surname>Flaisch</surname> <given-names>T.</given-names></name> <name><surname>Stockburger</surname> <given-names>J.</given-names></name> <name><surname>Jungh&#x000F6;fer</surname> <given-names>M.</given-names></name> <name><surname>Anders</surname> <given-names>S.</given-names></name></person-group> (<year>2006</year>). <article-title>Emotion and attention: event-related brain potential studies</article-title>. <source>Prog. Brain Res.</source> <volume>156</volume>, <fpage>31</fpage>&#x02013;<lpage>51</lpage>.<pub-id pub-id-type="doi">10.1016/S0079-6123(06)56002-9</pub-id><pub-id pub-id-type="pmid">17015073</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Simos</surname> <given-names>P. G.</given-names></name> <name><surname>Molfese</surname> <given-names>D. L.</given-names></name></person-group> (<year>1997</year>). <article-title>Event-related potentials in a two-choice task involving within-form comparisons of pictures and words</article-title>. <source>Int. J. Neurosci.</source> <volume>90</volume>, <fpage>233</fpage>&#x02013;<lpage>253</lpage>.<pub-id pub-id-type="doi">10.3109/00207459709000641</pub-id><pub-id pub-id-type="pmid">9352430</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sun</surname> <given-names>Y.</given-names></name> <name><surname>Gao</surname> <given-names>X.</given-names></name> <name><surname>Han</surname> <given-names>S.</given-names></name></person-group> (<year>2010</year>). <article-title>Sex differences in face gender recognition: an event-related potential study</article-title>. <source>Brain Res.</source> <volume>1327</volume>, <fpage>69</fpage>&#x02013;<lpage>76</lpage>.<pub-id pub-id-type="doi">10.1016/j.brainres.2010.02.013</pub-id><pub-id pub-id-type="pmid">20153301</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Suyama</surname> <given-names>N.</given-names></name> <name><surname>Hoshiyama</surname> <given-names>M.</given-names></name> <name><surname>Shimizu</surname> <given-names>H.</given-names></name> <name><surname>Saito</surname> <given-names>H.</given-names></name></person-group> (<year>2008</year>). <article-title>Event-related potentials for gender discrimination: an examination between differences in gender discrimination between males and females</article-title>. <source>Int. J. Neurosci.</source> <volume>118</volume>, <fpage>1227</fpage>&#x02013;<lpage>1237</lpage>.<pub-id pub-id-type="doi">10.1080/00207450601047176</pub-id><pub-id pub-id-type="pmid">18698506</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Turk</surname> <given-names>D. J.</given-names></name> <name><surname>Banfield</surname> <given-names>J. F.</given-names></name> <name><surname>Walling</surname> <given-names>B. R.</given-names></name> <name><surname>Heatherton</surname> <given-names>T. F.</given-names></name> <name><surname>Grafton</surname> <given-names>S. T.</given-names></name> <name><surname>Handy</surname> <given-names>T. C.</given-names></name> <name><surname>Gazzaniga</surname> <given-names>M. S.</given-names></name> <name><surname>Macrae</surname> <given-names>C. N.</given-names></name></person-group> (<year>2004</year>). <article-title>From facial cue to dinner for two: the neural substrates of personal choice</article-title>. <source>NeuroImage</source> <volume>22</volume>, <fpage>1281</fpage>&#x02013;<lpage>1290</lpage>.<pub-id pub-id-type="doi">10.1016/j.neuroimage.2004.02.037</pub-id><pub-id pub-id-type="pmid">15219600</pub-id></citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vuilleumier</surname> <given-names>P.</given-names></name> <name><surname>Armony</surname> <given-names>J. L.</given-names></name> <name><surname>Driver</surname> <given-names>J.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name></person-group> (<year>2001</year>). <article-title>Effects of attention and emotion on face processing in the human brain: an event-related fMRI study</article-title>. <source>Neuron</source> <volume>30</volume>, <fpage>829</fpage>&#x02013;<lpage>841</lpage>.<pub-id pub-id-type="doi">10.1016/S0896-6273(01)00328-2</pub-id><pub-id pub-id-type="pmid">11430815</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vukovic</surname> <given-names>J.</given-names></name> <name><surname>Feinberg</surname> <given-names>D. R.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>DeBruine</surname> <given-names>L. M.</given-names></name> <name><surname>Welling</surname> <given-names>L. L. M.</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name> <name><surname>Smith</surname> <given-names>F. G.</given-names></name></person-group> (<year>2008</year>). <article-title>Self-rated attractiveness predicts individual differences in women&#x00027;s preferences for masculine men&#x00027;s voices</article-title>. <source>Pers. Individ. Dif.</source> <volume>45</volume>, <fpage>451</fpage>&#x02013;<lpage>456</lpage>.<pub-id pub-id-type="doi">10.1016/j.paid.2008.05.013</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Watson</surname> <given-names>L. A.</given-names></name> <name><surname>Dritschel</surname> <given-names>B.</given-names></name> <name><surname>Obonsawin</surname> <given-names>M. C.</given-names></name> <name><surname>Jentzsch</surname> <given-names>I.</given-names></name></person-group> (<year>2007</year>). <article-title>Seeing yourself in a positive light: brain correlates of the self-positivity bias</article-title>. <source>Brain Res.</source> <volume>1152</volume>, <fpage>106</fpage>&#x02013;<lpage>110</lpage>.<pub-id pub-id-type="doi">10.1016/j.brainres.2007.03.049</pub-id><pub-id pub-id-type="pmid">17462610</pub-id></citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Welling</surname> <given-names>L. L. M.</given-names></name> <name><surname>Jones</surname> <given-names>B. C.</given-names></name> <name><surname>DeBruine</surname> <given-names>L. M.</given-names></name> <name><surname>Smith</surname> <given-names>F. G.</given-names></name> <name><surname>Feinberg</surname> <given-names>D. R.</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name> <name><surname>Al-Dujaili</surname> <given-names>E. A. S.</given-names></name></person-group> (<year>2008</year>). <article-title>Men report stronger attraction to femininity in women&#x00027;s faces when their testosterone levels are high</article-title>. <source>Horm. Behav.</source> <volume>54</volume>, <fpage>703</fpage>&#x02013;<lpage>708</lpage>.<pub-id pub-id-type="doi">10.1016/j.yhbeh.2008.07.012</pub-id><pub-id pub-id-type="pmid">18755192</pub-id></citation></ref>
<ref id="B51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Werheid</surname> <given-names>K.</given-names></name> <name><surname>Schacht</surname> <given-names>A.</given-names></name> <name><surname>Sommer</surname> <given-names>W.</given-names></name></person-group> (<year>2007</year>). <article-title>Facial attractiveness modulates early and late event-related brain potentials</article-title>. <source>Biol. Psychol</source>. <volume>76</volume>, <fpage>100</fpage>&#x02013;<lpage>108</lpage>.<pub-id pub-id-type="doi">10.1016/j.biopsycho.2007.06.008</pub-id><pub-id pub-id-type="pmid">17681418</pub-id></citation></ref>
</ref-list>
</back>
</article>
