<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Hum. Neurosci.</journal-id>
<journal-title>Frontiers in Human Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Hum. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5161</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnhum.2017.00486</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Exploring the Role of Spatial Frequency Information during Neural Emotion Processing in Human Infants</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Jessen</surname> <given-names>Sarah</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="author-notes" rid="fn001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/75022/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name><surname>Grossmann</surname> <given-names>Tobias</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
<xref ref-type="author-notes" rid="fn001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/16456/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Research Group &#x0201C;Early Social Development&#x0201D;, Max Planck Institute for Human Cognitive and Brain Sciences</institution>, <addr-line>Leipzig</addr-line>, <country>Germany</country></aff>
<aff id="aff2"><sup>2</sup><institution>Department of Neurology, University of L&#x000FC;beck</institution>, <addr-line>L&#x000FC;beck</addr-line>, <country>Germany</country></aff>
<aff id="aff3"><sup>3</sup><institution>Department of Psychology, University of Virginia</institution>, <addr-line>Charlottesville, VA</addr-line>, <country>United States</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Hidehiko Okamoto, National Institute for Physiological Sciences, Japan</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Mikko Peltola, University of Tampere, Finland; Alan J. Pegna, The University of Queensland, Australia</p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x0002A;Correspondence: Sarah Jessen <email>sarah.jessen&#x00040;neuro.uni-luebeck.de</email> Tobias Grossmann <email>tg3ny&#x00040;virginia.edu</email></p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>09</day>
<month>10</month>
<year>2017</year>
</pub-date>
<pub-date pub-type="collection">
<year>2017</year>
</pub-date>
<volume>11</volume>
<elocation-id>486</elocation-id>
<history>
<date date-type="received">
<day>11</day>
<month>05</month>
<year>2017</year>
</date>
<date date-type="accepted">
<day>20</day>
<month>09</month>
<year>2017</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2017 Jessen and Grossmann.</copyright-statement>
<copyright-year>2017</copyright-year>
<copyright-holder>Jessen and Grossmann</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract><p>Enhanced attention to fear expressions in adults is primarily driven by information from low as opposed to high spatial frequencies contained in faces. However, little is known about the role of spatial frequency information in emotion processing during infancy. In the present study, we examined the role of low compared to high spatial frequencies in the processing of happy and fearful facial expressions by using filtered face stimuli and measuring event-related brain potentials (ERPs) in 7-month-old infants (<italic>N</italic> = 26). Our results revealed that infants&#x02019; brains discriminated between emotional facial expressions containing high but not between expressions containing low spatial frequencies. Specifically, happy faces containing high spatial frequencies elicited a smaller Nc amplitude than fearful faces containing high spatial frequencies and happy and fearful faces containing low spatial frequencies. Our results demonstrate that already in infancy spatial frequency content influences the processing of facial emotions. Furthermore, we observed that fearful facial expressions elicited a comparable Nc response for high and low spatial frequencies, suggesting a robust detection of fearful faces irrespective of spatial frequency content, whereas the detection of happy facial expressions was contingent upon frequency content. In summary, these data provide new insights into the neural processing of facial emotions in early development by highlighting the differential role played by spatial frequencies in the detection of fear and happiness.</p></abstract>
<kwd-group>
<kwd>infancy</kwd>
<kwd>face processing</kwd>
<kwd>emotion perception</kwd>
<kwd>EEG</kwd>
<kwd>spatial frequencies</kwd>
</kwd-group>
<contract-sponsor id="cn001">Max-Planck-Gesellschaft<named-content content-type="fundref-id">10.13039/501100004189</named-content></contract-sponsor>
<counts>
<fig-count count="3"/>
<table-count count="0"/>
<equation-count count="8"/>
<ref-count count="52"/>
<page-count count="8"/>
<word-count count="7085"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="introduction" id="s1">
<title>Introduction</title>
<p>Fast and efficient processing of emotional information is crucial for human behavior as it enables adaptive responding during social interactions (Frith, <xref ref-type="bibr" rid="B15">2009</xref>). Over the past two decades, much research has focused on investigating the neural basis of emotion processing (Adolphs, <xref ref-type="bibr" rid="B1">2002</xref>; G&#x000FC;ntekin and Ba&#x0015F;ar, <xref ref-type="bibr" rid="B18">2014</xref>; de Gelder et al., <xref ref-type="bibr" rid="B6">2015</xref>; Kragel and LaBar, <xref ref-type="bibr" rid="B28">2016</xref>). One important insight from this area of research is that magno- and parvocellular pathways in the visual system contribute in different ways to emotion processing. In particular, there is work to show that fast and efficient emotion processing is predominantly instantiated by the magnocellular pathway, whereas more detailed processing of facial information primarily involves the parvocellular pathway (Vuilleumier et al., <xref ref-type="bibr" rid="B50">2003</xref>).</p>
<p>The general properties of these two pathways have been intensively studied in vision research. Independent of emotional content, the magnocellular pathway is primarily responsible for the fast, yet coarse, processing of visual input, while the parvocellular pathway is mainly involved in the slower processing of fine visual details (Livingstone and Hubel, <xref ref-type="bibr" rid="B30">1988</xref>). The two pathways can be studied by filtering the visual input with respect to its spatial frequency information (Hammarrenger et al., <xref ref-type="bibr" rid="B19">2003</xref>). Specifically, while the parvocellular pathway is most sensitive to high spatial frequency (HSF) information, the magnocellular pathway is most sensitive to low spatial frequencies (LSF; Vuilleumier et al., <xref ref-type="bibr" rid="B50">2003</xref>). LSF filtered images predominantly contain global information, while HSF filtered images provide more detailed information necessary for fine-grained processing of images (Goffaux and Rossion, <xref ref-type="bibr" rid="B16">2006</xref>).</p>
<p>In recent years, the differential processing of HSF and LSF has been used to study different aspects of visual emotion processing in human adults. In particular, while LSF information (&#x0003C;6 cycles/image) appears to play a crucial role in the detection and classification of fearful information, HSF information (&#x0003E;24 cycles/image) is more important for non-emotional face processing such as facial identity matching (Vuilleumier et al., <xref ref-type="bibr" rid="B50">2003</xref>). Accordingly, in adults, activity in the fusiform cortex is mainly driven by the HSF content of images, while activity in the amygdala is primarily driven by the LSF content of images (Vuilleumier et al., <xref ref-type="bibr" rid="B50">2003</xref>; M&#x000E9;ndez-B&#x000E9;rtolo et al., <xref ref-type="bibr" rid="B31">2016</xref>). It has been suggested that emotionally negative LSF input primarily activates the magnocellular pathway, which in turn elicits a fast and efficient processing of highly salient and arousing information in the amygdala (Vuilleumier et al., <xref ref-type="bibr" rid="B50">2003</xref>). In contrast, the observed activation of the fusiform cortex by viewing HSF images points to a slow and more detailed processing of facial features required for identity recognition (Vuilleumier et al., <xref ref-type="bibr" rid="B50">2003</xref>). Moreover, the predominant processing of emotionally salient information via a subcortical pathway receiving mainly magnocellular input has also been argued to underpin non-conscious visual emotion processing (for a review, see Tamietto and de Gelder, <xref ref-type="bibr" rid="B43">2010</xref>). For example, cortically blind patients show sensitive responding to fearful information, which is thought to rely on a subcortical pathway bypassing cortical visual processing (de Gelder et al., <xref ref-type="bibr" rid="B7">1999</xref>).</p>
<p>Further evidence for a specific role of the magnocellular pathway in the fast processing of emotionally salient information comes from event-related brain potential (ERP) studies. Viewing LSF filtered images of fearful faces, but not HSF filtered images, results in an enhancement of the visual P1 in adults (Pourtois et al., <xref ref-type="bibr" rid="B39">2005</xref>; Vlamings et al., <xref ref-type="bibr" rid="B47">2009</xref>). The P1 originates from the extrastriate visual cortex and is an ERP component that peaks between 100 ms and 130 ms post-stimulus in response to particularly salient visual information (Clark and Hillyard, <xref ref-type="bibr" rid="B5">1996</xref>); an enhanced P1 thus indicates an increased allocation of attention to LSF filtered images of fearful faces (Pourtois et al., <xref ref-type="bibr" rid="B39">2005</xref>). Although the P1 occurs before the N170 component, which is commonly linked to the structural processing of faces (Rossion, <xref ref-type="bibr" rid="B41">2014</xref>), a number of studies report a modulation of P1 amplitude by emotional, in particular fearful, facial expressions (Batty and Taylor, <xref ref-type="bibr" rid="B2">2003</xref>; Pourtois et al., <xref ref-type="bibr" rid="B40">2004</xref>; Smith et al., <xref ref-type="bibr" rid="B42">2013</xref>). The fact that emotional content can modulate brain responses before the structural processing of facial information takes place provides further support for the existence of a fast but coarse pathway that bypasses classical face processing to elicit a rapid response to negative, in particular fearful, facial expressions.</p>
<p>While the evidence from work with adults supports the notion of fast responding to fearful facial expressions mediated via LSF and the magnocellular pathway, little is known about the role of the magnocellular and parvocellular pathways in emotion processing in development. Recent findings suggest that children rely on HSF rather than LSF information when detecting fearful facial expressions (Vlamings et al., <xref ref-type="bibr" rid="B48">2010</xref>). Specifically, Vlamings et al. (<xref ref-type="bibr" rid="B48">2010</xref>) recorded EEG responses from children between 3 and 8 years of age in response to HSF and LSF fearful and neutral facial expressions. In contrast to previous findings with adults, children showed an enhanced P1 for fearful compared to neutral faces only when HSF images were presented. This is taken to suggest that children rely on different frequency information and might need more detailed feature-focused information than adults when processing fearful facial expressions. This developmental view has been confirmed by a recent ERP study with 9-to-10-month-old infants showing that infants at this age also predominantly use HSF to discriminate happy, fearful and neutral facial expressions (Munsters et al., <xref ref-type="bibr" rid="B33">2017</xref>). Specifically, this study revealed differential processing of emotional facial expressions in response to HSF images but not for LSF images. Munsters et al. (<xref ref-type="bibr" rid="B33">2017</xref>) observed emotion-related differences for face-sensitive ERP components (N290/P400 complex) seen as precursors of the adult N170, reflecting the structural encoding of faces (Eimer, <xref ref-type="bibr" rid="B13">2000</xref>; Rossion, <xref ref-type="bibr" rid="B41">2014</xref>). Together, these two developmental ERP studies point to the notion that in infants and children HSF information is needed for facial emotion processing to occur.</p>
<p>However, emotion discrimination from faces, in particular involving fearful faces, typically affects additional ERP components and can be reliably observed (Peltola et al., <xref ref-type="bibr" rid="B36">2009</xref>, <xref ref-type="bibr" rid="B35">2013</xref>) in infants younger than the 9- to 10-month-olds examined by Munsters et al. (<xref ref-type="bibr" rid="B33">2017</xref>). Moreover, spatial frequency filtering might need to be adjusted to take into account the visual acuity at the age under investigation (Dobkins and Harms, <xref ref-type="bibr" rid="B11">2014</xref>), a step that had not been taken in the prior infant study, which relied on spatial frequency cut-offs established for adults (Munsters et al., <xref ref-type="bibr" rid="B32">2016</xref>). It thus remains unclear whether, similar to what is known from adults, infants rely on LSF when frequency cut-offs are adjusted to their visual acuity. We therefore decided to extend this line of research by studying infants at a younger age and by using age-appropriate spatial frequency filters for stimulus generation. In the following, we provide a detailed rationale for our experimental approach.</p>
<p>By 7 months of age, infants develop an attentional bias towards fearful expressions, which manifests itself in prolonged looking duration to fearful faces when compared to happy faces and in enhanced ERP responses to fearful facial expressions (Vaish et al., <xref ref-type="bibr" rid="B44">2008</xref>; Peltola et al., <xref ref-type="bibr" rid="B36">2009</xref>). The Nc ERP component, a central negativity linked to attention allocation and localized to prefrontal and anterior cingulate cortex, is of particular interest in this context (Webb et al., <xref ref-type="bibr" rid="B51">2005</xref>). The Nc typically shows an enhanced amplitude in response to fearful compared to happy facial expressions (e.g., Peltola et al., <xref ref-type="bibr" rid="B36">2009</xref>; Grossmann et al., <xref ref-type="bibr" rid="B17">2011</xref>), but differential Nc responses can also be observed between different negative expressions, such as anger and fear (Kobiella et al., <xref ref-type="bibr" rid="B27">2008</xref>). A modulation of the Nc amplitude is not only observed following conscious processing of emotional information but is also seen in the absence of conscious perception of facial cues (Jessen and Grossmann, <xref ref-type="bibr" rid="B22">2014</xref>, <xref ref-type="bibr" rid="B23">2015</xref>). Furthermore, the N290/P400 ERP complex, which has been discussed as a precursor of the face-specific adult N170 (de Haan et al., <xref ref-type="bibr" rid="B8">2003</xref>), has also been shown to vary as a function of emotion in infants (Lepp&#x000E4;nen et al., <xref ref-type="bibr" rid="B29">2007</xref>; Kobiella et al., <xref ref-type="bibr" rid="B27">2008</xref>), which is similar to what has been observed in adults (Batty and Taylor, <xref ref-type="bibr" rid="B2">2003</xref>; Blau et al., <xref ref-type="bibr" rid="B3">2007</xref>; Pegna et al., <xref ref-type="bibr" rid="B34">2008</xref>).</p>
<p>Recently, using ERPs and eyetracking, it has been shown that infants detect fearful faces independent of conscious perception (Jessen and Grossmann, <xref ref-type="bibr" rid="B22">2014</xref>, <xref ref-type="bibr" rid="B23">2015</xref>; Jessen et al., <xref ref-type="bibr" rid="B26">2016</xref>), a function that has been linked to the subcortical (magnocellular) processing route in adults (Whalen et al., <xref ref-type="bibr" rid="B53">2004</xref>). These recent findings with infants thus suggest that infants&#x02019; emotion detection might rely on a subcortical route for face processing based on information received through the magnocellular system. If infants process fearful information predominantly via the magnocellular pathway, one would expect the same pattern as observed in adults, namely differential processing of LSF but not HSF images of fearful facial expressions (Vuilleumier et al., <xref ref-type="bibr" rid="B50">2003</xref>; Pourtois et al., <xref ref-type="bibr" rid="B39">2005</xref>; M&#x000E9;ndez-B&#x000E9;rtolo et al., <xref ref-type="bibr" rid="B31">2016</xref>). Importantly, differential processing of subliminally presented emotional expressions has only been observed for the Nc but not for the P400 (Jessen and Grossmann, <xref ref-type="bibr" rid="B23">2015</xref>). If unconscious (subliminal) emotion processing relies on the same pathway involving subcortical brain regions as the fast emotional responses elicited by images containing only LSF information (Tamietto and de Gelder, <xref ref-type="bibr" rid="B43">2010</xref>), then it might be expected that differential processing of LSF images of emotional faces will predominantly affect the Nc response but not the P400.</p>
<p>In the current study, we presented 7-month-old infants with images of faces expressing fear or happiness, which were manipulated to contain predominately high or low spatial frequencies. One important issue to consider when studying the role of the magnocellular compared to the parvocellular system in emotion processing in infants is the protracted development of visual acuity in humans. Visual spatial acuity matures slowly, and an adult-like acuity can only be observed from around 6&#x02013;7 years (Ellemberg et al., <xref ref-type="bibr" rid="B14">1999</xref>). The same holds true for the processing of high compared to low spatial frequencies, which continues to develop throughout childhood (van den Boomen et al., <xref ref-type="bibr" rid="B45">2015</xref>). Furthermore, it has been shown that at the structural (anatomical) level the magnocellular pathway matures faster than the parvocellular pathway (Hammarrenger et al., <xref ref-type="bibr" rid="B19">2003</xref>). Thus, when investigating the processing of spatial frequencies and its influence on higher-level visual processing in a developmental population, it is important to differentiate between relative and absolute high and low spatial frequencies. While studies in adults typically assume a range of 6&#x02013;24 cycles/image as the preferred range for face processing, and accordingly define HSF as &#x0003E;24 cycles/image and LSF as &#x0003C;6 cycles/image (Vuilleumier et al., <xref ref-type="bibr" rid="B50">2003</xref>; Pourtois et al., <xref ref-type="bibr" rid="B39">2005</xref>), this does not necessarily correspond to the visual acuity in infants and young children. While Munsters et al. 
(<xref ref-type="bibr" rid="B33">2017</xref>), who investigated the role of spatial frequencies in 9- to 10-month-olds, used 2 cycles/&#x000B0; as an upper boundary for their LSF images and 6 cycles/&#x000B0; as a lower boundary for their HSF images, corresponding to the frequency ranges previously used in adults (Munsters et al., <xref ref-type="bibr" rid="B32">2016</xref>), other studies on the role of spatial frequency content in face processing in developmental populations have adapted the frequency ranges to the assumed visual acuity at a given age. When the spatial frequencies contained in the stimulus material are adjusted to the visual acuity of infants, newborns have been found to process facial information primarily via spatial frequencies below 0.5 cycles/&#x000B0; (equivalent to 12 cycles/image; de Heering et al., <xref ref-type="bibr" rid="B9">2008</xref>). However, more recent work with 8-month-old infants found a face-inversion effect only for HSF (above 0.6 cycles/&#x000B0;; Dobkins and Harms, <xref ref-type="bibr" rid="B11">2014</xref>) but not for LSF. Based on previous work using thresholds adapted to infant visual acuity, we therefore centered our cut-offs on 0.5 cycles/&#x000B0;, the spatial frequency most closely approximating the peak of the contrast sensitivity curve at 8 months of age (Peterzell, <xref ref-type="bibr" rid="B38">1993</xref>; Dobkins and Harms, <xref ref-type="bibr" rid="B11">2014</xref>). Thus, in the current study, LSF images contained frequencies below 0.4 cycles/&#x000B0; while HSF images contained frequencies above 0.6 cycles/&#x000B0;.</p>
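The equivalence between cut-offs stated in cycles/degree and in cycles/image follows from simple viewing geometry. As an illustrative sketch (the Python function and its name are ours, not part of the original materials), the conversion can be written as:

```python
import math

def cycles_per_image(cycles_per_degree, image_width_cm, viewing_distance_cm):
    """Convert a spatial-frequency cut-off from cycles/degree to cycles/image
    for a stimulus of the given width viewed at the given distance."""
    # Visual angle subtended by the image, in degrees
    visual_angle = math.degrees(2 * math.atan(image_width_cm / (2 * viewing_distance_cm)))
    return cycles_per_degree * visual_angle

# With a 13-cm-wide face at 90 cm (about 8 degrees of visual angle), the
# 0.6 and 0.4 cycles/degree cut-offs correspond to roughly 4.8 and 3.2
# cycles per face width, matching the values given in the Stimuli section.
```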
<p>Based on previous studies that used unfiltered facial stimuli (Peltola et al., <xref ref-type="bibr" rid="B36">2009</xref>; Jessen and Grossmann, <xref ref-type="bibr" rid="B24">2016</xref>), we decided to study infants at the age of 7 months, because this is the age by which infants first show heightened allocation of attention to fearful faces in their looking time and ERPs. Critically, if infants use brain processes for fear detection similar to those of adults, involving the magnocellular system, then we would expect to see selective effects on processing fear from LSF faces but not necessarily from HSF faces. If, in contrast, infants rely predominantly on information from the parvocellular system, we expect a differential effect only for images containing high spatial frequencies. Addressing this question by examining the role of spatial frequency information in infants&#x02019; emotion processing fills an important gap in our understanding of the neurodevelopment of facial emotion processing systems.</p>
</sec>
<sec sec-type="materials and methods" id="s2">
<title>Materials and Methods</title>
<sec id="s2-1">
<title>Participants</title>
<p>Twenty-six 7-month-old infants (mean age: 219 days, range: 205&#x02013;230 days, 15 female) were included in the final sample. This sample size was determined <italic>a priori</italic> based on comparable ERP studies on emotion perception in infancy (Lepp&#x000E4;nen et al., <xref ref-type="bibr" rid="B29">2007</xref>; Kobiella et al., <xref ref-type="bibr" rid="B27">2008</xref>; Peltola et al., <xref ref-type="bibr" rid="B36">2009</xref>). An additional three infants were tested but not included in the analysis because they failed to contribute at least 10 artifact-free trials per condition (<italic>N</italic> = 2) or because their mean amplitude across all conditions in the ROI and time-window used to analyze the Nc response deviated from the group mean by more than 2 standard deviations (SD) (<italic>N</italic> = 1). Infants contributed on average 35 &#x000B1; 13 (mean &#x000B1; SD) trials per condition (happy-LSF: 34 &#x000B1; 14, happy-HSF: 35 &#x000B1; 13, fear-LSF: 35 &#x000B1; 13, fear-HSF: 36 &#x000B1; 12).</p>
<p>All infants were born full-term (38&#x02013;42 weeks gestational age) and had a birth-weight of at least 2500 g. This study was carried out in accordance with the recommendations of the ethics committee at the University of Leipzig, which approved the protocol. The parents of all subjects gave written informed consent in accordance with the Declaration of Helsinki.</p>
</sec>
<sec id="s2-2">
<title>Stimuli</title>
<p>The stimulus material consisted of photographs of six different actresses from the FACES database (age 18&#x02013;30, ID-numbers 54, 63, 85, 90, 115 and 173, see Ebner et al., <xref ref-type="bibr" rid="B12">2010</xref>) expressing fear and happiness (see Figure <xref ref-type="fig" rid="F1">1</xref> for an example). All images were edited according to an established procedure by transforming faces to gray-scale images and applying a spatial filter using a Matlab script adapted from Paul van Diepen<xref ref-type="fn" rid="fn0001"><sup>1</sup></xref> (see Dobkins and Harms, <xref ref-type="bibr" rid="B11">2014</xref>). For the stimulus images containing only HSF, a cut-off of 0.6 cycles/&#x000B0; (or 4.8 cycles/face width) was chosen, while for the stimulus images containing LSF, a cut-off of 0.4 cycles/&#x000B0; (or 3.2 cycles/face width) was used, based on the values used for a comparable age group by Dobkins and Harms (<xref ref-type="bibr" rid="B11">2014</xref>). The images did not differ in luminance (<italic>p</italic> &#x0003E; 0.4, as calculated based on the RGB values using Matlab) and had a standardized height of 18.5 cm and a width of 13 cm, leading to a horizontal visual angle of about 8&#x000B0; and a vertical visual angle of about 12&#x000B0; (at 90 cm viewing distance).</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p>Examples of stimulus material. Images of happy (top row) and fearful (bottom row) faces were filtered to contain only spatial frequencies below 0.4 cycles/&#x000B0; (low spatial frequencies (LSF), left column) or above 0.6 cycles/&#x000B0; (high spatial frequencies (HSF), right column).</p></caption>
<graphic xlink:href="fnhum-11-00486-g0001.tif"/>
</fig>
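The spatial filtering itself amounts to masking the 2-D Fourier spectrum of each image. The following Python sketch of an ideal low-/high-pass spatial-frequency filter is a stand-in for the Matlab script mentioned above, not the published code; the hard cut-off mask and all names are our assumptions:

```python
import numpy as np

def spatial_filter(image, cutoff_cpd, pixels_per_degree, mode="low"):
    """Keep only spatial frequencies below ("low") or above ("high") a
    cut-off given in cycles/degree, via an ideal mask in the 2-D
    Fourier domain (hypothetical stand-in for the Matlab script)."""
    cutoff_cpp = cutoff_cpd / pixels_per_degree            # cycles/pixel
    fy = np.fft.fftfreq(image.shape[0])                    # cycles/pixel
    fx = np.fft.fftfreq(image.shape[1])
    radius = np.sqrt(fy[:, None] ** 2 + fx[None, :] ** 2)  # radial frequency
    mask = radius <= cutoff_cpp if mode == "low" else radius >= cutoff_cpp
    return np.real(np.fft.ifft2(np.fft.fft2(image) * mask))
```

A smooth (e.g., Gaussian) transition band is often preferred over a hard cut-off to avoid ringing artifacts; the published script may differ in this respect.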
</sec>
<sec id="s2-3">
<title>Design</title>
<p>The experiment consisted of four conditions resulting in a 2 &#x000D7; 2 design with the factors Emotion (happy, fearful) and Frequency (HSF, LSF). Per condition, 84 trials were presented (14 per actress), leading to a total of 336 trials. The trials were presented in pseudo-randomized order, ensuring that the same condition was not presented more than twice in a row. Furthermore, trials were arranged into miniblocks consisting of 24 trials each (6 trials per condition, 1 trial per actress). The miniblocks were presented consecutively without interruption. Each participant received an individual randomization list. Every trial started with the presentation of a black fixation star on a gray background for 300 ms followed by the stimulus face presented for 750 ms. After each trial, a gray screen was shown for a randomly varying duration between 800 ms and 1200 ms.</p>
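The constraint that no condition appears more than twice in a row can be met with a constrained random draw. A minimal Python sketch (the greedy-with-restart strategy and all names are our assumptions; the miniblock and per-actress structure of the actual randomization is omitted):

```python
import random

def pseudo_randomize(conditions, n_per_condition, seed=None):
    """Return a trial order in which no condition occurs more than
    twice in a row, by drawing greedily and restarting if stuck."""
    rng = random.Random(seed)
    total = len(conditions) * n_per_condition
    while True:
        remaining = {c: n_per_condition for c in conditions}
        order = []
        while len(order) < total:
            allowed = [c for c in conditions if remaining[c] > 0
                       and not (len(order) >= 2 and order[-1] == order[-2] == c)]
            if not allowed:      # dead end: restart the draw
                break
            choice = rng.choice(allowed)
            order.append(choice)
            remaining[choice] -= 1
        if len(order) == total:
            return order
```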
</sec>
<sec id="s2-4">
<title>Procedure</title>
<p>After arrival in the lab, infants and parents were familiarized with the new environment, and parents were informed about the experiment and then signed a consent form. The EEG recording was prepared while the infant was sitting on his or her parent&#x02019;s lap. An elastic cap (EasyCap) in which 27 Ag/AgCl electrodes were mounted according to the 10-20 system was used for recording. Additionally, an electrode was attached below the infant&#x02019;s right eye for computing the electrooculogram (EOG). The EEG was recorded with a sampling rate of 500 Hz using a PORTI-32/MREFA amplifier (Twente Medical Systems). The Cz electrode was used as an online reference. The experiment took place in a soundproof, electrically shielded chamber, in which the infant was seated on his or her parent&#x02019;s lap. Stimuli were presented on a CRT monitor with a screen resolution of 1024 &#x000D7; 768 and a refresh rate of 60 Hz at a distance of approximately 90 cm from the infant. The parent was instructed not to interact with the infant during the experiment. Infants&#x02019; looking behavior during the experiment was monitored using a small camera mounted on top of the monitor. When the infant became inattentive, video clips with colorful moving abstract shapes accompanied by ring tones were played in order to redirect the infant&#x02019;s attention to the screen. The experiment continued until the maximum number of trials was presented or the infant became too fussy to continue the experiment.</p>
</sec>
<sec id="s2-5">
<title>EEG Analysis</title>
<p>Data were re-referenced to the mean across all electrodes (average reference), and bandpass-filtered between 0.2 Hz and 20 Hz. Trials were segmented into 1 s-epochs lasting from 200 ms before stimulus onset to 800 ms after stimulus onset. In five participants, one noisy electrode was interpolated using spherical spline interpolation (Perrin et al., <xref ref-type="bibr" rid="B37">1989</xref>). In order to detect trials contaminated by artifacts, the standard deviation was computed in a sliding window of 200 ms. If the standard deviation exceeded 80 &#x003BC;V at any electrode or in the EOG, the entire trial was discarded. Additionally, the trials were inspected visually to ensure no artifacts remained. Furthermore, the video recording of the infants during the experiments was analyzed and all trials in which the infant did not attend to the screen were excluded from further analysis. To analyze the Nc amplitude, data were averaged for each condition at frontal electrodes (F3, Fz, F4) in a time-window from 500 ms to 600 ms after stimulus onset. This time window was determined based on visual inspection of the resulting waveform in order to appropriately capture the peak of the Nc. To analyze the N290 and P400 amplitude, we averaged the data at O1, O2, P7 and P8 from 150 ms to 300 ms (N290) and 350 ms to 600 ms (P400) after stimulus onset. The mean amplitude in these time-windows was entered into a repeated-measures analysis of variance (ANOVA) with the factors Emotion (fearful, happy) and Frequency (HSF, LSF). Student&#x02019;s <italic>t-</italic>tests were computed to further analyze interaction effects. Effect sizes are reported as partial eta-squared (<inline-formula><mml:math id="M1"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula>) for ANOVAs and <italic>r</italic> for <italic>t-</italic>tests.</p>
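The sliding-window artifact criterion can be stated compactly in code. A Python sketch under stated assumptions (channels × samples layout, one-sample step size, and all names are ours; the actual pipeline additionally used visual inspection and video coding):

```python
import numpy as np

def is_artifact(epoch, srate=500, win_ms=200, threshold_uv=80.0):
    """Return True if the standard deviation within any 200-ms sliding
    window exceeds the threshold on any channel of this epoch
    (epoch: channels x samples, in microvolts)."""
    win = int(win_ms * srate / 1000)          # window length in samples
    n_samples = epoch.shape[1]
    for start in range(n_samples - win + 1):  # slide one sample at a time
        window = epoch[:, start:start + win]
        if window.std(axis=1).max() > threshold_uv:
            return True                       # discard the entire trial
    return False
```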
</sec>
</sec>
<sec sec-type="results" id="s3">
<title>Results</title>
<sec id="s3-1">
<title>Nc</title>
<p>Between 500 ms and 600 ms after stimulus onset, we observed an interaction between Emotion and Frequency at frontal electrodes (<italic>F</italic><sub>(1,25)</sub> = 4.69, <italic>p</italic> = 0.04, <inline-formula><mml:math id="M2"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.16, see Figure <xref ref-type="fig" rid="F2">2</xref>). Specifically, for images containing only high spatial frequencies, we observed a significantly larger Nc amplitude in response to fearful compared to happy faces (<italic>t</italic><sub>(25)</sub> = 2.53, <italic>p</italic> = 0.018, <italic>r</italic> = 0.45). In contrast, we did not find a significant difference between the responses to happy and fearful faces when the images contained only LSF (<italic>t</italic><sub>(25)</sub> = &#x02212;0.5, <italic>p</italic> = 0.62, <italic>r</italic> = 0.1). Moreover, we observed a significant difference between LSF and HSF images for happy faces (<italic>t</italic><sub>(25)</sub> = &#x02212;2.65, <italic>p</italic> = 0.014, <italic>r</italic> = 0.47) but not for fearful facial expressions (<italic>t</italic><sub>(25)</sub> = 0.05, <italic>p</italic> = 0.96, <italic>r</italic> = 0.01). Specifically, LSF images of happy faces elicited a larger Nc amplitude compared to HSF images of happy faces (LSF: &#x02212;3.71 &#x000B1; 1.32 &#x003BC;V (mean &#x000B1; standard error); HSF: 1.22 &#x000B1; 1.37 &#x003BC;V), whereas no difference was elicited by HSF when compared to LSF fearful faces (LSF: &#x02212;2.77 &#x000B1; 1.66 &#x003BC;V; HSF: &#x02212;2.84 &#x000B1; 1.58 &#x003BC;V). 
In addition, we observed a marginally significant effect of Frequency (<italic>F</italic><sub>(1,25)</sub> = 3.77, <italic>p</italic> = 0.063, <inline-formula><mml:math id="M3"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.13), but no main effect of Emotion (<italic>F</italic><sub>(1,25)</sub> = 1.40, <italic>p</italic> = 0.25, <inline-formula><mml:math id="M4"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.05).</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p>Event-related brain potential (ERP) response at frontal electrodes (F3, Fz, F4). <bold>(A)</bold> shows mean responses to images containing HSF while <bold>(B)</bold> displays responses to images containing LSF (blue/green = happy expression, red/orange = fearful expression; displayed are mean responses &#x000B1; within-subject standard errors). Topographic representations show the difference in activation following happy and fearful faces between 500 ms and 600 ms, corresponding to the time-window used in the statistical analysis and marked in gray.</p></caption>
<graphic xlink:href="fnhum-11-00486-g0002.tif"/>
</fig>
</sec>
<sec id="s3-2">
<title>N290</title>
<p>We did not observe any significant effect between 150 ms and 300 ms at occipital electrodes (Emotion: <italic>F</italic><sub>(1,25)</sub> = 1.645, <italic>p</italic> = 0.211, <inline-formula><mml:math id="M5"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.06; Frequency: <italic>F</italic><sub>(1,25)</sub> = 0.676, <italic>p</italic> = 0.419, <inline-formula><mml:math id="M6"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.03; Emotion*Frequency: <italic>F</italic><sub>(1,25)</sub> = 0.177, <italic>p</italic> = 0.677, <inline-formula><mml:math id="M7"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.01).</p>
</sec>
<sec id="s3-3">
<title>P400</title>
<p>Between 350 ms and 600 ms we found a significant main effect of Frequency (<italic>F</italic><sub>(1,25)</sub> = 4.95, <italic>p</italic> = 0.035, <inline-formula><mml:math id="M8"><mml:mrow><mml:msubsup><mml:mi>&#x003B7;</mml:mi><mml:mtext>p</mml:mtext><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:math></inline-formula> = 0.17, see Figure <xref ref-type="fig" rid="F3">3</xref>), reflecting a larger P400 amplitude for LSF compared to HSF faces irrespective of emotional content (LSF: 11.89 &#x000B1; 1.83 &#x003BC;V; HSF: 9.18 &#x000B1; 1.92 &#x003BC;V). We did not observe any other significant effects (all <italic>ps</italic> &#x0003E; 0.20).</p>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p>ERP response at occipital electrodes (O1, O2, P7, P8). Displayed are mean responses at the occipital electrodes included in the analysis of the P400 (time-window marked in gray; blue/green = happy expression, red/orange = fearful expression; mean responses &#x000B1; within-subject standard errors). The topographic representation shows the difference in brain responses to low compared to high spatial frequencies, irrespective of emotional expression, between 350 ms and 600 ms, corresponding to the time window used in the statistical analysis.</p></caption>
<graphic xlink:href="fnhum-11-00486-g0003.tif"/>
</fig>
</sec>
</sec>
<sec sec-type="discussion" id="s4">
<title>Discussion</title>
<p>The current study examined the differential contribution of high and low spatial frequencies to facial emotion processing in 7-month-old infants. Our results show that infants&#x02019; brains discriminate between fearful and happy faces only when facial images contain HSF information but not when they contain LSF information. This difference is reflected in the modulation of the Nc, a neural correlate of attention allocation in infants (de Haan et al., <xref ref-type="bibr" rid="B8">2003</xref>; Webb et al., <xref ref-type="bibr" rid="B51">2005</xref>), suggesting that this differential attention allocation relies on the detailed information contained in HSF images of these facial expressions. This finding is in line with existing developmental ERP research on this topic from older infants and children (Vlamings et al., <xref ref-type="bibr" rid="B48">2010</xref>; Munsters et al., <xref ref-type="bibr" rid="B33">2017</xref>), which also shows that HSF information is critical for facial emotion discrimination.</p>
<p>Our analysis further showed that differential processing of HSF emotional faces is driven by the impact of spatial frequency content on the processing of happy faces. This is because the Nc only differed between HSF and LSF happy faces but not between HSF and LSF fearful faces. In other words, the current data suggest that fearful faces are robustly detected regardless of the frequency information contained in the facial stimulus. In contrast, our data show that a smaller Nc amplitude in response to happy faces compared to fearful faces, as commonly reported for unfiltered faces (e.g., Peltola et al., <xref ref-type="bibr" rid="B36">2009</xref>; Grossmann et al., <xref ref-type="bibr" rid="B17">2011</xref>), is only seen when HSF information is presented and disappears when only LSF information is presented. This may point to the importance of detailed information predominantly conveyed via HSF information, presumably from the mouth region (see e.g., Wegrzyn et al., <xref ref-type="bibr" rid="B52">2017</xref>), in eliciting the response typically observed to happy faces at 7 months of age.</p>
<p>Our findings principally agree with the results of Munsters et al. (<xref ref-type="bibr" rid="B33">2017</xref>) from slightly older infants (9&#x02013;10 months of age), who also reported specific ERP differences between processing HSF happy and fearful faces. However, while Munsters et al. (<xref ref-type="bibr" rid="B33">2017</xref>) observed an emotion effect at the P400 and N290, they did not report an interaction between spatial frequency and emotional content at the Nc as obtained in the current study. One possible reason for these differences across infant ERP studies might be differences in the cut-offs used to define low and high spatial frequency content (&#x0003C;2 cycles/&#x000B0; for LSF images and &#x0003E;6 cycles/&#x000B0; for HSF images by Munsters et al. (<xref ref-type="bibr" rid="B33">2017</xref>) as opposed to &#x0003C;0.4 cycles/&#x000B0; for LSF images and &#x0003E;0.6 cycles/&#x000B0; for HSF images in the present study). In this context, it is important to note that our frequency cut-offs were selected on the basis of infant visual acuity at this age, whereas the cut-offs chosen by Munsters et al. (<xref ref-type="bibr" rid="B33">2017</xref>) were the same as used with adults. Therefore, a direct comparison between studies is problematic, since both our HSF and our LSF range would be considered LSF according to Munsters et al. (<xref ref-type="bibr" rid="B33">2017</xref>). Furthermore, Munsters et al. (<xref ref-type="bibr" rid="B33">2017</xref>) investigated an older age group (9&#x02013;10 months as opposed to 7 months in the present study) and used a more diverse set of facial stimuli (three emotional facial expressions, faces from different ethnicities, and male as well as female faces, as opposed to two emotional expressions from female Caucasian faces only). In particular, the use of other-race faces in 40% of stimulus presentations in Munsters et al.&#x02019;s 
(<xref ref-type="bibr" rid="B33">2017</xref>) study might have influenced infants&#x02019; emotion processing, since at this age infants have been shown to have difficulty discriminating emotions from other-race faces (Vogel et al., <xref ref-type="bibr" rid="B49">2012</xref>). Dealing with unfamiliar or less familiar other-race faces might have required infants to rely more on an analysis of facial details based on HSF information, which may be reflected at the P400 rather than the Nc. Clearly, future work is needed that directly assesses the exact parameters that impact facial emotion processing when manipulating spatial frequency content.</p>
<p>The current ERP data show that an enhanced Nc response to fearful compared to happy faces only occurs for HSF-filtered facial stimuli. This pattern obtained for HSF is in line with what has been commonly reported in response to fearful and happy faces using naturalistic photographic images containing the entire frequency range (e.g., Peltola et al., <xref ref-type="bibr" rid="B36">2009</xref>; Grossmann et al., <xref ref-type="bibr" rid="B17">2011</xref>). This suggests that, at 7 months of age, infants rely on HSF information when discriminating between fear and happiness. More specifically, our analyses indicate that HSF happy faces elicit the typical attenuated Nc response seen in previous studies using unfiltered photographs. This might be explained by a need for detailed feature-based information from the mouth region, characteristic of happy faces, for the discrimination to occur (see e.g., Wegrzyn et al., <xref ref-type="bibr" rid="B52">2017</xref>). Alternatively, happy faces might be more difficult to recognize in the LSF condition, leading to increased attention (i.e., a larger Nc), as these stimuli may be perceived as slightly ambiguous. In the HSF condition, happy faces might be recognized more easily, leading to the typical Nc response. Importantly, our results for the Nc further show that the processing of fearful faces is immune to the spatial frequency manipulation, suggesting a robust processing of this emotion from the face independent of the specific information contained in the facial stimulus. This further strengthens the notion that fearful faces are effectively detected by infants of this age (Peltola et al., <xref ref-type="bibr" rid="B36">2009</xref>; Jessen and Grossmann, <xref ref-type="bibr" rid="B24">2016</xref>). 
One potential factor contributing to the robust detection of fearful faces may be infants&#x02019; sensitivity to enlarged eye whites, which is known to play a key role in fear perception (e.g., Whalen et al., <xref ref-type="bibr" rid="B53">2004</xref>; Jessen and Grossmann, <xref ref-type="bibr" rid="B22">2014</xref>) and might not be affected by spatial frequency content.</p>
<p>How these findings relate to previous research using functional magnetic resonance imaging (fMRI) to track subcortical activity in adults is unclear, since the EEG signal is primarily generated by cortical sources (Jackson and Bolger, <xref ref-type="bibr" rid="B21">2014</xref>). Therefore, one way to directly examine the contribution of subcortical regions to emotional face processing in infancy is to use fMRI, which has very recently been successfully used to map high-level visual cortical regions implicated in face processing in infants of a similar age (Deen et al., <xref ref-type="bibr" rid="B10">2017</xref>). Another promising approach for future infant studies addressing the issue of subcortical involvement is to measure pupil dilation, which is primarily subcortically mediated (Bradley et al., <xref ref-type="bibr" rid="B4">2008</xref>) and has been successfully applied to study emotion processing in infants (Hepach and Westermann, <xref ref-type="bibr" rid="B20">2013</xref>; Jessen et al., <xref ref-type="bibr" rid="B26">2016</xref>).</p>
<p>The current data further revealed an enhanced P400 at posterior electrodes in response to LSF compared to HSF faces irrespective of emotional content. The P400 is commonly linked to the processing of structural facial information and is thought to represent the infant precursor to the highly face-sensitive N170 seen in adults (de Haan et al., <xref ref-type="bibr" rid="B8">2003</xref>). Our findings are therefore in agreement with prior empirical work showing that face encoding in the infant brain is mainly driven by LSF (see de Heering et al., <xref ref-type="bibr" rid="B9">2008</xref>) and occurs irrespective of emotional content. Importantly, this pattern further indicates that the spatial filtering applied to our face stimuli was effective in splitting the power spectrum into ranges that are processed differentially by infants, because a face-sensitive ERP response, the P400, systematically differed as a function of spatial frequency. Furthermore, it is critical to mention that the P400, although smaller in amplitude in response to HSF faces, was also elicited by HSF images, demonstrating that filtering did not abolish or disrupt face-sensitive processing in infants.</p>
<p>In our ERP analysis we did not observe differential processing of emotional faces at the P400 for either frequency range, which is in contrast to previous ERP research on this topic with infants (Munsters et al., <xref ref-type="bibr" rid="B33">2017</xref>). In this context, it is important to again mention the several methodological differences outlined above that might have contributed to this difference between the current study and previous infant work. First and foremost, in the current study the spatial frequency filters were adjusted to the visual acuity of infants at this age, resulting in largely different HSF and LSF filter ranges. Moreover, emotion effects at the P400 have been observed less robustly than emotion effects at other ERP components, especially the Nc. Specifically, some studies report a larger P400 amplitude for fearful compared to happy faces (e.g., Lepp&#x000E4;nen et al., <xref ref-type="bibr" rid="B29">2007</xref>), whereas other studies did not find a differentiation for this component (e.g., Vanderwert et al., <xref ref-type="bibr" rid="B46">2015</xref>). In summary, the observed P400 effect demonstrates that the HSF and LSF faces used in the current study elicited systematic differences in face-sensitive processes in infants.</p>
</sec>
<sec sec-type="conclusion" id="s5">
<title>Conclusion</title>
<p>In conclusion, the current study critically adds to our understanding of the neurodevelopment of facial emotion processing in early ontogeny. Our ERP results show that 7-month-old infants distinguish between happy and fearful facial expressions when containing HSF information as reflected in the Nc, suggesting that detailed information matters for this distinction to emerge. This discriminatory ERP effect is driven by an attenuation of the Nc in response to HSF happy faces, whereas the Nc to fearful faces was unaffected by the frequency manipulation. Our results thus provide new insights into the role that spatial frequency information plays when processing facial emotions in infancy, highlighting the robustness of fearful face detection from early in development.</p>
</sec>
<sec id="s7">
<title>Author Contributions</title>
<p>SJ and TG designed the study; SJ collected the data; SJ and TG analyzed the data and wrote the manuscript.</p>
</sec>
<sec id="s8">
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<ack>
<p>We would like to thank Caterina B&#x000F6;ttcher and Katharina Kerber for help with the data collection and all the families for participating.</p>
</ack>
<fn-group>
<fn fn-type="financial-disclosure">
<p><bold>Funding.</bold> This study was funded by a grant of the Max-Planck-Society to TG.</p>
</fn>
</fn-group>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Adolphs</surname> <given-names>R.</given-names></name></person-group> (<year>2002</year>). <article-title>Recognizing emotion from facial expressions: psychological and neurological mechanisms</article-title>. <source>Behav. Cogn. Neurosci. Rev.</source> <volume>1</volume>, <fpage>21</fpage>&#x02013;<lpage>62</lpage>. <pub-id pub-id-type="doi">10.1177/1534582302001001003</pub-id><pub-id pub-id-type="pmid">17715585</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Batty</surname> <given-names>M.</given-names></name> <name><surname>Taylor</surname> <given-names>M. J.</given-names></name></person-group> (<year>2003</year>). <article-title>Early processing of the six basic facial emotional expressions</article-title>. <source>Cogn. Brain Res.</source> <volume>17</volume>, <fpage>613</fpage>&#x02013;<lpage>620</lpage>. <pub-id pub-id-type="doi">10.1016/s0926-6410(03)00174-5</pub-id><pub-id pub-id-type="pmid">14561449</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Blau</surname> <given-names>V. C.</given-names></name> <name><surname>Maurer</surname> <given-names>U.</given-names></name> <name><surname>Tottenham</surname> <given-names>N.</given-names></name> <name><surname>McCandliss</surname> <given-names>B. D.</given-names></name></person-group> (<year>2007</year>). <article-title>The face-specific N170 component is modulated by emotional facial expression</article-title>. <source>Behav. Brain Funct.</source> <volume>3</volume>:<fpage>7</fpage>. <pub-id pub-id-type="doi">10.1186/1744-9081-3-7</pub-id><pub-id pub-id-type="pmid">17244356</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bradley</surname> <given-names>M. M.</given-names></name> <name><surname>Miccoli</surname> <given-names>L.</given-names></name> <name><surname>Escrig</surname> <given-names>M. A.</given-names></name> <name><surname>Lang</surname> <given-names>P. J.</given-names></name></person-group> (<year>2008</year>). <article-title>The pupil as a measure of emotional arousal and autonomic activation</article-title>. <source>Psychophysiology</source> <volume>45</volume>, <fpage>602</fpage>&#x02013;<lpage>607</lpage>. <pub-id pub-id-type="doi">10.1111/j.1469-8986.2008.00654.x</pub-id><pub-id pub-id-type="pmid">18282202</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Clark</surname> <given-names>V. P.</given-names></name> <name><surname>Hillyard</surname> <given-names>S. A.</given-names></name></person-group> (<year>1996</year>). <article-title>Spatial selective attention affects early extrastriate but not striate components of the visual evoked potential</article-title>. <source>J. Cogn. Neurosci.</source> <volume>8</volume>, <fpage>387</fpage>&#x02013;<lpage>402</lpage>. <pub-id pub-id-type="doi">10.1162/jocn.1996.8.5.387</pub-id><pub-id pub-id-type="pmid">23961943</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Gelder</surname> <given-names>B.</given-names></name> <name><surname>de Borst</surname> <given-names>A. W.</given-names></name> <name><surname>Watson</surname> <given-names>R.</given-names></name></person-group> (<year>2015</year>). <article-title>The perception of emotion in body expressions</article-title>. <source>Wiley Interdiscip. Rev. Cogn. Sci.</source> <volume>6</volume>, <fpage>149</fpage>&#x02013;<lpage>158</lpage>. <pub-id pub-id-type="doi">10.1002/wcs.1335</pub-id><pub-id pub-id-type="pmid">26263069</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Gelder</surname> <given-names>B.</given-names></name> <name><surname>Vroomen</surname> <given-names>J.</given-names></name> <name><surname>Pourtois</surname> <given-names>G.</given-names></name> <name><surname>Weiskrantz</surname> <given-names>L.</given-names></name></person-group> (<year>1999</year>). <article-title>Non-conscious recognition of affect in the absence of striate cortex</article-title>. <source>Neuroreport</source> <volume>10</volume>, <fpage>3759</fpage>&#x02013;<lpage>3763</lpage>. <pub-id pub-id-type="doi">10.1097/00001756-199912160-00007</pub-id><pub-id pub-id-type="pmid">10716205</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Haan</surname> <given-names>M.</given-names></name> <name><surname>Johnson</surname> <given-names>M. H.</given-names></name> <name><surname>Halit</surname> <given-names>H.</given-names></name></person-group> (<year>2003</year>). <article-title>Development of face-sensitive event-related potentials during infancy: a review</article-title>. <source>Int. J. Psychophysiol.</source> <volume>51</volume>, <fpage>45</fpage>&#x02013;<lpage>58</lpage>. <pub-id pub-id-type="doi">10.1016/s0167-8760(03)00152-1</pub-id><pub-id pub-id-type="pmid">14629922</pub-id></citation></ref>
<ref id="B9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Heering</surname> <given-names>A.</given-names></name> <name><surname>Turati</surname> <given-names>C.</given-names></name> <name><surname>Rossion</surname> <given-names>B.</given-names></name> <name><surname>Bulf</surname> <given-names>H.</given-names></name> <name><surname>Goffaux</surname> <given-names>V.</given-names></name> <name><surname>Simion</surname> <given-names>F.</given-names></name></person-group> (<year>2008</year>). <article-title>Newborns&#x02019; face recognition is based on spatial frequencies below 0.5 cycles per degree</article-title>. <source>Cognition</source> <volume>106</volume>, <fpage>444</fpage>&#x02013;<lpage>454</lpage>. <pub-id pub-id-type="doi">10.1016/j.cognition.2006.12.012</pub-id><pub-id pub-id-type="pmid">17239361</pub-id></citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Deen</surname> <given-names>B.</given-names></name> <name><surname>Richardson</surname> <given-names>H.</given-names></name> <name><surname>Dilks</surname> <given-names>D. D.</given-names></name> <name><surname>Takahashi</surname> <given-names>A.</given-names></name> <name><surname>Keil</surname> <given-names>B.</given-names></name> <name><surname>Wald</surname> <given-names>L. L.</given-names></name> <etal/></person-group>. (<year>2017</year>). <article-title>Organization of high-level visual cortex in human infants</article-title>. <source>Nat. Commun.</source> <volume>8</volume>:<fpage>13995</fpage>. <pub-id pub-id-type="doi">10.1038/ncomms13995</pub-id><pub-id pub-id-type="pmid">28072399</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dobkins</surname> <given-names>K. R.</given-names></name> <name><surname>Harms</surname> <given-names>R.</given-names></name></person-group> (<year>2014</year>). <article-title>The face inversion effect in infants is driven by high, and not low, spatial frequencies</article-title>. <source>J. Vis.</source> <volume>14</volume>:<fpage>1</fpage>. <pub-id pub-id-type="doi">10.1167/14.1.1</pub-id><pub-id pub-id-type="pmid">24385345</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ebner</surname> <given-names>N. C.</given-names></name> <name><surname>Riediger</surname> <given-names>M.</given-names></name> <name><surname>Lindenberger</surname> <given-names>U.</given-names></name></person-group> (<year>2010</year>). <article-title>FACES&#x02014;a database of facial expressions in young, middle-aged, and older women and men: development and validation</article-title>. <source>Behav. Res. Methods</source> <volume>42</volume>, <fpage>351</fpage>&#x02013;<lpage>362</lpage>. <pub-id pub-id-type="doi">10.3758/brm.42.1.351</pub-id><pub-id pub-id-type="pmid">20160315</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Eimer</surname> <given-names>M.</given-names></name></person-group> (<year>2000</year>). <article-title>The face-specific N170 component reflects late stages in the structural encoding of faces</article-title>. <source>Neuroreport</source> <volume>11</volume>, <fpage>2319</fpage>&#x02013;<lpage>2324</lpage>. <pub-id pub-id-type="doi">10.1097/00001756-200007140-00050</pub-id><pub-id pub-id-type="pmid">10923693</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ellemberg</surname> <given-names>D.</given-names></name> <name><surname>Lewis</surname> <given-names>T. L.</given-names></name> <name><surname>Liu</surname> <given-names>C. H.</given-names></name> <name><surname>Maurer</surname> <given-names>D.</given-names></name></person-group> (<year>1999</year>). <article-title>Development of spatial and temporal vision during childhood</article-title>. <source>Vision Res.</source> <volume>39</volume>, <fpage>2325</fpage>&#x02013;<lpage>2333</lpage>. <pub-id pub-id-type="doi">10.1016/s0042-6989(98)00280-6</pub-id><pub-id pub-id-type="pmid">10367054</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Frith</surname> <given-names>C.</given-names></name></person-group> (<year>2009</year>). <article-title>Role of facial expressions in social interactions</article-title>. <source>Philos. Trans. R. Soc. Lond B Biol. Sci.</source> <volume>364</volume>, <fpage>3453</fpage>&#x02013;<lpage>3458</lpage>. <pub-id pub-id-type="doi">10.1098/rstb.2009.0142</pub-id><pub-id pub-id-type="pmid">19884140</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Goffaux</surname> <given-names>V.</given-names></name> <name><surname>Rossion</surname> <given-names>B.</given-names></name></person-group> (<year>2006</year>). <article-title>Faces are &#x0201C;spatial&#x0201D;&#x02014;holistic face perception is supported by low spatial frequencies</article-title>. <source>J. Exp. Psychol. Hum. Percept. Perform.</source> <volume>32</volume>, <fpage>1023</fpage>&#x02013;<lpage>1039</lpage>. <pub-id pub-id-type="doi">10.1037/0096-1523.32.4.1023</pub-id><pub-id pub-id-type="pmid">16846295</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Grossmann</surname> <given-names>T.</given-names></name> <name><surname>Johnson</surname> <given-names>M. H.</given-names></name> <name><surname>Vaish</surname> <given-names>A.</given-names></name> <name><surname>Hughes</surname> <given-names>D. A.</given-names></name> <name><surname>Quinque</surname> <given-names>D.</given-names></name> <name><surname>Stoneking</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2011</year>). <article-title>Genetic and neural dissociation of individual responses to emotional expressions in human infants</article-title>. <source>Dev. Cogn. Neurosci.</source> <volume>1</volume>, <fpage>57</fpage>&#x02013;<lpage>66</lpage>. <pub-id pub-id-type="doi">10.1016/j.dcn.2010.07.001</pub-id><pub-id pub-id-type="pmid">22436418</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>G&#x000FC;ntekin</surname> <given-names>B.</given-names></name> <name><surname>Ba&#x0015F;ar</surname> <given-names>E.</given-names></name></person-group> (<year>2014</year>). <article-title>A review of brain oscillations in perception of faces and emotional pictures</article-title>. <source>Neuropsychologia</source> <volume>58</volume>, <fpage>33</fpage>&#x02013;<lpage>51</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2014.03.014</pub-id><pub-id pub-id-type="pmid">24709570</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hammarrenger</surname> <given-names>B.</given-names></name> <name><surname>Lepor&#x000E9;</surname> <given-names>F.</given-names></name> <name><surname>Lipp&#x000E9;</surname> <given-names>S.</given-names></name> <name><surname>Labrosse</surname> <given-names>M.</given-names></name> <name><surname>Guillemot</surname> <given-names>J. P.</given-names></name> <name><surname>Roy</surname> <given-names>M. S.</given-names></name></person-group> (<year>2003</year>). <article-title>Magnocellular and parvocellular developmental course in infants during the first year of life</article-title>. <source>Doc. Ophthalmol.</source> <volume>107</volume>, <fpage>225</fpage>&#x02013;<lpage>233</lpage>. <pub-id pub-id-type="doi">10.1023/b:doop.0000005331.66114.05</pub-id><pub-id pub-id-type="pmid">14711154</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hepach</surname> <given-names>R.</given-names></name> <name><surname>Westermann</surname> <given-names>G.</given-names></name></person-group> (<year>2013</year>). <article-title>Infants&#x02019; sensitivity to the congruence of others&#x02019; emotions and actions</article-title>. <source>J. Exp. Child Psychol.</source> <volume>115</volume>, <fpage>16</fpage>&#x02013;<lpage>29</lpage>. <pub-id pub-id-type="doi">10.1016/j.jecp.2012.12.013</pub-id><pub-id pub-id-type="pmid">23454359</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jackson</surname> <given-names>A. F.</given-names></name> <name><surname>Bolger</surname> <given-names>D. J.</given-names></name></person-group> (<year>2014</year>). <article-title>The neurophysiological bases of EEG and EEG measurement: a review for the rest of us</article-title>. <source>Psychophysiology</source> <volume>51</volume>, <fpage>1061</fpage>&#x02013;<lpage>1071</lpage>. <pub-id pub-id-type="doi">10.1111/psyp.12283</pub-id><pub-id pub-id-type="pmid">25039563</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jessen</surname> <given-names>S.</given-names></name> <name><surname>Altvater-Mackensen</surname> <given-names>N.</given-names></name> <name><surname>Grossmann</surname> <given-names>T.</given-names></name></person-group> (<year>2016</year>). <article-title>Pupillary responses reveal infants&#x02032; discrimination of facial emotions independent of conscious perception</article-title>. <source>Cognition</source> <volume>150</volume>, <fpage>163</fpage>&#x02013;<lpage>169</lpage>. <pub-id pub-id-type="doi">10.1016/j.cognition.2016.02.010</pub-id><pub-id pub-id-type="pmid">26896901</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jessen</surname> <given-names>S.</given-names></name> <name><surname>Grossmann</surname> <given-names>T.</given-names></name></person-group> (<year>2014</year>). <article-title>Unconscious discrimination of social cues from eye whites in infants</article-title>. <source>Proc. Natl. Acad. Sci. U S A</source> <volume>111</volume>, <fpage>16208</fpage>&#x02013;<lpage>16213</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.1411333111</pub-id><pub-id pub-id-type="pmid">25349392</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jessen</surname> <given-names>S.</given-names></name> <name><surname>Grossmann</surname> <given-names>T.</given-names></name></person-group> (<year>2015</year>). <article-title>Neural signatures of conscious and unconscious emotional face processing in human infants</article-title>. <source>Cortex</source> <volume>64</volume>, <fpage>260</fpage>&#x02013;<lpage>270</lpage>. <pub-id pub-id-type="doi">10.1016/j.cortex.2014.11.007</pub-id><pub-id pub-id-type="pmid">25528130</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jessen</surname> <given-names>S.</given-names></name> <name><surname>Grossmann</surname> <given-names>T.</given-names></name></person-group> (<year>2016</year>). <article-title>The developmental emergence of unconscious fear processing from eyes during infancy</article-title>. <source>J. Exp. Child Psychol.</source> <volume>142</volume>, <fpage>334</fpage>&#x02013;<lpage>343</lpage>. <pub-id pub-id-type="doi">10.1016/j.jecp.2015.09.009</pub-id><pub-id pub-id-type="pmid">26493612</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kobiella</surname> <given-names>A.</given-names></name> <name><surname>Grossmann</surname> <given-names>T.</given-names></name> <name><surname>Reid</surname> <given-names>V. M.</given-names></name> <name><surname>Striano</surname> <given-names>T.</given-names></name></person-group> (<year>2008</year>). <article-title>The discrimination of angry and fearful facial expressions in 7-month-old infants: an event-related potential study</article-title>. <source>Cogn. Emot.</source> <volume>22</volume>, <fpage>134</fpage>&#x02013;<lpage>146</lpage>. <pub-id pub-id-type="doi">10.1080/02699930701394256</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kragel</surname> <given-names>P. A.</given-names></name> <name><surname>LaBar</surname> <given-names>K. S.</given-names></name></person-group> (<year>2016</year>). <article-title>Decoding the nature of emotion in the brain</article-title>. <source>Trends Cogn. Sci.</source> <volume>20</volume>, <fpage>444</fpage>&#x02013;<lpage>455</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2016.03.011</pub-id><pub-id pub-id-type="pmid">27133227</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lepp&#x000E4;nen</surname> <given-names>J. M.</given-names></name> <name><surname>Moulson</surname> <given-names>M. C.</given-names></name> <name><surname>Vogel-Farley</surname> <given-names>V. K.</given-names></name> <name><surname>Nelson</surname> <given-names>C. A.</given-names></name></person-group> (<year>2007</year>). <article-title>An ERP study of emotional face processing in the adult and infant brain</article-title>. <source>Child Dev.</source> <volume>78</volume>, <fpage>232</fpage>&#x02013;<lpage>245</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-8624.2007.00994.x</pub-id><pub-id pub-id-type="pmid">17328702</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Livingstone</surname> <given-names>M.</given-names></name> <name><surname>Hubel</surname> <given-names>D.</given-names></name></person-group> (<year>1988</year>). <article-title>Segregation of form, color, movement, and depth: anatomy, physiology, and perception</article-title>. <source>Science</source> <volume>240</volume>, <fpage>740</fpage>&#x02013;<lpage>749</lpage>. <pub-id pub-id-type="doi">10.1126/science.3283936</pub-id><pub-id pub-id-type="pmid">3283936</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>M&#x000E9;ndez-B&#x000E9;rtolo</surname> <given-names>C.</given-names></name> <name><surname>Moratti</surname> <given-names>S.</given-names></name> <name><surname>Toledano</surname> <given-names>R.</given-names></name> <name><surname>L&#x000F3;pez-Sosa</surname> <given-names>F.</given-names></name> <name><surname>Mart&#x000ED;nez-Alvarez</surname> <given-names>R.</given-names></name> <name><surname>Mah</surname> <given-names>Y. H.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>A fast pathway for fear in human amygdala</article-title>. <source>Nat. Neurosci.</source> <volume>19</volume>, <fpage>1041</fpage>&#x02013;<lpage>1049</lpage>. <pub-id pub-id-type="doi">10.1038/nn.4324</pub-id><pub-id pub-id-type="pmid">27294508</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Munsters</surname> <given-names>N. M.</given-names></name> <name><surname>van den Boomen</surname> <given-names>C.</given-names></name> <name><surname>Hooge</surname> <given-names>I. T. C.</given-names></name> <name><surname>Kemner</surname> <given-names>C.</given-names></name></person-group> (<year>2016</year>). <article-title>The role of global and local visual information during gaze-cued orienting of attention</article-title>. <source>PLoS One</source> <volume>11</volume>:<fpage>e0160405</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0160405</pub-id><pub-id pub-id-type="pmid">27560368</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Munsters</surname> <given-names>N. M.</given-names></name> <name><surname>van Ravenswaaij</surname> <given-names>H.</given-names></name> <name><surname>van den Boomen</surname> <given-names>C.</given-names></name> <name><surname>Kemner</surname> <given-names>C.</given-names></name></person-group> (<year>2017</year>). <article-title>Test-retest reliability of infant event related potentials evoked by faces</article-title>. <source>Neuropsychologia</source> [Epub ahead of print]. <pub-id pub-id-type="doi">10.1016/j.neuropsychologia.2017.03.030</pub-id><pub-id pub-id-type="pmid">28389367</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pegna</surname> <given-names>A.</given-names></name> <name><surname>Landis</surname> <given-names>T.</given-names></name> <name><surname>Khateb</surname> <given-names>A.</given-names></name></person-group> (<year>2008</year>). <article-title>Electrophysiological evidence for early non-conscious processing of fearful facial expressions</article-title>. <source>Int. J. Psychophysiol.</source> <volume>70</volume>, <fpage>127</fpage>&#x02013;<lpage>136</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2008.08.007</pub-id><pub-id pub-id-type="pmid">18804496</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peltola</surname> <given-names>M. J.</given-names></name> <name><surname>Hietanen</surname> <given-names>J. K.</given-names></name> <name><surname>Forssman</surname> <given-names>L.</given-names></name> <name><surname>Lepp&#x000E4;nen</surname> <given-names>J. M.</given-names></name></person-group> (<year>2013</year>). <article-title>The emergence and stability of the attentional bias to fearful faces in infancy</article-title>. <source>Infancy</source> <volume>18</volume>, <fpage>905</fpage>&#x02013;<lpage>926</lpage>. <pub-id pub-id-type="doi">10.1111/infa.12013</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peltola</surname> <given-names>M. J.</given-names></name> <name><surname>Lepp&#x000E4;nen</surname> <given-names>J. M.</given-names></name> <name><surname>M&#x000E4;ki</surname> <given-names>S.</given-names></name> <name><surname>Hietanen</surname> <given-names>J. K.</given-names></name></person-group> (<year>2009</year>). <article-title>Emergence of enhanced attention to fearful faces between 5 and 7 months of age</article-title>. <source>Soc. Cogn. Affect. Neurosci.</source> <volume>4</volume>, <fpage>134</fpage>&#x02013;<lpage>142</lpage>. <pub-id pub-id-type="doi">10.1093/scan/nsn046</pub-id><pub-id pub-id-type="pmid">19174536</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perrin</surname> <given-names>F.</given-names></name> <name><surname>Pernier</surname> <given-names>J.</given-names></name> <name><surname>Bertrand</surname> <given-names>O.</given-names></name> <name><surname>Echallier</surname> <given-names>J. F.</given-names></name></person-group> (<year>1989</year>). <article-title>Spherical splines for scalp potential and current density mapping</article-title>. <source>Electroencephalogr. Clin. Neurophysiol.</source> <volume>72</volume>, <fpage>184</fpage>&#x02013;<lpage>187</lpage>. <pub-id pub-id-type="doi">10.1016/0013-4694(89)90180-6</pub-id><pub-id pub-id-type="pmid">2464490</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peterzell</surname> <given-names>D. H.</given-names></name></person-group> (<year>1993</year>). <article-title>Individual differences in the visual attention of human infants: further evidence for separate sensitization and habituation processes</article-title>. <source>Dev. Psychobiol.</source> <volume>26</volume>, <fpage>207</fpage>&#x02013;<lpage>218</lpage>. <pub-id pub-id-type="doi">10.1002/dev.420260404</pub-id><pub-id pub-id-type="pmid">8354426</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pourtois</surname> <given-names>G.</given-names></name> <name><surname>Dan</surname> <given-names>E. S.</given-names></name> <name><surname>Grandjean</surname> <given-names>D.</given-names></name> <name><surname>Sander</surname> <given-names>D.</given-names></name> <name><surname>Vuilleumier</surname> <given-names>P.</given-names></name></person-group> (<year>2005</year>). <article-title>Enhanced extrastriate visual response to bandpass spatial frequency filtered fearful faces: time course and topographic evoked-potentials mapping</article-title>. <source>Hum. Brain Mapp.</source> <volume>26</volume>, <fpage>65</fpage>&#x02013;<lpage>79</lpage>. <pub-id pub-id-type="doi">10.1002/hbm.20130</pub-id><pub-id pub-id-type="pmid">15954123</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pourtois</surname> <given-names>G.</given-names></name> <name><surname>Grandjean</surname> <given-names>D.</given-names></name> <name><surname>Sander</surname> <given-names>D.</given-names></name> <name><surname>Vuilleumier</surname> <given-names>P.</given-names></name></person-group> (<year>2004</year>). <article-title>Electrophysiological correlates of rapid spatial orienting towards fearful faces</article-title>. <source>Cereb. Cortex</source> <volume>14</volume>, <fpage>619</fpage>&#x02013;<lpage>633</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhh023</pub-id><pub-id pub-id-type="pmid">15054077</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rossion</surname> <given-names>B.</given-names></name></person-group> (<year>2014</year>). <article-title>Understanding face perception by means of human electrophysiology</article-title>. <source>Trends Cogn. Sci.</source> <volume>18</volume>, <fpage>310</fpage>&#x02013;<lpage>318</lpage>. <pub-id pub-id-type="doi">10.1016/j.tics.2014.02.013</pub-id><pub-id pub-id-type="pmid">24703600</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Smith</surname> <given-names>E.</given-names></name> <name><surname>Weinberg</surname> <given-names>A.</given-names></name> <name><surname>Moran</surname> <given-names>T.</given-names></name> <name><surname>Hajcak</surname> <given-names>G.</given-names></name></person-group> (<year>2013</year>). <article-title>Electrocortical responses to NIMSTIM facial expressions of emotion</article-title>. <source>Int. J. Psychophysiol.</source> <volume>88</volume>, <fpage>17</fpage>&#x02013;<lpage>25</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2012.12.004</pub-id><pub-id pub-id-type="pmid">23280304</pub-id></citation></ref>
<ref id="B43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tamietto</surname> <given-names>M.</given-names></name> <name><surname>de Gelder</surname> <given-names>B.</given-names></name></person-group> (<year>2010</year>). <article-title>Neural bases of the non-conscious perception of emotional signals</article-title>. <source>Nat. Rev. Neurosci.</source> <volume>11</volume>, <fpage>697</fpage>&#x02013;<lpage>709</lpage>. <pub-id pub-id-type="doi">10.1038/nrn2889</pub-id><pub-id pub-id-type="pmid">20811475</pub-id></citation></ref>
<ref id="B44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vaish</surname> <given-names>A.</given-names></name> <name><surname>Grossmann</surname> <given-names>T.</given-names></name> <name><surname>Woodward</surname> <given-names>A.</given-names></name></person-group> (<year>2008</year>). <article-title>Not all emotions are created equal: the negativity bias in social-emotional development</article-title>. <source>Psychol. Bull.</source> <volume>134</volume>, <fpage>383</fpage>&#x02013;<lpage>403</lpage>. <pub-id pub-id-type="doi">10.1037/0033-2909.134.3.383</pub-id><pub-id pub-id-type="pmid">18444702</pub-id></citation></ref>
<ref id="B45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>van den Boomen</surname> <given-names>C.</given-names></name> <name><surname>Jonkman</surname> <given-names>L. M.</given-names></name> <name><surname>Jaspers-Vlamings</surname> <given-names>P. H.</given-names></name> <name><surname>Cousijn</surname> <given-names>J.</given-names></name> <name><surname>Kemner</surname> <given-names>C.</given-names></name></person-group> (<year>2015</year>). <article-title>Developmental changes in ERP responses to spatial frequencies</article-title>. <source>PLoS One</source> <volume>10</volume>:<fpage>e0122507</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0122507</pub-id><pub-id pub-id-type="pmid">25799038</pub-id></citation></ref>
<ref id="B46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vanderwert</surname> <given-names>R. E.</given-names></name> <name><surname>Westerlund</surname> <given-names>A.</given-names></name> <name><surname>Montoya</surname> <given-names>L.</given-names></name> <name><surname>McCormick</surname> <given-names>S. A.</given-names></name> <name><surname>Miguel</surname> <given-names>H. O.</given-names></name> <name><surname>Nelson</surname> <given-names>C. A.</given-names></name></person-group> (<year>2015</year>). <article-title>Looking to the eyes influences the processing of emotion on face-sensitive event-related potentials in 7-month-old infants</article-title>. <source>Dev. Neurobiol.</source> <volume>75</volume>, <fpage>1154</fpage>&#x02013;<lpage>1163</lpage>. <pub-id pub-id-type="doi">10.1002/dneu.22204</pub-id><pub-id pub-id-type="pmid">24962465</pub-id></citation></ref>
<ref id="B47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vlamings</surname> <given-names>P. H.</given-names></name> <name><surname>Goffaux</surname> <given-names>V.</given-names></name> <name><surname>Kemner</surname> <given-names>C.</given-names></name></person-group> (<year>2009</year>). <article-title>Is the early modulation of brain activity by fearful facial expressions primarily mediated by coarse low spatial frequency information?</article-title> <source>J. Vis.</source> <volume>9</volume>, <fpage>12.1</fpage>&#x02013;<lpage>12.13</lpage>. <pub-id pub-id-type="doi">10.1167/9.5.12</pub-id><pub-id pub-id-type="pmid">19757890</pub-id></citation></ref>
<ref id="B48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vlamings</surname> <given-names>P. H.</given-names></name> <name><surname>Jonkman</surname> <given-names>L. M.</given-names></name> <name><surname>Kemner</surname> <given-names>C.</given-names></name></person-group> (<year>2010</year>). <article-title>An eye for detail: an event-related potential study of the rapid processing of fearful facial expressions in children</article-title>. <source>Child Dev.</source> <volume>81</volume>, <fpage>1304</fpage>&#x02013;<lpage>1319</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-8624.2010.01470.x</pub-id><pub-id pub-id-type="pmid">20636697</pub-id></citation></ref>
<ref id="B49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vogel</surname> <given-names>M.</given-names></name> <name><surname>Monesson</surname> <given-names>A.</given-names></name> <name><surname>Scott</surname> <given-names>L. S.</given-names></name></person-group> (<year>2012</year>). <article-title>Building biases in infancy: the influence of race on face and voice emotion matching</article-title>. <source>Dev. Sci.</source> <volume>15</volume>, <fpage>359</fpage>&#x02013;<lpage>372</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-7687.2012.01138.x</pub-id><pub-id pub-id-type="pmid">22490176</pub-id></citation></ref>
<ref id="B50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Vuilleumier</surname> <given-names>P.</given-names></name> <name><surname>Armony</surname> <given-names>J. L.</given-names></name> <name><surname>Driver</surname> <given-names>J.</given-names></name> <name><surname>Dolan</surname> <given-names>R. J.</given-names></name></person-group> (<year>2003</year>). <article-title>Distinct spatial frequency sensitivities for processing faces and emotional expressions</article-title>. <source>Nat. Neurosci.</source> <volume>6</volume>, <fpage>624</fpage>&#x02013;<lpage>631</lpage>. <pub-id pub-id-type="doi">10.1038/nn1057</pub-id><pub-id pub-id-type="pmid">12740580</pub-id></citation></ref>
<ref id="B51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Webb</surname> <given-names>S. J.</given-names></name> <name><surname>Long</surname> <given-names>J. D.</given-names></name> <name><surname>Nelson</surname> <given-names>C. A.</given-names></name></person-group> (<year>2005</year>). <article-title>A longitudinal investigation of visual event-related potentials in the first year of life</article-title>. <source>Dev. Sci.</source> <volume>8</volume>, <fpage>605</fpage>&#x02013;<lpage>616</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-7687.2005.00452.x</pub-id><pub-id pub-id-type="pmid">16246251</pub-id></citation></ref>
<ref id="B52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wegrzyn</surname> <given-names>M.</given-names></name> <name><surname>Vogt</surname> <given-names>M.</given-names></name> <name><surname>Kireclioglu</surname> <given-names>B.</given-names></name> <name><surname>Schneider</surname> <given-names>J.</given-names></name> <name><surname>Kissler</surname> <given-names>J.</given-names></name></person-group> (<year>2017</year>). <article-title>Mapping the emotional face. How individual face parts contribute to successful emotion recognition</article-title>. <source>PLoS One</source> <volume>12</volume>:<fpage>e0177239</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0177239</pub-id><pub-id pub-id-type="pmid">28493921</pub-id></citation></ref>
<ref id="B53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Whalen</surname> <given-names>P. J.</given-names></name> <name><surname>Kagan</surname> <given-names>J.</given-names></name> <name><surname>Cook</surname> <given-names>R. G.</given-names></name> <name><surname>Davis</surname> <given-names>F. C.</given-names></name> <name><surname>Kim</surname> <given-names>H.</given-names></name> <name><surname>Polis</surname> <given-names>S.</given-names></name> <etal/></person-group>. (<year>2004</year>). <article-title>Human amygdala responsivity to masked fearful eye whites</article-title>. <source>Science</source> <volume>306</volume>:<fpage>2061</fpage>. <pub-id pub-id-type="doi">10.1126/science.1103617</pub-id><pub-id pub-id-type="pmid">15604401</pub-id></citation></ref>
</ref-list>
<fn-group>
<fn id="fn0001"><p><sup>1</sup><ext-link ext-link-type="uri" xlink:href="http://www.perceptionweb.com/perception/misc/p271141/pvdmatl.txt">http://www.perceptionweb.com/perception/misc/p271141/pvdmatl.txt</ext-link></p></fn>
</fn-group>
</back>
</article>