<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychiatry</journal-id>
<journal-title>Frontiers in Psychiatry</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychiatry</abbrev-journal-title>
<issn pub-type="epub">1664-0640</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyt.2022.897595</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychiatry</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Inhibitory Control of Emotional Interference in Deaf Children: Evidence From Event-Related Potentials and Event-Related Spectral Perturbation Analysis</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name><surname>Chen</surname> <given-names>Qiong</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1721529/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name><surname>Zhao</surname> <given-names>Junfeng</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1494962/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name><surname>Gu</surname> <given-names>Huang</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="corresp" rid="c002"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1159211/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Li</surname> <given-names>Xiaoming</given-names></name>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1173017/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Shaanxi Provincial Key Research Center for Children Mental and Behavioral Health, School of Psychology, Shaanxi Normal University</institution>, <addr-line>Xi&#x00027;an</addr-line>, <country>China</country></aff>
<aff id="aff2"><sup>2</sup><institution>Institute of Behavior and Psychology, School of Psychology, Henan University</institution>, <addr-line>Kaifeng</addr-line>, <country>China</country></aff>
<aff id="aff3"><sup>3</sup><institution>Department of Health Promotion, Education, and Behavior, University of South Carolina</institution>, <addr-line>Columbia, SC</addr-line>, <country>United States</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Giorgio Di Lorenzo, University of Rome Tor Vergata, Italy</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Wenhai Zhang, Hengyang Normal University, China; Simone Battaglia, University of Bologna, Italy</p></fn>
<corresp id="c001">&#x0002A;Correspondence: Junfeng Zhao <email>jfzhao63&#x00040;hotmail.com</email></corresp>
<corresp id="c002">Huang Gu <email>huanggu1017&#x00040;hotmail.com</email></corresp>
<fn fn-type="other" id="fn001"><p>This article was submitted to Public Mental Health, a section of the journal Frontiers in Psychiatry</p></fn></author-notes>
<pub-date pub-type="epub">
<day>24</day>
<month>06</month>
<year>2022</year>
</pub-date>
<pub-date pub-type="collection">
<year>2022</year>
</pub-date>
<volume>13</volume>
<elocation-id>897595</elocation-id>
<history>
<date date-type="received">
<day>16</day>
<month>03</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>26</day>
<month>05</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2022 Chen, Zhao, Gu and Li.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Chen, Zhao, Gu and Li</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license> </permissions>
<abstract>
<sec>
<title>Background</title>
<p>Impairment of interference control ability may reflect a more general deficit in executive functioning and lead to an increase in internalizing and externalizing problems such as impulsivity, which have been reported in deaf children. However, few studies have examined the neural mechanisms of this impairment.</p></sec>
<sec>
<title>Methods</title>
<p>This study applied the electroencephalogram (EEG) technique to investigate interference control in 31 deaf children and 28 hearing controls using an emotional face-word Stroop task.</p></sec>
<sec>
<title>Results</title>
<p>Results from the behavioral task showed that deaf children exhibited lower accuracy than hearing controls. In the EEG analysis, deaf children showed reduced activation of the N1 component and enhanced activation of the N450 component. Moreover, the incongruent condition elicited a larger N450 than the congruent condition. As for brain oscillations, the alpha band (600&#x02013;800 ms) showed reduced desynchronization in deaf children, while the theta band (200&#x02013;400 ms) showed enhanced synchronization in deaf children and in the incongruent condition, in line with the ERP components.</p></sec>
<sec>
<title>Conclusion</title>
<p>The present findings seem to indicate that the deficit in emotional interference control among deaf children might be due to impaired attention allocation and emotional cognitive monitoring during the emotional conflict detection process. Accordingly, the reduced N1 and enhanced N450 might reflect an early attention impairment that requires deaf children to invest more effort later in emotional cognitive monitoring.</p></sec></abstract>
<kwd-group>
<kwd>deaf children</kwd>
<kwd>interference control</kwd>
<kwd>emotional stroop</kwd>
<kwd>event-related potentials</kwd>
<kwd>time-frequency analysis</kwd>
</kwd-group>
<contract-num rid="cn001">212102310985</contract-num>
<contract-num rid="cn002">2020-ZDJH-026</contract-num>
<contract-sponsor id="cn001">Science and Technology Department of Henan Province<named-content content-type="fundref-id">10.13039/501100011447</named-content></contract-sponsor>
<contract-sponsor id="cn002">Education Department of Henan Province<named-content content-type="fundref-id">10.13039/501100009101</named-content></contract-sponsor>
<counts>
<fig-count count="4"/>
<table-count count="4"/>
<equation-count count="0"/>
<ref-count count="78"/>
<page-count count="11"/>
<word-count count="7612"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="s1">
<title>Introduction</title>
<p>The World Health Organization (WHO) estimates that there are approximately 360 million people with hearing impairment in the world, and almost one-tenth of the affected population are children (<xref ref-type="bibr" rid="B1">1</xref>). Previous research has shown that hearing loss affects not only the normal development of language skills but also other neurocognitive functions among deaf children, such as interference control (<xref ref-type="bibr" rid="B2">2</xref>, <xref ref-type="bibr" rid="B3">3</xref>). However, most previous studies relied on questionnaires or behavioral experiments (<xref ref-type="bibr" rid="B4">4</xref>&#x02013;<xref ref-type="bibr" rid="B6">6</xref>) and focused on the performance of working memory, attention, inhibitory control, and other executive functions in deaf children. Few have applied the electroencephalogram (EEG) technique to investigate the neural mechanisms of this interference control impairment among deaf children. Yet EEG signals, with their millisecond temporal resolution, are excellent at tracking rapid changes in brain function, and the techniques used to acquire them are relatively simple and non-invasive, providing accurate and detailed information with which to assess inhibitory control. Therefore, the present study used EEG to examine the inhibitory control of emotional interference in deaf children (<xref ref-type="bibr" rid="B7">7</xref>, <xref ref-type="bibr" rid="B8">8</xref>).</p>
<p>In addition to the commonly studied interference control ability, various emotion skills are also believed to be impaired in deaf children. For instance, facial emotion processing, one of the most studied aspects of social cognitive function, has been reported to be impaired in deaf children (<xref ref-type="bibr" rid="B4">4</xref>, <xref ref-type="bibr" rid="B9">9</xref>, <xref ref-type="bibr" rid="B10">10</xref>). Moreover, numerous studies have shown that deaf children are more challenged in emotion identification, emotion understanding, and emotion expression than hearing controls (<xref ref-type="bibr" rid="B11">11</xref>&#x02013;<xref ref-type="bibr" rid="B16">16</xref>). Such challenges can be attributed to delayed language acquisition, a lack of opportunities to talk with others about personal experience, and long-term stressful environments that leave these children prone to emotional states such as anxiety and depression.</p>
<p>Moreover, Gray (<xref ref-type="bibr" rid="B17">17</xref>, <xref ref-type="bibr" rid="B18">18</xref>) argued that cognition and emotion are strongly integrated and inseparable in the process of information processing (<xref ref-type="bibr" rid="B19">19</xref>). A meta-analysis of inhibitory control demonstrated that several brain areas are associated with its underlying mechanisms, forming a network involving the left and right inferior frontal gyrus (IFG), the dorsolateral pre-frontal cortex (dlPFC), and the anterior cingulate cortex (ACC) (<xref ref-type="bibr" rid="B20">20</xref>). Specifically, the ACC and the dlPFC have been shown to be components of a neural network that plays a critical role in completing tasks requiring self-monitoring and inhibition (<xref ref-type="bibr" rid="B21">21</xref>, <xref ref-type="bibr" rid="B22">22</xref>). In addition, several meta-analyses of emotion regulation reported activations in the bilateral dlPFC, the ventrolateral pre-frontal cortex (vlPFC), and the dorsal anterior cingulate cortex (dACC) (<xref ref-type="bibr" rid="B23">23</xref>&#x02013;<xref ref-type="bibr" rid="B25">25</xref>), which largely overlap with the classic frontoparietal cognitive control network (<xref ref-type="bibr" rid="B26">26</xref>).</p>
<p>The evidence reviewed above (<xref ref-type="bibr" rid="B2">2</xref>&#x02013;<xref ref-type="bibr" rid="B4">4</xref>, <xref ref-type="bibr" rid="B9">9</xref>, <xref ref-type="bibr" rid="B10">10</xref>, <xref ref-type="bibr" rid="B20">20</xref>, <xref ref-type="bibr" rid="B23">23</xref>&#x02013;<xref ref-type="bibr" rid="B26">26</xref>) indicates that both emotional and inhibitory control deficits exist in deaf children and that the brain regions activated by cognitive and emotional networks largely overlap. Therefore, the present study combined the two aspects and employed the face-word emotional Stroop task to investigate the emotional inhibitory control of deaf children (<xref ref-type="bibr" rid="B27">27</xref>&#x02013;<xref ref-type="bibr" rid="B29">29</xref>). In this task, the words &#x0201C;happy&#x0201D; and &#x0201C;fear,&#x0201D; printed in red, are superimposed on happy and fearful facial expressions; conflict effects occur when the emotional word and the facial expression are incongruent. The task has been widely used to examine the inhibitory control of emotional interference (<xref ref-type="bibr" rid="B28">28</xref>, <xref ref-type="bibr" rid="B29">29</xref>). Previous studies have identified two ERP components related to emotional interference control processing: N1 and N450 (<xref ref-type="bibr" rid="B28">28</xref>, <xref ref-type="bibr" rid="B30">30</xref>&#x02013;<xref ref-type="bibr" rid="B39">39</xref>). The N1 component reflects brain activation in the early perceptual stages (<xref ref-type="bibr" rid="B38">38</xref>). Larger amplitudes of this sensory component to emotional words are hypothesized to indicate increased attention-related cerebral processing during relatively early perceptual stages of information processing (<xref ref-type="bibr" rid="B31">31</xref>). The N450 is a widely used index of conflict detection in emotional conflict control tasks, showing a larger negative amplitude in the incongruent than in the congruent condition (<xref ref-type="bibr" rid="B34">34</xref>&#x02013;<xref ref-type="bibr" rid="B37">37</xref>).</p>
<p>Furthermore, time-frequency analysis (TFA) can provide complementary information on neural processing dynamics that is distinct from the traditional phase-locked ERP method. Therefore, following previous studies, the theta and alpha bands were analyzed to explore the characteristics of emotional inhibitory control in deaf children (<xref ref-type="bibr" rid="B40">40</xref>&#x02013;<xref ref-type="bibr" rid="B46">46</xref>). The frontal-central distribution of the evoked theta (4&#x02013;7 Hz) response is thought to be related to central executive and working memory processes, reflecting initiation of the central executive processes that detect interference and inhibit responses to task-irrelevant features (<xref ref-type="bibr" rid="B40">40</xref>, <xref ref-type="bibr" rid="B42">42</xref>, <xref ref-type="bibr" rid="B43">43</xref>, <xref ref-type="bibr" rid="B47">47</xref>). Alpha desynchronization (8&#x02013;14 Hz) reflects attentional processes, the processing of sensory&#x02013;semantic information, and task difficulty: the more demanding a task, the stronger the event-related alpha desynchronization (<xref ref-type="bibr" rid="B40">40</xref>, <xref ref-type="bibr" rid="B44">44</xref>&#x02013;<xref ref-type="bibr" rid="B46">46</xref>).</p>
<p>Taken together, inhibitory control of emotional interference is vital not only for good behavior and cognitive function but also for adequate emotional control and social interaction (<xref ref-type="bibr" rid="B48">48</xref>, <xref ref-type="bibr" rid="B49">49</xref>). In the current study, we applied the EEG technique to investigate both emotional and cognitive abilities using the face-word emotional Stroop task, and to further explore the potential neural markers of the emotional interference control deficit among deaf children. On the basis of previous research (<xref ref-type="bibr" rid="B10">10</xref>, <xref ref-type="bibr" rid="B48">48</xref>, <xref ref-type="bibr" rid="B50">50</xref>&#x02013;<xref ref-type="bibr" rid="B52">52</xref>), our hypothesis was that, compared to hearing controls, deaf children would show worse performance on both behavioral and EEG measures during the emotional Stroop task.</p>
</sec>
<sec sec-type="materials and methods" id="s2">
<title>Materials and Methods</title>
<sec>
<title>Participants</title>
<p>We performed a sample size calculation using G<sup>&#x0002A;</sup>Power 3.1, with an alpha level of 0.05 and 95% power to detect a large effect size (f = 0.4). The results showed that a sample size of 12 would be needed to ensure adequate statistical power. A total of 31 deaf children aged 9&#x02013;13 were recruited from the Kaifeng Special Education School in central China, and 28 matched hearing controls were recruited from the same geographic area. Parent and teacher reports established that all children were born to hearing parents and had no apparent mental health disorders such as ADHD or autism spectrum disorder. The main demographic characteristics of both deaf children and hearing controls are shown in <xref ref-type="table" rid="T1">Table 1</xref>. Participants completed a computer version of the emotional conflict task while EEG was recorded. Each participant received an age-appropriate gift at the completion of the experiment as a token of appreciation. The study protocol was approved by the Institutional Review Board of Henan University, and all participants provided written informed consent prior to data collection.</p>
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p>Descriptive characteristics of deaf children and hearing controls.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th/>
<th valign="top" align="center"><bold>Deaf children</bold></th>
<th valign="top" align="center"><bold>Hearing controls</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">No. of children</td>
<td valign="top" align="center">31</td>
<td valign="top" align="center">28</td>
</tr>
<tr>
<td valign="top" align="left">Mean age (SD) (years)</td>
<td valign="top" align="center">11.613 (0.230)</td>
<td valign="top" align="center">11.321 (0.242)</td>
</tr>
<tr>
<td valign="top" align="left">Range of age (years)</td>
<td valign="top" align="center">9&#x02013;13</td>
<td valign="top" align="center">9&#x02013;13</td>
</tr>
<tr>
<td valign="top" align="left">Ratio of female/male (%)</td>
<td valign="top" align="center">48.39/51.61</td>
<td valign="top" align="center">53.57/46.43</td>
</tr>
<tr>
<td valign="top" align="left">With hearing aids/Without hearing aids (%)</td>
<td valign="top" align="center">48.39/51.61</td>
<td valign="top" align="center">N/A</td>
</tr>
<tr>
<td valign="top" align="left">Communication mode</td>
<td valign="top" align="center">Sign language</td>
<td valign="top" align="center">Oral language</td>
</tr>
</tbody>
</table>
</table-wrap></sec>
<sec>
<title>Stimuli and Procedure</title>
<p>Twenty face pictures were selected from the Chinese affective picture system (<xref ref-type="bibr" rid="B53">53</xref>), including 10 happy faces (5 female, 5 male) and 10 fearful faces (5 female, 5 male). One of two Chinese characters, &#x0201C;&#x06109;&#x05FEB;&#x0201D; (&#x0201C;happy&#x0201D;) or &#x0201C;&#x06050;&#x060E7;&#x0201D; (&#x0201C;fear&#x0201D;), was superimposed on each face in red. The word and the facial expression were either congruent (e.g., the character meaning fear superimposed on a fearful face; see <xref ref-type="fig" rid="F1">Figure 1</xref>) or incongruent (e.g., the character meaning happy superimposed on a fearful face; see <xref ref-type="fig" rid="F1">Figure 1</xref>). The stimuli were programmed in E-Prime 2.0 software and presented on a Dell 19-in monitor.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p>Procedures for emotional stroop task.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpsyt-13-897595-g0001.tif"/>
</fig>
<p>Participants performed a modified face-word Stroop task (judging facial expression) while seated in a quiet, dimly lit room. They had to identify the facial expression of the target face while ignoring the meaning of the word, responding by pressing a button corresponding to &#x0201C;fear&#x0201D; faces (right index finger) or &#x0201C;happy&#x0201D; faces (right middle finger) as quickly and accurately as possible. The order of the experimental task was counterbalanced across participants.</p>
<p>The face-word Stroop task consisted of 240 trials presented over 4 blocks (60 trials per block). Each block consisted of equal numbers of congruent and incongruent trials, presented in random order within the block. Participants completed a 24-trial practice block prior to the experiment. The timing and order of events were the same in every trial: a fixation dot was presented for 500 ms, followed by a blank screen of variable duration (300&#x02013;500 ms). Then, the target face appeared for 1000 ms at the center of the screen. Participants had to respond within 1500 ms. The inter-trial interval (ITI) varied randomly between 800 ms and 1200 ms, with a mean of 1000 ms (<xref ref-type="fig" rid="F1">Figure 1</xref>).</p>
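<p>The block structure described above (equal numbers of congruent and incongruent trials shuffled within each block, fixed event durations, and a jittered ITI) can be sketched as follows. This is an illustrative Python sketch, not the authors&#x00027; E-Prime script; the function name and trial fields are hypothetical.</p>

```python
import random

def build_block(n_trials=60, iti_range=(800, 1200), seed=None):
    """Build one block: half congruent, half incongruent, shuffled,
    each trial carrying its event durations and a jittered ITI in ms."""
    rng = random.Random(seed)
    conditions = (["congruent"] * (n_trials // 2)
                  + ["incongruent"] * (n_trials // 2))
    rng.shuffle(conditions)
    trials = []
    for cond in conditions:
        trials.append({
            "condition": cond,
            "fixation_ms": 500,                  # fixation dot
            "blank_ms": rng.randint(300, 500),   # variable blank screen
            "target_ms": 1000,                   # face-word stimulus
            "response_window_ms": 1500,          # max time to respond
            "iti_ms": rng.randint(*iti_range),   # jittered ITI, mean ~1000
        })
    return trials

# 4 blocks x 60 trials = 240 trials in total
blocks = [build_block(seed=b) for b in range(4)]
```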
</sec>
<sec>
<title>EEG Recording</title>
<p>The electroencephalogram (EEG) was recorded from a standard 32-channel scalp cap (10/20 system; Brain Products, Munich, Germany) (<xref ref-type="fig" rid="F2">Figure 2</xref>). The electrooculogram (EOG) was recorded from an electrode placed above the right eye. All recordings were referenced online to FCz, and all electrode impedances were maintained below 5 k&#x003A9;. The EEG and EOG signals were amplified with a 0.01&#x02013;100 Hz band-pass filter and continuously sampled at 500 Hz/channel for offline analysis.</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p>Standard electrode map, illustrating the commonly deployed 10-20 system. F refers to frontal, T to temporal, C to central, P to parietal, and O to occipital sites; z denotes an electrode placed on the midline.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpsyt-13-897595-g0002.tif"/>
</fig>
<p>After data acquisition, EEG data were imported into the EEGLAB and Letswave toolboxes, open-source Matlab toolboxes for neurophysiological data analysis (<xref ref-type="bibr" rid="B54">54</xref>, <xref ref-type="bibr" rid="B55">55</xref>). The EEG was re-referenced to the average of the two mastoids and band-pass filtered at 0.1&#x02013;30 Hz. Epochs were extracted from 200 ms pre-stimulus to 1000 ms post-stimulus, and baseline correction was performed on the 200 ms pre-stimulus interval. Eye movement artifacts were removed with independent component analysis (ICA). Finally, the data were inspected and cleaned manually for any obvious remaining artifacts.</p>
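<p>The epoching and baseline-correction steps can be illustrated with a minimal stand-alone sketch. The authors used the EEGLAB and Letswave Matlab toolboxes; the Python sketch below (all names hypothetical) only shows the arithmetic: at a 500 Hz sampling rate one sample spans 2 ms, so a &#x02212;200 to 1000 ms epoch covers 600 samples.</p>

```python
FS = 500                      # sampling rate (Hz); 1 sample = 2 ms
PRE_MS, POST_MS = 200, 1000   # epoch window around stimulus onset

def extract_epoch(channel, onset_idx, fs=FS, pre_ms=PRE_MS, post_ms=POST_MS):
    """Cut one epoch around a stimulus onset and subtract the mean of
    the pre-stimulus interval (baseline correction)."""
    pre = int(pre_ms * fs / 1000)     # 100 samples before onset
    post = int(post_ms * fs / 1000)   # 500 samples after onset
    epoch = channel[onset_idx - pre : onset_idx + post]
    baseline = sum(epoch[:pre]) / pre
    return [v - baseline for v in epoch]

# toy continuous signal: constant 3 uV offset plus a post-stimulus deflection
signal = [3.0] * 1000
for i in range(500, 600):
    signal[i] += 5.0

epoch = extract_epoch(signal, onset_idx=500)
```

After baseline correction the pre-stimulus samples average to zero, so any post-stimulus deflection is measured relative to the pre-stimulus level.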
</sec>
<sec>
<title>ERP Analysis</title>
<p>This study analyzed the N1 and N450 ERP components. The electrodes for analysis were chosen according to the ERP topographical distribution and previous studies (<xref ref-type="bibr" rid="B56">56</xref>, <xref ref-type="bibr" rid="B57">57</xref>). Specifically, the N1 (100&#x02013;200 ms) was analyzed at F3, F4, and Fz, and the N450 (330&#x02013;400 ms) at C3, C4, Cz, P3, P4, and Pz; both were measured as mean amplitudes. The time windows were determined through visual inspection of the grand-averaged ERPs.</p>
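<p>Quantifying a component as the mean amplitude within a time window, averaged across trials and the chosen electrodes, amounts to a simple reduction over the epoch. A Python sketch under the same &#x02212;200 to 1000 ms, 500 Hz epoch convention (the electrode data here are illustrative, not from the study):</p>

```python
FS = 500            # sampling rate (Hz)
EPOCH_START = -200  # epoch start in ms relative to stimulus onset

def mean_amplitude(epochs_by_electrode, electrodes, win_ms):
    """Mean voltage inside win_ms = (start, end), averaged across all
    trials and all listed electrodes. Epochs start at EPOCH_START ms."""
    i0 = int((win_ms[0] - EPOCH_START) * FS / 1000)
    i1 = int((win_ms[1] - EPOCH_START) * FS / 1000)
    vals = []
    for el in electrodes:
        for epoch in epochs_by_electrode[el]:
            seg = epoch[i0:i1]
            vals.append(sum(seg) / len(seg))
    return sum(vals) / len(vals)

# toy data: one flat -6 uV epoch (600 samples) per frontal electrode
epochs = {el: [[-6.0] * 600] for el in ("F3", "F4", "Fz")}
n1 = mean_amplitude(epochs, ("F3", "F4", "Fz"), (100, 200))
```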
</sec>
<sec>
<title>Time&#x02013;Frequency Analyses</title>
<p>An estimate of the oscillatory power as a function of time and frequency (time&#x02013;frequency representation) was obtained from single-trial EEG epochs using the continuous wavelet transform (CWT) (<xref ref-type="bibr" rid="B55">55</xref>). The time&#x02013;frequency representations were explored between 1 Hz and 30 Hz in steps of 0.29 Hz. Epochs were extracted from 400 ms pre-stimulus to 1000 ms post-stimulus. To avoid edge effects in the CWT, the pre-stimulus interval (&#x02212;400 ms to &#x02212;200 ms) was used as the baseline. Based on average condition contrast maps and previous studies (<xref ref-type="bibr" rid="B58">58</xref>), two clusters were tested: 4&#x02013;7 Hz at 200&#x02013;400 ms for theta (F3, F4, Fz, C3, C4, and Cz), and 8&#x02013;14 Hz at 600&#x02013;800 ms for alpha (P3, P4, and Pz). Each oscillatory component was quantified as the mean amplitude within these time windows for each participant.</p>
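<p>The core of the CWT is convolution of the signal with complex Morlet wavelets of varying center frequency. The study used the Letswave implementation; the toy Python sketch below (hypothetical names, an arbitrary 7-cycle wavelet) estimates power at a single frequency and time point to show the principle: a wavelet tuned to the signal&#x00027;s frequency yields much larger power than one tuned elsewhere.</p>

```python
import cmath, math

def morlet_power(signal, fs, freq, center_idx, n_cycles=7):
    """Squared magnitude of the dot product between the signal and a
    complex Morlet wavelet of n_cycles centered at center_idx."""
    sd = n_cycles / (2 * math.pi * freq)   # Gaussian SD in seconds
    half = int(3 * sd * fs)                # +/- 3 SD of support
    acc = 0j
    for k in range(-half, half + 1):
        t = k / fs
        gauss = math.exp(-(t * t) / (2 * sd * sd))
        acc += signal[center_idx + k] * gauss * cmath.exp(-2j * math.pi * freq * t)
    return abs(acc) ** 2

fs = 500
# 1 s of a pure 10 Hz (alpha-range) sine wave
sig = [math.sin(2 * math.pi * 10 * n / fs) for n in range(fs)]
p10 = morlet_power(sig, fs, 10, center_idx=fs // 2)  # wavelet at 10 Hz
p20 = morlet_power(sig, fs, 20, center_idx=fs // 2)  # wavelet at 20 Hz
```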
</sec>
<sec>
<title>Statistical Analysis</title>
<p>SPSS 20.0 was used to perform ANOVAs or chi-square tests to investigate whether the demographic factors (age and gender) differed significantly between groups (deaf children and hearing controls). Furthermore, repeated measures ANOVAs were conducted on the behavioral and ERP data with group (deaf children vs. hearing controls) as a between-subject factor, and stimulus type (congruent vs. incongruent), hemisphere (Hemi; EEG data only: left, midline, and right), and antero-posterior distribution (AP; EEG data only: frontal, central, parietal, and occipital) as within-subject factors. For all analyses in this study, <italic>p</italic>-values were corrected with the Greenhouse-Geisser correction when appropriate.</p>
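<p>As a simplified stand-in for the repeated measures ANOVA (not the SPSS model used in the paper), the within-subject conflict effect can be computed per participant and tested against zero with a paired t statistic; for a single within-subject factor with two levels this is equivalent to the main effect of condition. A Python sketch with made-up reaction times:</p>

```python
import math

def paired_t(cong, incong):
    """t statistic and df for per-subject differences
    (incongruent minus congruent reaction times)."""
    diffs = [i - c for c, i in zip(cong, incong)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n), n - 1

# illustrative RTs in ms: every subject is slower on incongruent trials
cong = [720, 750, 760, 745, 770, 735]
incong = [770, 805, 820, 790, 830, 780]
t, df = paired_t(cong, incong)
```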
</sec>
</sec>
<sec sec-type="results" id="s3">
<title>Results</title>
<sec>
<title>Behavioral Data</title>
<sec>
<title>Accuracy</title>
<p>The ANOVA showed a significant main effect of group on response accuracy (F<sub>1, 57</sub> = 11.705, <italic>p</italic> = 0.002, &#x003B7;<sup>2</sup> = 0.163), with overall lower accuracy in deaf children than in hearing controls (see <xref ref-type="table" rid="T2">Table 2</xref>), indicating that deaf children had difficulties suppressing irrelevant information and suffered from deficient cognitive control mechanisms (<xref ref-type="bibr" rid="B4">4</xref>, <xref ref-type="bibr" rid="B5">5</xref>, <xref ref-type="bibr" rid="B50">50</xref>). The main effect of condition was also significant (F<sub>1, 57</sub> = 63.094, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.525), with accuracy in the incongruent condition (0.872 &#x000B1; 0.010) significantly lower than in the congruent condition (0.927 &#x000B1; 0.009), indicating a conflict effect: the incongruent condition demanded more cognitive resources than the congruent condition (<xref ref-type="bibr" rid="B28">28</xref>, <xref ref-type="bibr" rid="B29">29</xref>, <xref ref-type="bibr" rid="B59">59</xref>).</p>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p>Mean accuracy and reaction time (M &#x000B1; SD) of deaf children and hearing controls, and results of repeated measures ANOVA for conditions.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th/>
<th valign="top" align="center"><bold>Congruent</bold></th>
<th valign="top" align="center"><bold>Incongruent</bold></th>
<th valign="top" align="center"><bold>F<sub><bold>GROUP</bold></sub> (<italic>p</italic>)</bold></th>
<th valign="top" align="center"><bold>F<sub><bold>CON</bold></sub> (<italic>p</italic>)</bold></th>
<th valign="top" align="center"><bold>F<sub><bold>CON&#x0002A;GROUP</bold></sub> (<italic>p</italic>)</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left"><bold>ACC</bold></td>
</tr>
<tr>
<td valign="top" align="left">Total (<italic>n</italic> = 59)</td>
<td valign="top" align="center">0.927 &#x000B1; 0.009</td>
<td valign="top" align="center">0.872 &#x000B1; 0.010</td>
<td valign="top" align="center">11.075 (0.002<xref ref-type="table-fn" rid="TN1"><sup>&#x0002A;&#x0002A;&#x0002A;</sup></xref>)</td>
<td valign="top" align="center">63.094 (0.000<xref ref-type="table-fn" rid="TN1"><sup>&#x0002A;&#x0002A;&#x0002A;</sup></xref>)</td>
<td valign="top" align="center">1.696 (0.198)</td>
</tr>
<tr>
<td valign="top" align="left">Deaf Children (<italic>n</italic> = 31)</td>
<td valign="top" align="center">0.893 &#x000B1; 0.012</td>
<td valign="top" align="center">0.847 &#x000B1; 0.014</td>
<td/>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">Controls (<italic>n</italic> = 28)</td>
<td valign="top" align="center">0.961 &#x000B1; 0.013</td>
<td valign="top" align="center">0.897 &#x000B1; 0.014</td>
<td/>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left"><bold>Reaction Time (ms)</bold></td>
</tr>
<tr>
<td valign="top" align="left">Total (<italic>n</italic> = 59)</td>
<td valign="top" align="center">759.818 &#x000B1; 14.833</td>
<td valign="top" align="center">816.231 &#x000B1; 14.481</td>
<td valign="top" align="center">3.395 (0.071)</td>
<td valign="top" align="center">135.774 (0.000<xref ref-type="table-fn" rid="TN1"><sup>&#x0002A;&#x0002A;&#x0002A;</sup></xref>)</td>
<td valign="top" align="center">24.981 (0.000<xref ref-type="table-fn" rid="TN1"><sup>&#x0002A;&#x0002A;&#x0002A;</sup></xref>)</td>
</tr>
<tr>
<td valign="top" align="left">Deaf Children (<italic>n</italic> = 31)</td>
<td valign="top" align="center">745.278 &#x000B1; 20.437</td>
<td valign="top" align="center">777.492 &#x000B1; 19.952</td>
<td valign="top" align="center">23.325 (0.000<xref ref-type="table-fn" rid="TN1"><sup>&#x0002A;&#x0002A;&#x0002A;</sup></xref>)</td>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">Controls (<italic>n</italic> = 28)</td>
<td valign="top" align="center">774.358 &#x000B1; 21.504</td>
<td valign="top" align="center">854.969 &#x000B1; 20.994</td>
<td valign="top" align="center">131.910 (0.000<xref ref-type="table-fn" rid="TN1"><sup>&#x0002A;&#x0002A;&#x0002A;</sup></xref>)</td>
<td/>
<td/>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn id="TN1">
<label>&#x0002A;&#x0002A;&#x0002A;</label>
<p><italic>p &#x0003C; 0.01</italic>.</p></fn>
</table-wrap-foot>
</table-wrap></sec>
<sec>
<title>Reaction Time</title>
<p>A significant main effect of condition was found: the congruent condition (759.818 &#x000B1; 14.833 ms) was significantly faster than the incongruent condition (816.231 &#x000B1; 14.481 ms) (F<sub>1, 57</sub> = 135.774, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.704). This is consistent with the response accuracy results, indicating that the conflict condition requires more cognitive resources, so participants need longer reaction times to make judgments. The interaction between group and condition was also significant (F<sub>1, 57</sub> = 24.981, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.305). Further analysis indicated a significant interference effect, with the congruent condition faster than the incongruent condition both in deaf children (745.278 &#x000B1; 20.437 vs. 777.492 &#x000B1; 19.952 ms; F<sub>1, 57</sub> = 23.325, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.290) and in hearing controls (774.358 &#x000B1; 21.504 vs. 854.969 &#x000B1; 20.994 ms; F<sub>1, 57</sub> = 131.910, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.698).</p>
</sec>
<sec>
<title>ERP Amplitude Analysis</title>
<p>For reasons of space, only significant results are reported in this section. <xref ref-type="fig" rid="F3">Figure 3</xref> shows the grand-averaged waveforms of deaf children and hearing controls. The means and SEs for each component are displayed in <xref ref-type="table" rid="T3">Table 3</xref>, and the means and SDs of the N1 and N450 amplitudes at each electrode are displayed in <xref ref-type="table" rid="T4">Table 4</xref>.</p>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p>Waveforms of N1 and N450 components in emotional stroop task of deaf children and hearing controls.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpsyt-13-897595-g0003.tif"/>
</fig>
<table-wrap position="float" id="T3">
<label>Table 3</label>
<caption><p>ERP amplitudes (M &#x000B1; SD) of deaf children and hearing controls, and results of repeated measures ANOVA for conditions.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th/>
<th valign="top" align="center"><bold>Congruent</bold></th>
<th valign="top" align="center"><bold>Incongruent</bold></th>
<th valign="top" align="center"><bold>F<sub><bold>CON</bold></sub> (<italic>p</italic>)</bold></th>
<th valign="top" align="center"><bold>F<sub><bold>GROUP</bold></sub>(<italic>p</italic>)</bold></th>
<th valign="top" align="center"><bold>F<sub><bold>CON&#x0002A;GROUP</bold></sub> (<italic>p</italic>)</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left"><bold>N1 amplitude</bold></td>
</tr>
<tr>
<td valign="top" align="left">Total (<italic>n</italic> = 59)</td>
<td valign="top" align="center">&#x02212;6.898 &#x000B1; 0.574</td>
<td valign="top" align="center">&#x02212;6.776 &#x000B1; 0.496</td>
<td valign="top" align="center">0.118 (0.732)</td>
<td valign="top" align="center">4.517 (0.038<xref ref-type="table-fn" rid="TN2"><sup>&#x0002A;</sup></xref>)</td>
<td valign="top" align="center">0.171 (0.681)</td>
</tr>
<tr>
<td valign="top" align="left">Deaf Children (<italic>n</italic> = 31)</td>
<td valign="top" align="center">&#x02212;5.896 &#x000B1; 0.791</td>
<td valign="top" align="center">&#x02212;5.627 &#x000B1; 0.683</td>
<td/>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">Controls (<italic>n</italic> = 28)</td>
<td valign="top" align="center">&#x02212;7.900 &#x000B1; 0.832</td>
<td valign="top" align="center">&#x02212;7.925 &#x000B1; 0.719</td>
<td/>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left"><bold>N450 amplitude</bold></td>
</tr>
<tr>
<td valign="top" align="left">Total (<italic>n</italic> = 59)</td>
<td valign="top" align="center">7.285 &#x000B1; 0.938</td>
<td valign="top" align="center">6.319 &#x000B1; 0.945</td>
<td valign="top" align="center">5.090 (0.028<xref ref-type="table-fn" rid="TN2"><sup>&#x0002A;</sup></xref>)</td>
<td valign="top" align="center">5.883 (0.018<xref ref-type="table-fn" rid="TN2"><sup>&#x0002A;</sup></xref>)</td>
<td valign="top" align="center">0.237 (0.629)</td>
</tr>
<tr>
<td valign="top" align="left">Deaf Children (<italic>n</italic> = 31)</td>
<td valign="top" align="center">4.957 &#x000B1; 1.293</td>
<td valign="top" align="center">4.199 &#x000B1; 1.302</td>
<td/>
<td/>
<td/>
</tr>
<tr>
<td valign="top" align="left">Controls (n=28)</td>
<td valign="top" align="center">9.613 &#x000B1;1.360</td>
<td valign="top" align="center">8.439 &#x000B1;1.370</td>
<td/>
<td/>
<td/>
</tr>
</tbody>
</table>
<table-wrap-foot>
<fn id="TN2">
<label>&#x0002A;</label>
<p><italic>p &#x0003C; 0.05</italic>.</p></fn>
</table-wrap-foot>
</table-wrap>
<table-wrap position="float" id="T4">
<label>Table 4</label>
<caption><p>The average amplitudes (&#x003BC;V) of the ERP components (M &#x000B1; SD) between deaf children and hearing controls.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th valign="top" align="left"><bold>ERP components</bold></th>
<th valign="top" align="left"><bold>Stimulus type</bold></th>
<th valign="top" align="left"><bold>Electrode point</bold></th>
<th valign="top" align="center" colspan="2" style="border-bottom: thin solid #000000;"><bold>Average amplitudes (&#x003BC;V)</bold></th>
</tr>
<tr>
<th/>
<th/>
<th/>
<th valign="top" align="center"><bold>Deaf children</bold></th>
<th valign="top" align="center"><bold>Hearing controls</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">N1</td>
<td valign="top" align="left">congruent</td>
<td valign="top" align="left">F3</td>
<td valign="top" align="center">&#x02212;6.066 &#x000B1; 4.142</td>
<td valign="top" align="center">&#x02212;7.687 &#x000B1; 4.475</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">Fz</td>
<td valign="top" align="center">&#x02212;5.703 &#x000B1; 4.479</td>
<td valign="top" align="center">&#x02212;8.360 &#x000B1; 4.716</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">F4</td>
<td valign="top" align="center">&#x02212;5.919 &#x000B1; 4.569</td>
<td valign="top" align="center">&#x02212;7.654 &#x000B1; 4.570</td>
</tr>
<tr>
<td/>
<td valign="top" align="left">incongruent</td>
<td valign="top" align="left">F3</td>
<td valign="top" align="center">&#x02212;5.710 &#x000B1; 3.381</td>
<td valign="top" align="center">&#x02212;7.682 &#x000B1; 3.868</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">Fz</td>
<td valign="top" align="center">&#x02212;5.681 &#x000B1; 3.492</td>
<td valign="top" align="center">&#x02212;8.360 &#x000B1; 4.257</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">F4</td>
<td valign="top" align="center">&#x02212;5.489 &#x000B1; 4.195</td>
<td valign="top" align="center">&#x02212;7.733 &#x000B1; 4.257</td>
</tr>
<tr>
<td valign="top" align="left">N450</td>
<td valign="top" align="left">congruent</td>
<td valign="top" align="left">C3</td>
<td valign="top" align="center">&#x02212;0.395 &#x000B1; 7.057</td>
<td valign="top" align="center">3.522 &#x000B1; 7.800</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">Cz</td>
<td valign="top" align="center">1.233 &#x000B1; 7.891</td>
<td valign="top" align="center">3.107 &#x000B1; 10.568</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">C4</td>
<td valign="top" align="center">1.171 &#x000B1; 8.083</td>
<td valign="top" align="center">3.162 &#x000B1; 8.849</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">P3</td>
<td valign="top" align="center">8.555 &#x000B1; 7.166</td>
<td valign="top" align="center">15.931 &#x000B1; 8.604</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">Pz</td>
<td valign="top" align="center">9.506 &#x000B1; 6.065</td>
<td valign="top" align="center">15.075 &#x000B1; 9.569</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">P4</td>
<td valign="top" align="center">9.670 &#x000B1; 7.649</td>
<td valign="top" align="center">16.884 &#x000B1; 9.240</td>
</tr>
<tr>
<td/>
<td valign="top" align="left">incongruent</td>
<td valign="top" align="left">C3</td>
<td valign="top" align="center">&#x02212;0.818 &#x000B1; 7.630</td>
<td valign="top" align="center">2.660 &#x000B1; 7.194</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">Cz</td>
<td valign="top" align="center">0.182 &#x000B1; 8.320</td>
<td valign="top" align="center">2.191 &#x000B1; 10.376</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">C4</td>
<td valign="top" align="center">0.466 &#x000B1; 8.285</td>
<td valign="top" align="center">2.414 &#x000B1; 8.185</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">P3</td>
<td valign="top" align="center">8.075 &#x000B1; 7.098</td>
<td valign="top" align="center">14.529 &#x000B1; 7.968</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">Pz</td>
<td valign="top" align="center">8.449 &#x000B1; 7.165</td>
<td valign="top" align="center">13.514 &#x000B1; 9.028</td>
</tr>
<tr>
<td/>
<td/>
<td valign="top" align="left">P4</td>
<td valign="top" align="center">8.840 &#x000B1; 7.791</td>
<td valign="top" align="center">15.327 &#x000B1; 8.594</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec>
<title>N1</title>
<p>A repeated-measures analysis of variance (ANOVA) was applied with N1 amplitude as the dependent variable, with stimulus type (congruent and incongruent), hemisphere (Hemi: left, midline, and right), and anterior&#x02013;posterior site [AP: frontal (F) (electrodes: F3, Fz, F4), central (C) (electrodes: C3, Cz, C4), and parietal (P) (electrodes: P3, Pz, P4)] as within-subject factors, and group (deaf children vs. hearing controls) as a between-subject factor. Results showed a significant main effect of group on N1 amplitude (F<sub>1, 57</sub> = 4.517, <italic>p</italic> = 0.038, &#x003B7;<sup>2</sup> = 0.073), with hearing controls (&#x02212;7.913 &#x000B1; 0.734 &#x003BC;V) eliciting an overall larger N1 than deaf children (&#x02212;5.761 &#x000B1; 0.697 &#x003BC;V), suggesting that the smaller N1 amplitudes are a neurophysiological reflection of deficient inhibition in the emotional Stroop task in deaf children (<xref ref-type="bibr" rid="B60">60</xref>). The Group &#x000D7; Hemisphere interaction was also significant for N1 amplitude (F<sub>2, 56</sub> = 3.197, <italic>p</italic> = 0.046, &#x003B7;<sup>2</sup> = 0.053). Further analysis indicated that the midline site (Fz) elicited larger N1 activation in hearing controls (&#x02212;8.360 &#x000B1; 0.756 &#x003BC;V) than in deaf children (&#x02212;5.692 &#x000B1; 0.718 &#x003BC;V), while there was no significant group difference at the right (F4) or left (F3) sites.</p>
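The between-group contrast at a single site can be sketched as follows. This is a hedged illustration, not the authors' pipeline: the simulated amplitudes only loosely match the Fz means and SDs in Table 4, and the design is reduced to a two-group one-way ANOVA computed by hand, which for two groups equals the squared pooled-variance t.

```python
import numpy as np

def oneway_f(g1, g2):
    """One-way between-subjects ANOVA F for two groups, df = (1, n1 + n2 - 2)."""
    grand = np.concatenate([g1, g2]).mean()
    ss_between = len(g1) * (g1.mean() - grand) ** 2 + len(g2) * (g2.mean() - grand) ** 2
    ss_within = ((g1 - g1.mean()) ** 2).sum() + ((g2 - g2.mean()) ** 2).sum()
    df_within = len(g1) + len(g2) - 2
    return ss_between / (ss_within / df_within), df_within

rng = np.random.default_rng(2)
# Hypothetical per-subject N1 amplitudes at Fz (in uV), loosely following Table 4
deaf = rng.normal(-5.7, 4.5, 31)
controls = rng.normal(-8.4, 4.7, 28)
f_val, df_w = oneway_f(deaf, controls)
print(f"F(1, {df_w}) = {f_val:.2f}")
```

The full analysis reported here additionally crosses the within-subject factors (condition, hemisphere, AP) in a mixed design; the sketch isolates only the group comparison.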
</sec>
<sec>
<title>N450</title>
<p>Analysis of N450 amplitude showed a significant main effect of group (F<sub>1, 57</sub> = 5.883, <italic>p</italic> = 0.018, &#x003B7;<sup>2</sup> = 0.094), with deaf children (4.578 &#x000B1; 1.263 &#x003BC;V) eliciting an overall larger (more negative-going) N450 than hearing controls (9.026 &#x000B1; 1.329 &#x003BC;V). There was also a significant main effect of condition (F<sub>1, 57</sub> = 5.090, <italic>p</italic> = 0.028, &#x003B7;<sup>2</sup> = 0.082), with the incongruent condition (6.319 &#x000B1; 0.945 &#x003BC;V) eliciting a larger N450 than the congruent condition (7.285 &#x000B1; 0.938 &#x003BC;V). The N450 amplitude results were consistent with the behavioral outcomes, reflecting deficits in inhibitory control in deaf children and indicating that the incongruent condition required more effort to complete. The results also showed a significant main effect of AP (F<sub>1, 57</sub> = 203.155, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.781), with the central area (1.574 &#x000B1; 1.031 &#x003BC;V) eliciting a larger N450 than the parietal area (12.030 &#x000B1; 0.942 &#x003BC;V). There was also a significant Group &#x000D7; AP interaction (F<sub>1, 57</sub> = 6.798, <italic>p</italic> = 0.012, &#x003B7;<sup>2</sup> = 0.107). Further analysis indicated that the central area elicited a larger N450 than the parietal area in both deaf children (central: 0.306 &#x000B1; 1.420 &#x003BC;V; parietal: 8.849 &#x000B1; 1.298 &#x003BC;V; F<sub>1, 57</sub> = 71.447, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.556) and hearing controls (central: 2.842 &#x000B1; 1.495 &#x003BC;V; parietal: 15.210 &#x000B1; 1.366 &#x003BC;V; F<sub>1, 57</sub> = 135.262, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.704).</p>
</sec>
</sec>
<sec>
<title>Time-Frequency Results</title>
<sec>
<title>Theta Activity</title>
<p><xref ref-type="fig" rid="F4">Figure 4</xref> presented the Time-frequency (TF) analysis results. Theta synchronization (200&#x02013;400 ms) showed significant main effect of condition (F<sub>1, 57</sub> = 5.022, <italic>p</italic> = 0.029, &#x003B7;<sup>2</sup> = 0.081), with incongruent condition (7.135 &#x000B1; 0.467) eliciting larger theta synchronization than congruent condition (6.624 &#x000B1; 0.460). Theta synchronization also showed significant main effect of group (F<sub>1, 57</sub> = 5.348, <italic>p</italic> = 0.024, &#x003B7;<sup>2</sup> = 0.086), with deaf children (7.920 &#x000B1; 0.620) eliciting larger theta synchronization than hearing controls (5.840 &#x000B1; 0.652), which were consistent with the results of N450. There also was a significant main effect of emotional background (F<sub>1, 57</sub> = 7.659, <italic>p</italic> = 0.008, &#x003B7;<sup>2</sup> = 0.118), with fear background (7.189 &#x000B1; 0.458) eliciting larger theta synchronization than happy background (6.571 &#x000B1; 0.469). The results also showed significant main effect of AP (F<sub>1, 57</sub> = 44.420, <italic>p</italic> = 0.000, &#x003B7;<sup>2</sup> = 0.438) and hemisphere (F<sub>2, 56</sub> = 12.970, <italic>p</italic> = 0.000, &#x003B7;<sup>2</sup> = 0.185), with frontal area (7.674 &#x000B1; 0.490) eliciting larger theta synchronization than central area (6.085 &#x000B1; 0.439), midline region (Fz, Cz) (7.408 &#x000B1; 0.494) eliciting larger theta synchronization than left hemispheres (F3, C3) (6.857 &#x000B1; 0.476) and right hemispheres (F4, C4) (5.840 &#x000B1; 0.652). 
A significant interaction was found between the emotional background and group (F<sub>1, 57</sub> = 7.832, <italic>p</italic> = 0.007, &#x003B7;<sup>2</sup> = 0.121), further analysis indicated that deaf children (8.541 &#x000B1; 0.631) elicited larger theta synchronization in fear background compared to hearing controls (5.836 &#x000B1; 0.663), while there was no significant group difference in response to happy background.</p>
<fig id="F4" position="float">
<label>Figure 4</label>
<caption><p>Group-averaged time-frequency spectrograms during facial emotion recognition. Time (in ms) is denoted on the x-axis, with 0 ms defined as the onset of the stimuli. Frequency (in Hz) is shown on the y-axis. (A) represents the theta band (200&#x02013;400 ms), and (B) represents the alpha band (600&#x02013;800 ms).</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fpsyt-13-897595-g0004.tif"/>
</fig></sec>
<sec>
<title>Alpha Activity</title>
<p>Alpha desynchronization (600&#x02013;800 ms) showed a significant main effect of group (F<sub>1, 57</sub> = 6.868, <italic>p</italic> = 0.011, &#x003B7;<sup>2</sup> = 0.108), with hearing controls (&#x02212;6.376 &#x000B1; 1.003) eliciting larger alpha desynchronization than deaf children (&#x02212;2.750 &#x000B1; 0.953), which was in line with the N1 results. Alpha desynchronization also showed a significant main effect of emotional background (F<sub>1, 57</sub> = 6.939, <italic>p</italic> = 0.011, &#x003B7;<sup>2</sup> = 0.109), with the fear background (&#x02212;4.882 &#x000B1; 0.773) eliciting larger alpha desynchronization than the happy background (&#x02212;4.244 &#x000B1; 0.624). There was also a significant main effect of hemisphere (F<sub>2, 56</sub> = 10.817, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.160), with the midline region (Pz) (&#x02212;3.665 &#x000B1; 0.540) eliciting smaller alpha desynchronization than the left (P3) (&#x02212;4.823 &#x000B1; 0.798) and right (P4) (&#x02212;5.190 &#x000B1; 0.790) hemispheres.</p>
</sec>
</sec>
</sec>
<sec sec-type="discussion" id="s4">
<title>Discussion</title>
<p>The present study explored the emotional interference effect among deaf children. Behavioral results showed that the main effect of condition was significant for both the accuracy and the reaction time data, indicating a conflict effect whereby the incongruent condition demanded more cognitive resources than the congruent condition (<xref ref-type="bibr" rid="B28">28</xref>, <xref ref-type="bibr" rid="B29">29</xref>, <xref ref-type="bibr" rid="B59">59</xref>). In addition, deaf children demonstrated a significantly lower accuracy rate in the emotional Stroop task than hearing controls, consistent with previous findings that deaf children have difficulty suppressing irrelevant information and suffer from deficient cognitive control mechanisms (<xref ref-type="bibr" rid="B4">4</xref>, <xref ref-type="bibr" rid="B5">5</xref>, <xref ref-type="bibr" rid="B50">50</xref>).</p>
<p>These behavioral results were further explored through ERP analysis. Deaf children showed diminished activation in the early emotional interference processing components compared to hearing controls. Given that the N1 component reflects attentional focus on the target and a discrimination process within the focus of attention (<xref ref-type="bibr" rid="B31">31</xref>, <xref ref-type="bibr" rid="B60">60</xref>), the smaller N1 amplitudes suggest two important points. First, although the emotional Stroop task required participants to concentrate on the facial expression in the picture, the deaf children paid less attention to this task-relevant information because of interference from the meaning of the words, resulting in a weaker attentional allocation to the target. Second, the deaf children also automatically allocated attentional resources to the task-irrelevant information (the words in the picture); consequently, the attentional resources available for completing the task were deficient, prolonging the time needed to complete the task or degrading performance. The smaller N1 amplitudes might therefore be correlated with a slower response to emotional interference stimuli, suggesting that they are a neurophysiological reflection of deficient inhibition in the emotional Stroop task (<xref ref-type="bibr" rid="B61">61</xref>). This finding is in accordance with evidence of similar alterations during Stroop tasks in individuals with schizophrenia, amblyopia, obsessive-compulsive disorder, or depression (<xref ref-type="bibr" rid="B31">31</xref>, <xref ref-type="bibr" rid="B38">38</xref>, <xref ref-type="bibr" rid="B62">62</xref>, <xref ref-type="bibr" rid="B63">63</xref>).</p>
<p>In contrast to N1, deaf children showed enhanced N450 activation compared to hearing controls. According to previous studies, the N450 is a valid index of conflict monitoring in emotional conflict control tasks and shows a larger negative amplitude in the incongruent condition than in the congruent condition (<xref ref-type="bibr" rid="B34">34</xref>&#x02013;<xref ref-type="bibr" rid="B37">37</xref>, <xref ref-type="bibr" rid="B64">64</xref>). The anterior cingulate cortex (ACC) and the dorsolateral prefrontal cortex (DLPFC) have been shown to be components of a neural network that plays a critical role in completing tasks requiring self-monitoring and inhibition (<xref ref-type="bibr" rid="B21">21</xref>). The enhanced N450 amplitudes in deaf children suggest that they may require a greater recruitment of cognitive resources from the ACC and DLPFC to achieve the performance levels of hearing controls during the emotional Stroop task (<xref ref-type="bibr" rid="B22">22</xref>, <xref ref-type="bibr" rid="B64">64</xref>). In addition, compared to healthy controls, patients with major depressive disorder (MDD) showed enhanced N450 amplitude (<xref ref-type="bibr" rid="B31">31</xref>, <xref ref-type="bibr" rid="B65">65</xref>). Patients with attention deficit hyperactivity disorder (ADHD), nocturnal enuresis (NE), and developmental coordination disorder (DCD) showed increased activation in the bilateral temporoparietal junctions, bilateral dorsolateral prefrontal cortex, and bilateral anterior cingulate cortex (<xref ref-type="bibr" rid="B66">66</xref>&#x02013;<xref ref-type="bibr" rid="B68">68</xref>). The decreased N1 may be causally related to the increased N450, similar to previous findings in individuals with depression (<xref ref-type="bibr" rid="B31">31</xref>); reduced early attention may demand more effort later in the emotional Stroop task. Combined with the behavioral finding that the accuracy rate of deaf children in the emotional Stroop task was significantly lower than that of hearing controls, it appears that although deaf children made more effort and showed greater N450 activation than hearing controls, they did not reach the same performance level. This reveals an impairment of the cognitive monitoring function in deaf children and indicates that the cognitive resources available for monitoring conflicting information and inhibiting irrelevant information during inhibitory control are very limited in this group.</p>
<p>Besides the ERP analysis, the current study employed time&#x02013;frequency measures in the alpha and theta bands, which showed significantly greater alpha desynchronization in hearing controls and significantly greater theta synchronization in deaf children and in the incongruent condition. A number of studies have found that alpha oscillations are a reliable marker of attention (<xref ref-type="bibr" rid="B69">69</xref>, <xref ref-type="bibr" rid="B70">70</xref>), whereas the theta band is related to central executive and working memory processes and reflects the initiation of central executive processes that detect interference and inhibit responses to task-irrelevant features (<xref ref-type="bibr" rid="B40">40</xref>, <xref ref-type="bibr" rid="B42">42</xref>, <xref ref-type="bibr" rid="B43">43</xref>). The alpha desynchronization results paralleled the N1 findings. Previous studies have shown diminished alpha suppression in the predominantly inattentive (IA) subtype and absent alpha oscillations in ADHD (<xref ref-type="bibr" rid="B71">71</xref>, <xref ref-type="bibr" rid="B72">72</xref>). Therefore, the diminished alpha desynchronization might suggest an impaired attention allocation ability during emotional interference processing in deaf children. Theta synchronization also showed a conflict effect, in line with previous studies (<xref ref-type="bibr" rid="B73">73</xref>, <xref ref-type="bibr" rid="B74">74</xref>). In addition, consistent with the ERP results, deaf children showed enhanced theta synchronization, which might suggest an impaired cognitive monitoring function during their emotional interference processing.</p>
<p>Poor inhibitory control negatively affects a range of outcomes for deaf children, but after a period of training, children have shown great improvements in their inhibition skills (<xref ref-type="bibr" rid="B49">49</xref>, <xref ref-type="bibr" rid="B75">75</xref>). Therefore, training deaf children&#x00027;s inhibitory control of emotional interference may help them form a healthy personality and better integrate into society. Specifically, school education can strengthen this training through flexible and diverse classroom formats. Because deaf children are more inclined toward visual imagery when receiving information, they have stronger perception and memory for actions, expressions, and visualized pictures. Teachers can use multimedia animation and small games in teaching to engage students in various emotional situations, help them improve their emotional control, teach them ways of expressing emotion, enhance their ability to understand the behavioral intentions of others, and develop some ability to predict behavioral consequences, thereby improving their inhibitory control of emotional interference.</p>
</sec>
<sec id="s5">
<title>Contributions, Limitations, and Future Directions</title>
<p>Combining the ERP and TFA data analysis methods takes advantage of the high temporal resolution of ERP technology while greatly reducing noise from the spontaneous EEG, thereby revealing the time course of inhibitory control of emotional interference (<xref ref-type="bibr" rid="B76">76</xref>). Moreover, time-frequency analysis can simultaneously extract the temporal and spectral domains of event-related brain activity, improving the detectability of ERPs and allowing the characterization of non-phase-locked components that cannot be identified by traditional time-domain averaging in hearing and deaf children (<xref ref-type="bibr" rid="B77">77</xref>, <xref ref-type="bibr" rid="B78">78</xref>).</p>
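The distinction between phase-locked and non-phase-locked components can be made concrete with a minimal sketch (simulated data at a single probe frequency; not the authors' method). Total power averages single-trial spectral power, while evoked power is the power of the trial-averaged signal: oscillations with a random phase across trials survive in the former but cancel in the latter, which is exactly why time-domain averaging misses them.

```python
import numpy as np

srate, f = 250, 6.0
t = np.arange(0, 1, 1 / srate)
probe = np.exp(-2j * np.pi * f * t)           # Fourier probe at 6 Hz

def coef(trial):
    # Complex Fourier coefficient of one trial at the probe frequency
    return np.dot(trial, probe) / t.size

def total_power(trials):                      # survives phase jitter across trials
    return np.mean([abs(coef(tr)) ** 2 for tr in trials])

def evoked_power(trials):                     # only phase-locked activity survives
    return abs(np.mean([coef(tr) for tr in trials])) ** 2

rng = np.random.default_rng(4)
locked = [np.sin(2 * np.pi * f * t) for _ in range(50)]              # same phase
jittered = [np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi))    # random phase
            for _ in range(50)]
```

For the phase-locked trials, total and evoked power agree; for the jittered trials, total power is unchanged while evoked power collapses toward zero.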
<p>A limitation of this study is that it used only ERP technology, which has high temporal resolution but low spatial resolution. Future research should combine ERP with functional magnetic resonance imaging (fMRI) to measure the neural mechanisms of deaf children in the emotional Stroop task, exploiting both high temporal and high spatial resolution to explore the degree of impairment of inhibitory control of emotional interference and the corresponding impaired brain regions in deaf children.</p>
</sec>
<sec sec-type="conclusions" id="s6">
<title>Conclusion</title>
<p>In conclusion, the current study revealed major deficits in deaf children during emotion-related conflict control: compared with hearing controls, they showed overall worse behavioral performance, reduced N1 activation and alpha desynchronization, and enhanced N450 activation and theta synchronization, which might suggest impaired attention allocation and cognitive monitoring during the conflict detection process. These findings enrich the understanding of impaired inhibitory control of emotional interference in deaf children and can help educators take timely and appropriate intervention measures to promote the optimal neuropsychological development of deaf children.</p>
</sec>
<sec sec-type="data-availability" id="s7">
<title>Data Availability Statement</title>
<p>The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.</p>
</sec>
<sec id="s8">
<title>Ethics Statement</title>
<p>The studies involving human participants were reviewed and approved by Institutional Review Board of Henan University. Written informed consent to participate in this study was provided by the participants&#x00027; legal guardian/next of kin.</p>
</sec>
<sec id="s9">
<title>Author Contributions</title>
<p>JZ contributed to the conception of the study. HG contributed significantly to analysis and manuscript preparation. QC performed the data analyses and wrote the manuscript. XL helped revise the manuscript. QC contributed to the interpretation and discussion of the results of the analysis. All authors contributed to the article and approved the submitted version.</p>
</sec>
<sec sec-type="funding-information" id="s10">
<title>Funding</title>
<p>This work was supported by the Science and Technology Research Project of Henan Provincial Department of Science and Technology [212102310985], the Humanities and Social Science Research Project of Henan Provincial Department of Education [2020-ZDJH-026], and the Social Science Planning Project of Henan Province [2021CJY051].</p>
</sec>
<sec sec-type="COI-statement" id="conf1">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec sec-type="disclaimer" id="s11">
<title>Publisher&#x00027;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
</body>
<back>
<ref-list>
<title>References</title>
<ref id="B1">
<label>1.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lam</surname> <given-names>AMK</given-names></name> <name><surname>Stringer</surname> <given-names>P</given-names></name> <name><surname>Toizumi</surname> <given-names>M</given-names></name> <name><surname>Dang</surname> <given-names>DA</given-names></name></person-group>. <article-title>An international partnership analysis of a cohort of Vietnamese children with hearing impairment</article-title>. <source>Speech, Language Hearing.</source> (<year>2016</year>) <volume>19</volume>:<fpage>27</fpage>&#x02013;<lpage>35</lpage>. <pub-id pub-id-type="doi">10.1080/2050571X.2015.1108066</pub-id></citation>
</ref>
<ref id="B2">
<label>2.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cupples</surname> <given-names>L</given-names></name> <name><surname>Ching</surname> <given-names>TYC</given-names></name> <name><surname>Button</surname> <given-names>L</given-names></name> <name><surname>Leigh</surname> <given-names>G</given-names></name> <name><surname>Marnane</surname> <given-names>V</given-names></name> <name><surname>Whitfield</surname> <given-names>J</given-names></name> <etal/></person-group>. <article-title>Language and speech outcomes of children with hearing loss and additional disabilities: identifying the variables that influence performance at five years of age</article-title>. <source>Int J Audiol</source>. (<year>2016</year>) <volume>57</volume>:<fpage>1</fpage>&#x02013;<lpage>48</lpage>. <pub-id pub-id-type="doi">10.1080/14992027.2016.1228127</pub-id><pub-id pub-id-type="pmid">27630013</pub-id></citation></ref>
<ref id="B3">
<label>3.</label>
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Zupan</surname> <given-names>B</given-names></name></person-group>. <source>The Role of Audition in Audiovisual Perception of Speech and Emotion in Children with Hearing Loss</source>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>Springer</publisher-name> (<year>2013</year>).</citation>
</ref>
<ref id="B4">
<label>4.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Aubuchon</surname> <given-names>AM</given-names></name> <name><surname>Pisoni</surname> <given-names>DB</given-names></name></person-group>. <article-title>Verbal processing speed and executive functioning in long-term cochlear implant users</article-title>. <source>J Speech Lang Hear Res.</source> (<year>2015</year>) <volume>58</volume>:<fpage>151</fpage>&#x02013;<lpage>62</lpage>. <pub-id pub-id-type="doi">10.1044/2014_JSLHR-H-13-0259</pub-id><pub-id pub-id-type="pmid">25320961</pub-id></citation></ref>
<ref id="B5">
<label>5.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mani</surname> <given-names>RA</given-names></name></person-group>. <article-title>The executive functions&#x00027; deficits in children with hearing loss</article-title>. In: <source>6th International Congress on Child and Adolescent Psychiatry. Tabriz</source>: Tabtiz University of Medical Sciences (<year>2013</year>).</citation>
</ref>
<ref id="B6">
<label>6.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Temurova</surname> <given-names>G</given-names></name></person-group>. <article-title>Using hearing aids in determining the level of speech development in children with hearing impairment</article-title>. <source>Ment Enlightenment Sci-Methodol J</source>. (<year>2020</year>) <volume>1</volume>:<fpage>111</fpage>&#x02013;<lpage>7</lpage>.</citation>
</ref>
<ref id="B7">
<label>7.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bell</surname> <given-names>MA</given-names></name></person-group>. <article-title>Using EEG to study cognitive development: issues and practices</article-title>. <source>J Cogn Dev.</source> (<year>2012</year>) <volume>13</volume>:<fpage>281</fpage>&#x02013;<lpage>94</lpage>. <pub-id pub-id-type="doi">10.1080/15248372.2012.691143</pub-id><pub-id pub-id-type="pmid">23144592</pub-id></citation></ref>
<ref id="B8">
<label>8.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Luo</surname> <given-names>Y</given-names></name> <name><surname>Fu</surname> <given-names>Q</given-names></name> <name><surname>Xie</surname> <given-names>J</given-names></name> <name><surname>Qin</surname> <given-names>Y</given-names></name> <name><surname>Wu</surname> <given-names>G.</given-names></name> <name><surname>Liu</surname> <given-names>J</given-names></name> <name><surname>Ding</surname> <given-names>X</given-names></name></person-group>. <article-title>EEG-based emotion classification using spiking neural networks <italic>IEEE Access</italic></article-title>. (<year>2020</year>) <volume>8</volume>:<fpage>46007</fpage>&#x02013;<lpage>16</lpage>. <pub-id pub-id-type="doi">10.1109/ACCESS.2020.2978163</pub-id></citation>
</ref>
<ref id="B9">
<label>9.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sidera</surname> <given-names>F</given-names></name> <name><surname>Amad&#x000F3;</surname> <given-names>A</given-names></name></person-group>. <article-title>Influences on facial emotion recognition in deaf children</article-title>. <source>J Deaf Stud Deaf Educ.</source> (<year>2017</year>) <volume>22</volume>:<fpage>164</fpage>&#x02013;<lpage>77</lpage>. <pub-id pub-id-type="doi">10.1093/deafed/enw072</pub-id><pub-id pub-id-type="pmid">27927685</pub-id></citation></ref>
<ref id="B10">
<label>10.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gu</surname> <given-names>H</given-names></name> <name><surname>Chen</surname> <given-names>Q</given-names></name> <name><surname>Xing</surname> <given-names>XL</given-names></name> <name><surname>Zhao</surname> <given-names>JF</given-names></name></person-group>. <article-title>Facial emotion recognition in deaf children: evidence from event-related potentials and event-related spectral perturbation analysis</article-title>. <source>Neurosci Lett.</source> (<year>2019</year>) <volume>703</volume>:<fpage>198</fpage>&#x02013;<lpage>204</lpage>. <pub-id pub-id-type="doi">10.1016/j.neulet.2019.01.032</pub-id><pub-id pub-id-type="pmid">30677434</pub-id></citation></ref>
<ref id="B11">
<label>11.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Denmark</surname> <given-names>T</given-names></name> <name><surname>Atkinson</surname> <given-names>J</given-names></name> <name><surname>Campbell</surname> <given-names>R</given-names></name></person-group>. <article-title>How do typically developing deaf children and deaf children with autism spectrum disorder use the face when comprehending emotional facial expressions in British Sign Language?</article-title> <source>J Autism Dev Disord</source>. (<year>2014</year>) <volume>44</volume>:<fpage>2584</fpage>&#x02013;<lpage>92</lpage>. <pub-id pub-id-type="doi">10.1007/s10803-014-2130-x</pub-id><pub-id pub-id-type="pmid">24803370</pub-id></citation></ref>
<ref id="B12">
<label>12.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Peterson</surname> <given-names>CC</given-names></name></person-group>. <article-title>The mind behind the message: advancing theory-of-mind scales for typically developing children, and those with deafness, autism, or Asperger syndrome</article-title>. <source>Child Dev.</source> (<year>2012</year>) <volume>83</volume>:<fpage>469</fpage>&#x02013;<lpage>85</lpage>. <pub-id pub-id-type="doi">10.1111/j.1467-8624.2011.01728.x</pub-id><pub-id pub-id-type="pmid">22304467</pub-id></citation></ref>
<ref id="B13">
<label>13.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tsou</surname> <given-names>YT</given-names></name> <name><surname>Li</surname> <given-names>B</given-names></name> <name><surname>Kret</surname> <given-names>ME</given-names></name> <name><surname>Frijns</surname> <given-names>JH</given-names></name></person-group>. <article-title>Hearing status affects children&#x00027;s emotion understanding in dynamic social situations: An eye-tracking study</article-title>. <source>Ear Hear.</source> (<year>2021</year>) <volume>42</volume>:<fpage>1024</fpage>. <pub-id pub-id-type="doi">10.1097/AUD.0000000000000994</pub-id><pub-id pub-id-type="pmid">33369943</pub-id></citation></ref>
<ref id="B14">
<label>14.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tsou</surname> <given-names>YT</given-names></name> <name><surname>Li</surname> <given-names>B</given-names></name> <name><surname>Kret</surname> <given-names>ME</given-names></name> <name><surname>Sabino da Costa</surname> <given-names>I</given-names></name> <name><surname>Rieffe</surname> <given-names>C</given-names></name></person-group>. <article-title>Reading emotional faces in deaf and hard-of-hearing and typically hearing children</article-title>. <source>Emotion</source>. (<year>2020</year>). <pub-id pub-id-type="doi">10.1037/emo0000863</pub-id> [Epub ahead of print].<pub-id pub-id-type="pmid">33370143</pub-id></citation></ref>
<ref id="B15">
<label>15.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>Y</given-names></name> <name><surname>Su</surname> <given-names>Y</given-names></name></person-group>. <article-title>Facial expression recognition in children with cochlear implants and hearing aids</article-title>. <source>Front Psychol.</source> (<year>2016</year>) <volume>7</volume>:<fpage>1</fpage>&#x02013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2016.01989</pub-id><pub-id pub-id-type="pmid">28066306</pub-id></citation></ref>
<ref id="B16">
<label>16.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wiefferink</surname> <given-names>CH</given-names></name> <name><surname>Carolien</surname> <given-names>R</given-names></name> <name><surname>Lizet</surname> <given-names>K</given-names></name> <name><surname>Leo</surname> <given-names>DR</given-names></name></person-group>. <article-title>Emotion understanding in deaf children with a cochlear implant</article-title>. <source>J Deaf Stud Deaf Educ.</source> (<year>2013</year>) <volume>18</volume>:<fpage>175</fpage>&#x02013;<lpage>86</lpage>. <pub-id pub-id-type="doi">10.1093/deafed/ens042</pub-id><pub-id pub-id-type="pmid">23232770</pub-id></citation></ref>
<ref id="B17">
<label>17.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gray</surname> <given-names>JR</given-names></name></person-group>. <article-title>Integration of emotion and cognitive control</article-title>. <source>Curr Dir Psychol Sci.</source> (<year>2004</year>) <volume>13</volume>:<fpage>46</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1111/j.0963-7214.2004.00272.x</pub-id></citation>
</ref>
<ref id="B18">
<label>18.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gray</surname> <given-names>JR</given-names></name> <name><surname>Braver</surname> <given-names>TS</given-names></name></person-group>. <article-title>Integration of emotion and cognition in the lateral prefrontal cortex</article-title>. <source>Proc Natl Acad Sci U S A.</source> (<year>2002</year>) <volume>99</volume>:<fpage>4115</fpage>&#x02013;<lpage>20</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.062381899</pub-id><pub-id pub-id-type="pmid">11904454</pub-id></citation></ref>
<ref id="B19">
<label>19.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Battaglia</surname> <given-names>S</given-names></name> <name><surname>Serio</surname> <given-names>G</given-names></name> <name><surname>Scarpazza</surname> <given-names>C</given-names></name> <name><surname>D&#x00027;Ausilio</surname> <given-names>A</given-names></name> <name><surname>Borgomaneri</surname> <given-names>S</given-names></name></person-group>. <article-title>Frozen in (e) motion: How reactive motor inhibition is influenced by the emotional content of stimuli in healthy and psychiatric populations</article-title>. <source>Behav Res Ther.</source> (<year>2021</year>) <volume>146</volume>:<fpage>103963</fpage>. <pub-id pub-id-type="doi">10.1016/j.brat.2021.103963</pub-id><pub-id pub-id-type="pmid">34530318</pub-id></citation></ref>
<ref id="B20">
<label>20.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Borgomaneri</surname> <given-names>S</given-names></name> <name><surname>Serio</surname> <given-names>G</given-names></name></person-group>. <article-title>Please, don&#x00027;t do it! Fifteen years of progress of non-invasive brain stimulation in action inhibition</article-title>. <source>Cortex</source>. (<year>2020</year>) <volume>132</volume>:<fpage>404</fpage>&#x02013;<lpage>22</lpage>. <pub-id pub-id-type="doi">10.1016/j.cortex.2020.09.002</pub-id><pub-id pub-id-type="pmid">33045520</pub-id></citation>
<ref id="B21">
<label>21.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gruber</surname> <given-names>SA</given-names></name> <name><surname>Rogowska</surname> <given-names>J</given-names></name></person-group>. <article-title>Decreased activation of the anterior cingulate in bipolar patients: an fMRI study</article-title>. <source>J Affect Disord.</source> (<year>2004</year>) <volume>82</volume>:<fpage>191</fpage>&#x02013;<lpage>201</lpage>. <pub-id pub-id-type="doi">10.1016/j.jad.2003.10.010</pub-id><pub-id pub-id-type="pmid">15488247</pub-id></citation></ref>
<ref id="B22">
<label>22.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gu</surname> <given-names>H</given-names></name> <name><surname>Fan</surname> <given-names>RL</given-names></name> <name><surname>Zhao</surname> <given-names>JF</given-names></name> <name><surname>Chen</surname> <given-names>YN</given-names></name> <name><surname>Chen</surname> <given-names>Q</given-names></name></person-group>. <article-title>Inhibitory control of emotional interference in children with learning disorders: evidence from event-related potentials and event-related spectral perturbation analysis</article-title>. <source>Brain Res.</source> (<year>2019</year>) <volume>1718</volume>:<fpage>252</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2019.04.016</pub-id><pub-id pub-id-type="pmid">31004577</pub-id></citation></ref>
<ref id="B23">
<label>23.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kohn</surname> <given-names>N</given-names></name> <name><surname>Eickhoff</surname> <given-names>SB</given-names></name> <name><surname>Scheller</surname> <given-names>M</given-names></name> <name><surname>Laird</surname> <given-names>AR</given-names></name> <name><surname>Fox</surname> <given-names>PT</given-names></name> <name><surname>Habel</surname> <given-names>U</given-names></name></person-group>. <article-title>Neural network of cognitive emotion regulation - an ALE meta-analysis and MACM analysis</article-title>. <source>NeuroImage.</source> (<year>2014</year>) <volume>87</volume>:<fpage>345</fpage>&#x02013;<lpage>55</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2013.11.001</pub-id><pub-id pub-id-type="pmid">24220041</pub-id></citation></ref>
<ref id="B24">
<label>24.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Frank</surname> <given-names>DW</given-names></name> <name><surname>Dewitt</surname> <given-names>M</given-names></name> <name><surname>Hudgens-Haney</surname> <given-names>M</given-names></name> <name><surname>Schaeffer</surname> <given-names>DJ</given-names></name> <name><surname>Ball</surname> <given-names>BH</given-names></name> <name><surname>Schwarz</surname> <given-names>NF</given-names></name></person-group>. <article-title>Emotion regulation: quantitative meta-analysis of functional activation and deactivation</article-title>. <source>Neurosci Biobehav Rev</source>. (<year>2014</year>) <volume>45</volume>:<fpage>202</fpage>&#x02013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1016/j.neubiorev.2014.06.010</pub-id><pub-id pub-id-type="pmid">24984244</pub-id></citation></ref>
<ref id="B25">
<label>25.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Buhle</surname> <given-names>JT</given-names></name> <name><surname>Silvers</surname> <given-names>JA</given-names></name> <name><surname>Wager</surname> <given-names>TD</given-names></name> <name><surname>Lopez</surname> <given-names>R</given-names></name> <name><surname>Onyemekwu</surname> <given-names>C</given-names></name> <name><surname>Kober</surname> <given-names>H</given-names></name></person-group>. <article-title>Cognitive reappraisal of emotion: a meta-analysis of human neuroimaging studies</article-title>. <source>Cerebral Cortex.</source> (<year>2014</year>) <volume>24</volume>:<fpage>2981</fpage>&#x02013;<lpage>90</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bht154</pub-id><pub-id pub-id-type="pmid">23765157</pub-id></citation></ref>
<ref id="B26">
<label>26.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zilverstand</surname> <given-names>A</given-names></name> <name><surname>Parvaz</surname> <given-names>MA</given-names></name></person-group>. <article-title>Neuroimaging cognitive reappraisal in clinical populations to define neural targets for enhancing emotion regulation. a systematic review</article-title>. <source>Neuroimage</source>. (<year>2017</year>) <volume>151</volume>:<fpage>105</fpage>&#x02013;<lpage>116</lpage>. <pub-id pub-id-type="doi">10.1016/j.neuroimage.2016.06.009</pub-id><pub-id pub-id-type="pmid">27288319</pub-id></citation></ref>
<ref id="B27">
<label>27.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Egner</surname> <given-names>T</given-names></name> <name><surname>Etkin</surname> <given-names>A</given-names></name> <name><surname>Gale</surname> <given-names>S</given-names></name> <name><surname>Hirsch</surname> <given-names>J</given-names></name></person-group>. <article-title>Dissociable neural systems resolve conflict from emotional versus nonemotional distracters</article-title>. <source>Cerebral Cortex.</source> (<year>2008</year>) <volume>18</volume>:<fpage>1475</fpage>&#x02013;<lpage>84</lpage>. <pub-id pub-id-type="doi">10.1093/cercor/bhm179</pub-id><pub-id pub-id-type="pmid">17940084</pub-id></citation></ref>
<ref id="B28">
<label>28.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Xue</surname> <given-names>S</given-names></name> <name><surname>Li</surname> <given-names>Y</given-names></name> <name><surname>Kong</surname> <given-names>X</given-names></name> <name><surname>He</surname> <given-names>Q</given-names></name> <name><surname>Liu</surname> <given-names>J</given-names></name></person-group>. <article-title>The dissociable neural dynamics of cognitive conflict and emotional conflict control: an ERP study</article-title>. <source>Neurosci Lett.</source> (<year>2016</year>) <volume>619</volume>:<fpage>149</fpage>&#x02013;<lpage>54</lpage>. <pub-id pub-id-type="doi">10.1016/j.neulet.2016.03.020</pub-id><pub-id pub-id-type="pmid">26987720</pub-id></citation></ref>
<ref id="B29">
<label>29.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhao</surname> <given-names>X</given-names></name> <name><surname>Li</surname> <given-names>X</given-names></name></person-group>. <article-title>Influence of inhibitory tagging (IT) on emotional and cognitive conflict processing: evidence from event-related potentials</article-title>. <source>Neurosci Lett.</source> (<year>2017</year>) <volume>657</volume>:<fpage>120</fpage>&#x02013;<lpage>5</lpage>. <pub-id pub-id-type="doi">10.1016/j.neulet.2017.08.014</pub-id><pub-id pub-id-type="pmid">28797904</pub-id></citation></ref>
<ref id="B30">
<label>30.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chu</surname> <given-names>CH</given-names></name> <name><surname>Kramer</surname> <given-names>AF</given-names></name> <name><surname>Song</surname> <given-names>TF</given-names></name> <name><surname>Wu</surname> <given-names>CH</given-names></name> <name><surname>Hung</surname> <given-names>TM</given-names></name></person-group>. <article-title>Acute exercise and neurocognitive development in preadolescents and young adults: an ERP study</article-title>. <source>Neural Plast.</source> (<year>2017</year>) <volume>2017</volume>:<fpage>1</fpage>&#x02013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.1155/2017/2631909</pub-id><pub-id pub-id-type="pmid">29147585</pub-id></citation></ref>
<ref id="B31">
<label>31.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dai</surname> <given-names>Q</given-names></name></person-group>. <article-title>Deficient interference inhibition for negative stimuli in depression: an event-related potential study</article-title>. <source>Clin Neurophysiol.</source> (<year>2011</year>) <volume>122</volume>:<fpage>52</fpage>&#x02013;<lpage>61</lpage>. <pub-id pub-id-type="doi">10.1016/j.clinph.2010.05.025</pub-id><pub-id pub-id-type="pmid">20605107</pub-id></citation></ref>
<ref id="B32">
<label>32.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hajcak</surname> <given-names>G</given-names></name> <name><surname>MacNamara</surname> <given-names>A</given-names></name></person-group>. <article-title>Event-related potentials, emotion, and emotion regulation: an integrative review</article-title>. <source>Dev Neuropsychol.</source> (<year>2010</year>) <volume>35</volume>:<fpage>129</fpage>&#x02013;<lpage>55</lpage>. <pub-id pub-id-type="doi">10.1080/87565640903526504</pub-id><pub-id pub-id-type="pmid">20390599</pub-id></citation></ref>
<ref id="B33">
<label>33.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Giller</surname> <given-names>F</given-names></name> <name><surname>Aggensteiner</surname> <given-names>PM</given-names></name> <name><surname>Banaschewski</surname> <given-names>T</given-names></name> <name><surname>D&#x000F6;pfner</surname> <given-names>M</given-names></name> <name><surname>Brandeis</surname> <given-names>D</given-names></name> <name><surname>Roessner</surname> <given-names>V</given-names></name></person-group>. <article-title>Affective dysregulation in children is associated with difficulties in response control in emotional ambiguous situations</article-title>. <source>Biol Psychiatry Cogn Neurosci Neuroimaging</source>. (<year>2022</year>) <volume>7</volume>:<fpage>66</fpage>&#x02013;<lpage>75</lpage>. <pub-id pub-id-type="doi">10.1016/j.bpsc.2021.03.014</pub-id><pub-id pub-id-type="pmid">33857639</pub-id></citation></ref>
<ref id="B34">
<label>34.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Imbir</surname> <given-names>KK</given-names></name> <name><surname>Spustek</surname> <given-names>T</given-names></name> <name><surname>Duda</surname> <given-names>J</given-names></name> <name><surname>Bernatowicz</surname> <given-names>G</given-names></name> <name><surname>&#x0017B;ygierewicz</surname> <given-names>J</given-names></name></person-group>. <article-title>N450 and LPC event-related potential correlates of an emotional Stroop task with words differing in valence and emotional origin</article-title>. <source>Front Psychol.</source> (<year>2017</year>) <volume>8</volume>:<fpage>1</fpage>&#x02013;<lpage>14</lpage>. <pub-id pub-id-type="doi">10.3389/fpsyg.2017.00880</pub-id><pub-id pub-id-type="pmid">28611717</pub-id></citation></ref>
<ref id="B35">
<label>35.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Liotti</surname> <given-names>M</given-names></name> <name><surname>Woldorff</surname> <given-names>MG</given-names></name> <name><surname>Perez</surname> <given-names>R</given-names> <suffix>III</suffix></name></person-group>. <article-title>An ERP study of the temporal course of the Stroop color-word interference effect</article-title>. <source>Neuropsychologia.</source> (<year>2000</year>) <volume>38</volume>:<fpage>701</fpage>&#x02013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1016/S0028-3932(99)00106-2</pub-id><pub-id pub-id-type="pmid">10689046</pub-id></citation></ref>
<ref id="B36">
<label>36.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shen</surname> <given-names>YM</given-names></name> <name><surname>Xue</surname> <given-names>S</given-names></name> <name><surname>Wang</surname> <given-names>KC</given-names></name></person-group>. <article-title>Neural time course of emotional conflict control: an ERP study</article-title>. <source>Neurosci Lett.</source> (<year>2013</year>) <volume>541</volume>:<fpage>34</fpage>&#x02013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1016/j.neulet.2013.02.032</pub-id><pub-id pub-id-type="pmid">23454616</pub-id></citation></ref>
<ref id="B37">
<label>37.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Szucs</surname> <given-names>D</given-names></name></person-group>. <article-title>Functional definition of the N450 event-related brain potential marker of conflict processing: A numerical Stroop study</article-title>. <source>BMC Neurosci.</source> (<year>2012</year>) <volume>13</volume>:<fpage>35</fpage>&#x02013;<lpage>49</lpage>. <pub-id pub-id-type="doi">10.1186/1471-2202-13-35</pub-id><pub-id pub-id-type="pmid">22452924</pub-id></citation></ref>
<ref id="B38">
<label>38.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhou</surname> <given-names>A</given-names></name> <name><surname>Jiang</surname> <given-names>Y</given-names></name> <name><surname>Chen</surname> <given-names>J</given-names></name> <name><surname>Wei</surname> <given-names>J</given-names></name> <name><surname>Dang</surname> <given-names>B</given-names></name> <name><surname>Li</surname> <given-names>S</given-names></name></person-group>. <article-title>Neural mechanisms of selective attention in children with amblyopia</article-title>. <source>PLoS ONE.</source> (<year>2015</year>) <volume>10</volume>:<fpage>1</fpage>&#x02013;<lpage>17</lpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0125370</pub-id><pub-id pub-id-type="pmid">26067259</pub-id></citation></ref>
<ref id="B39">
<label>39.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Huang</surname> <given-names>Z</given-names></name></person-group>. <article-title>Processing of emotional information in working memory in major depressive disorder</article-title>. <source>Adv Psychol Sci</source>. (<year>2021</year>) <volume>29</volume>:<fpage>252</fpage>&#x02013;<lpage>67</lpage>. <pub-id pub-id-type="doi">10.3724/SP.J.1042.2021.00252</pub-id></citation>
</ref>
<ref id="B40">
<label>40.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ergen</surname> <given-names>M</given-names></name> <name><surname>Saban</surname> <given-names>S</given-names></name> <name><surname>Kirmizi-Alsan</surname> <given-names>E</given-names></name> <name><surname>Uslu</surname> <given-names>A</given-names></name> <name><surname>Keskin-Ergen</surname> <given-names>Y</given-names></name></person-group>. <article-title>Time&#x02013;frequency analysis of the event-related potentials associated with the Stroop test</article-title>. <source>Int J Psychophysiol</source>. (<year>2014</year>) <volume>94</volume>:<fpage>463</fpage>&#x02013;<lpage>72</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2014.08.177</pub-id><pub-id pub-id-type="pmid">25135670</pub-id></citation></ref>
<ref id="B41">
<label>41.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gu</surname> <given-names>H</given-names></name> <name><surname>Zhao</surname> <given-names>Q</given-names></name> <name><surname>Liu</surname> <given-names>J</given-names></name> <name><surname>Zhao</surname> <given-names>J</given-names></name> <name><surname>Ji</surname> <given-names>L</given-names></name> <name><surname>Chi</surname> <given-names>P</given-names></name></person-group>. <article-title>EEG oscillation evidences of altered resting-state brain activity in children orphaned by parental HIV/AIDS</article-title>. <source>AIDS Care.</source> (<year>2020</year>) <volume>32</volume>:<fpage>177</fpage>&#x02013;<lpage>82</lpage>. <pub-id pub-id-type="doi">10.1080/09540121.2020.1739211</pub-id><pub-id pub-id-type="pmid">32168993</pub-id></citation></ref>
<ref id="B42">
<label>42.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sauseng</surname> <given-names>P</given-names></name> <name><surname>Klimesch</surname> <given-names>W</given-names></name> <name><surname>Schabus</surname> <given-names>M</given-names></name></person-group>. <article-title>Fronto-parietal EEG coherence in theta and upper alpha reflect central executive functions of working memory</article-title>. <source>Int J Psychophysiol</source>. (<year>2005</year>) <volume>57</volume>:<fpage>97</fpage>&#x02013;<lpage>103</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2005.03.018</pub-id><pub-id pub-id-type="pmid">15967528</pub-id></citation></ref>
<ref id="B43">
<label>43.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tesche</surname> <given-names>CD</given-names></name></person-group>. <article-title>Theta oscillations index human hippocampal activation during a working memory task</article-title>. <source>Proc Natl Acad Sci U S A.</source> (<year>2000</year>) <volume>97</volume>:<fpage>919</fpage>&#x02013;<lpage>24</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.97.2.919</pub-id><pub-id pub-id-type="pmid">10639180</pub-id></citation></ref>
<ref id="B44">
<label>44.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Compton</surname> <given-names>RJ</given-names></name> <name><surname>Arnstein</surname> <given-names>D</given-names></name> <name><surname>Freedman</surname> <given-names>G</given-names></name> <name><surname>Dainer-Best</surname> <given-names>J</given-names></name></person-group>. <article-title>Cognitive control in the intertrial interval: evidence from EEG alpha power</article-title>. <source>Psychophysiology.</source> (<year>2011</year>) <volume>48</volume>:<fpage>583</fpage>&#x02013;<lpage>90</lpage>. <pub-id pub-id-type="doi">10.1111/j.1469-8986.2010.01124.x</pub-id><pub-id pub-id-type="pmid">20840195</pub-id></citation></ref>
<ref id="B45">
<label>45.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jiang</surname> <given-names>J</given-names></name> <name><surname>Zhang</surname> <given-names>Q</given-names></name></person-group>. <article-title>EEG neural oscillatory dynamics reveal semantic and response conflict at difference levels of conflict awareness</article-title>. <source>Sci Rep.</source> (<year>2015</year>) <volume>5</volume>:<fpage>1</fpage>&#x02013;<lpage>12</lpage>. <pub-id pub-id-type="doi">10.1038/srep12008</pub-id><pub-id pub-id-type="pmid">26169473</pub-id></citation></ref>
<ref id="B46">
<label>46.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kamaradova</surname> <given-names>D</given-names></name> <name><surname>Hajda</surname> <given-names>M</given-names></name> <name><surname>Prasko</surname> <given-names>J</given-names></name> <name><surname>Taborsky</surname> <given-names>J</given-names></name> <name><surname>Grambal</surname> <given-names>A</given-names></name> <name><surname>Latalova</surname> <given-names>K</given-names></name> <etal/></person-group>. <article-title>Cognitive deficits in patients with obsessive-compulsive disorder - electroencephalography correlates</article-title>. <source>Neuropsychiatr Dis Treat.</source> (<year>2016</year>) <volume>12</volume>:<fpage>1119</fpage>&#x02013;<lpage>25</lpage>. <pub-id pub-id-type="doi">10.2147/NDT.S93040</pub-id><pub-id pub-id-type="pmid">27226716</pub-id></citation></ref>
<ref id="B47">
<label>47.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Becske</surname> <given-names>M</given-names></name> <name><surname>Marosi</surname> <given-names>C</given-names></name> <name><surname>Moln&#x000E1;r</surname> <given-names>H</given-names></name> <name><surname>Fodor</surname> <given-names>Z</given-names></name> <name><surname>Tombor</surname> <given-names>L</given-names></name></person-group>. <article-title>Distractor filtering and its electrophysiological correlates in schizophrenia</article-title>. <source>Clin Neurophysiol</source>. (<year>2022</year>) <volume>133</volume>:<fpage>71</fpage>&#x02013;<lpage>82</lpage>. <pub-id pub-id-type="doi">10.1016/j.clinph.2021.10.009</pub-id><pub-id pub-id-type="pmid">34814018</pub-id></citation></ref>
<ref id="B48">
<label>48.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Merch&#x000E1;n</surname> <given-names>A</given-names></name> <name><surname>Garc&#x000ED;a</surname> <given-names>LF</given-names></name> <name><surname>Maurno</surname> <given-names>NG</given-names></name> <name><surname>Casta&#x000F1;eda</surname> <given-names>PR</given-names></name></person-group>. <article-title>Executive functions in deaf and hearing children: the mediating role of language skills in inhibitory control</article-title>. <source>J Exp Child Psychol.</source> (<year>2022</year>) <volume>218</volume>:<fpage>1</fpage>&#x02013;<lpage>17</lpage>. <pub-id pub-id-type="doi">10.1016/j.jecp.2022.105374</pub-id><pub-id pub-id-type="pmid">35124332</pub-id></citation></ref>
<ref id="B49">
<label>49.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mason</surname> <given-names>K</given-names></name> <name><surname>Marshall</surname> <given-names>CR</given-names></name></person-group>. <article-title>Executive function training for deaf children: impact of a music intervention</article-title>. <source>J Deaf Stud Deaf Educ.</source> (<year>2021</year>) <volume>26</volume>:<fpage>490</fpage>&#x02013;<lpage>500</lpage>. <pub-id pub-id-type="doi">10.1093/deafed/enab026</pub-id><pub-id pub-id-type="pmid">34476479</pub-id></citation></ref>
<ref id="B50">
<label>50.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Botting</surname> <given-names>N</given-names></name> <name><surname>Jones</surname> <given-names>A</given-names></name> <name><surname>Marshall</surname> <given-names>C</given-names></name> <name><surname>Denmark</surname> <given-names>T</given-names></name> <name><surname>Atkinson</surname> <given-names>J</given-names></name></person-group>. <article-title>Nonverbal executive function is mediated by language: a study of deaf and hearing children</article-title>. <source>Child Dev.</source> (<year>2017</year>) <volume>88</volume>:<fpage>1689</fpage>&#x02013;<lpage>700</lpage>. <pub-id pub-id-type="doi">10.1111/cdev.12659</pub-id><pub-id pub-id-type="pmid">27859007</pub-id></citation></ref>
<ref id="B51">
<label>51.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kronenberger</surname> <given-names>WG</given-names></name> <name><surname>Colson</surname> <given-names>BG</given-names></name> <name><surname>Henning</surname> <given-names>SC</given-names></name></person-group>. <article-title>Executive functioning and speech-language skills following long-term use of cochlear implants</article-title>. <source>J Deaf Stud Deaf Educ.</source> (<year>2014</year>) <volume>19</volume>:<fpage>456</fpage>&#x02013;<lpage>70</lpage>. <pub-id pub-id-type="doi">10.1093/deafed/enu011</pub-id><pub-id pub-id-type="pmid">24903605</pub-id></citation></ref>
<ref id="B52">
<label>52.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kronenberger</surname> <given-names>WG</given-names></name> <name><surname>Pisoni</surname> <given-names>DB</given-names></name> <name><surname>Henning</surname> <given-names>SC</given-names></name></person-group>. <article-title>Executive functioning skills in long-term users of cochlear implants: a case control study</article-title>. <source>J Pediatr Psychol.</source> (<year>2013</year>) <volume>38</volume>:<fpage>902</fpage>&#x02013;<lpage>14</lpage>. <pub-id pub-id-type="doi">10.1093/jpepsy/jst034</pub-id><pub-id pub-id-type="pmid">23699747</pub-id></citation></ref>
<ref id="B53">
<label>53.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lu</surname> <given-names>B</given-names></name> <name><surname>Hui</surname> <given-names>M</given-names></name></person-group>. <article-title>The development of native Chinese affective picture system&#x02013;a pretest in 46 college students</article-title>. <source>Chinese Ment Health J.</source> (<year>2005</year>) <volume>19</volume>:<fpage>719</fpage>&#x02013;<lpage>22</lpage>.</citation>
</ref>
<ref id="B54">
<label>54.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Delorme</surname> <given-names>A</given-names></name> <name><surname>Makeig</surname> <given-names>S</given-names></name></person-group>. <article-title>EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis</article-title>. <source>J Neurosci Methods</source>. (<year>2004</year>) <volume>134</volume>:<fpage>9</fpage>&#x02013;<lpage>21</lpage>. <pub-id pub-id-type="doi">10.1016/j.jneumeth.2003.10.009</pub-id><pub-id pub-id-type="pmid">15102499</pub-id></citation></ref>
<ref id="B55">
<label>55.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mouraux</surname> <given-names>A</given-names></name> <name><surname>Iannetti</surname> <given-names>GD</given-names></name></person-group>. <article-title>Across-trial averaging of event-related EEG responses and beyond</article-title>. <source>Magn Reson Imaging</source>. (<year>2008</year>) <volume>26</volume>:<fpage>1041</fpage>&#x02013;<lpage>54</lpage>. <pub-id pub-id-type="doi">10.1016/j.mri.2008.01.011</pub-id><pub-id pub-id-type="pmid">18479877</pub-id></citation></ref>
<ref id="B56">
<label>56.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Liu</surname> <given-names>C</given-names></name> <name><surname>Yao</surname> <given-names>R</given-names></name> <name><surname>Wang</surname> <given-names>Z</given-names></name></person-group>. <article-title>N450 as a candidate neural marker for interference control deficits in children with learning disabilities</article-title>. <source>Int J Psychophysiol.</source> (<year>2014</year>) <volume>93</volume>:<fpage>70</fpage>&#x02013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2014.05.007</pub-id><pub-id pub-id-type="pmid">24858538</pub-id></citation></ref>
<ref id="B57">
<label>57.</label>
<citation citation-type="web"><person-group person-group-type="author"><name><surname>Sopov</surname> <given-names>M</given-names></name> <name><surname>Ostapenko</surname> <given-names>M</given-names></name></person-group>. <source>Distractor familiarity in picture-word interference paradigm: An ERP study</source>. Paper presented at the Conference of Experimental Psychologists. (<year>2016</year>) <ext-link ext-link-type="uri" xlink:href="https://www.researchgate.net/publication/316660425">https://www.researchgate.net/publication/316660425</ext-link></citation></ref>
<ref id="B58">
<label>58.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>L</given-names></name></person-group>. <article-title>Oscillatory brain dynamics associated with the automatic processing of emotion in words</article-title>. <source>Brain Lang</source>. (<year>2014</year>) <volume>137</volume>:<fpage>120</fpage>&#x02013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1016/j.bandl.2014.07.011</pub-id><pub-id pub-id-type="pmid">25195197</pub-id></citation></ref>
<ref id="B59">
<label>59.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bosworth</surname> <given-names>RG</given-names></name> <name><surname>Binder</surname> <given-names>EM</given-names></name> <name><surname>Tyler</surname> <given-names>SC</given-names></name></person-group>. <article-title>Automaticity of lexical access in deaf and hearing bilinguals: Cross-linguistic evidence from the color Stroop task across five languages</article-title>. <source>Cognition.</source> (<year>2021</year>) <volume>212</volume>:<fpage>104659</fpage>. <pub-id pub-id-type="doi">10.1016/j.cognition.2021.104659</pub-id><pub-id pub-id-type="pmid">33798950</pub-id></citation></ref>
<ref id="B60">
<label>60.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Frings</surname> <given-names>C</given-names></name></person-group>. <article-title>Electrophysiological correlates of visual identity negative priming</article-title>. <source>Brain Res</source>. (<year>2007</year>) <volume>1176</volume>:<fpage>82</fpage>&#x02013;<lpage>91</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2007.07.093</pub-id><pub-id pub-id-type="pmid">17904111</pub-id></citation></ref>
<ref id="B61">
<label>61.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perez-Edgar</surname> <given-names>K</given-names></name></person-group>. <article-title>Individual differences in children&#x00027;s performance during an emotional Stroop task</article-title>. <source>Brain Cogn.</source> (<year>2003</year>) <volume>52</volume>:<fpage>33</fpage>&#x02013;<lpage>51</lpage>. <pub-id pub-id-type="doi">10.1016/S0278-2626(03)00007-1</pub-id><pub-id pub-id-type="pmid">12812803</pub-id></citation></ref>
<ref id="B62">
<label>62.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mathis</surname> <given-names>KI</given-names></name> <name><surname>Wynn</surname> <given-names>JK</given-names></name> <name><surname>Jahshan</surname> <given-names>C</given-names></name> <name><surname>Hellemann</surname> <given-names>G</given-names></name> <name><surname>Darque</surname> <given-names>A</given-names></name></person-group>. <article-title>An electrophysiological investigation of attentional blink in schizophrenia: separating perceptual and attentional processes</article-title>. <source>Int J Psychophysiol</source>. (<year>2012</year>) <volume>86</volume>:<fpage>108</fpage>&#x02013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2012.06.052</pub-id><pub-id pub-id-type="pmid">22771850</pub-id></citation></ref>
<ref id="B63">
<label>63.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ozcan</surname> <given-names>H</given-names></name> <name><surname>Ozer</surname> <given-names>S</given-names></name></person-group>. <article-title>Neuropsychological, electrophysiological and neurological impairments in patients with obsessive compulsive disorder, their healthy siblings and healthy controls: Identifying potential endophenotype(s)</article-title>. <source>Psychiatry Res</source>. (<year>2016</year>) <volume>240</volume>:<fpage>110</fpage>&#x02013;<lpage>17</lpage>. <pub-id pub-id-type="doi">10.1016/j.psychres.2016.04.013</pub-id><pub-id pub-id-type="pmid">27100062</pub-id></citation></ref>
<ref id="B64">
<label>64.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Larson</surname> <given-names>MJ</given-names></name> <name><surname>Clayson</surname> <given-names>PE</given-names></name></person-group>. <article-title>Making sense of all the conflict: a theoretical review and critique of conflict-related ERPs</article-title>. <source>Int J Psychophysiol.</source> (<year>2014</year>) <volume>93</volume>:<fpage>283</fpage>&#x02013;<lpage>97</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijpsycho.2014.06.007</pub-id><pub-id pub-id-type="pmid">24950132</pub-id></citation></ref>
<ref id="B65">
<label>65.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>McNeely</surname> <given-names>HE</given-names></name> <name><surname>Lau</surname> <given-names>MA</given-names></name> <name><surname>Christensen</surname> <given-names>BK</given-names></name></person-group>. <article-title>Neurophysiological evidence of cognitive inhibition anomalies in persons with major depressive disorder</article-title>. <source>Clin Neurophysiol</source>. (<year>2008</year>) <volume>119</volume>:<fpage>1578</fpage>&#x02013;<lpage>89</lpage>. <pub-id pub-id-type="doi">10.1016/j.clinph.2008.03.031</pub-id><pub-id pub-id-type="pmid">18482863</pub-id></citation></ref>
<ref id="B66">
<label>66.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fan</surname> <given-names>LY</given-names></name> <name><surname>Gau</surname> <given-names>SS</given-names></name></person-group>. <article-title>Neural correlates of inhibitory control and visual processing in youths with attention deficit hyperactivity disorder: a counting Stroop functional MRI study</article-title>. <source>Psychol Med.</source> (<year>2014</year>) <volume>44</volume>:<fpage>2661</fpage>&#x02013;<lpage>71</lpage>. <pub-id pub-id-type="doi">10.1017/S0033291714000038</pub-id><pub-id pub-id-type="pmid">24451066</pub-id></citation></ref>
<ref id="B67">
<label>67.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Koch</surname> <given-names>JKL</given-names></name> <name><surname>Miguel</surname> <given-names>H</given-names></name></person-group>. <article-title>Prefrontal activation during Stroop and Wisconsin card sort tasks in children with developmental coordination disorder: a NIRS study</article-title>. <source>Exp Brain Res.</source> (<year>2018</year>) <volume>236</volume>:<fpage>3053</fpage>&#x02013;<lpage>64</lpage>. <pub-id pub-id-type="doi">10.1007/s00221-018-5358-4</pub-id><pub-id pub-id-type="pmid">30121740</pub-id></citation></ref>
<ref id="B68">
<label>68.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>M</given-names></name> <name><surname>Zhang</surname> <given-names>K</given-names></name> <name><surname>Zhang</surname> <given-names>J</given-names></name> <name><surname>Dong</surname> <given-names>G</given-names></name> <name><surname>Zhang</surname> <given-names>H</given-names></name></person-group>. <article-title>Abnormal neural responses to emotional stimuli but not Go/NoGo and Stroop tasks in adults with a history of childhood nocturnal enuresis</article-title>. <source>PLoS ONE.</source> (<year>2015</year>) <volume>10</volume>:<fpage>1</fpage>&#x02013;<lpage>10</lpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0142957</pub-id><pub-id pub-id-type="pmid">26571500</pub-id></citation></ref>
<ref id="B69">
<label>69.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Urzua</surname> <given-names>A</given-names></name> <name><surname>Domic</surname> <given-names>M</given-names></name> <name><surname>Ramos</surname> <given-names>M</given-names></name> <name><surname>Cerda</surname> <given-names>A</given-names></name></person-group>. <article-title>Psychometric properties of three rating scales for attention deficit hyperactivity disorder in Chilean students</article-title>. <source>Revista Panamericana de Salud Publica.</source> (<year>2010</year>) <volume>27</volume>:<fpage>157</fpage>&#x02013;<lpage>67</lpage>. <pub-id pub-id-type="doi">10.1590/S1020-49892010000300002</pub-id><pub-id pub-id-type="pmid">20414504</pub-id></citation></ref>
<ref id="B70">
<label>70.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Janssens</surname> <given-names>C</given-names></name> <name><surname>De Loof</surname> <given-names>E</given-names></name> <name><surname>Boehler</surname> <given-names>CN</given-names></name> <name><surname>Pourtois</surname> <given-names>G</given-names></name> <name><surname>Verguts</surname> <given-names>T</given-names></name></person-group>. <article-title>Occipital alpha power reveals fast attentional inhibition of incongruent distractors</article-title>. <source>Psychophysiology</source>. (<year>2018</year>) <volume>55</volume>:<fpage>1</fpage>&#x02013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1111/psyp.13011</pub-id><pub-id pub-id-type="pmid">28929499</pub-id></citation></ref>
<ref id="B71">
<label>71.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mazaheri</surname> <given-names>A</given-names></name> <name><surname>Fassbender</surname> <given-names>C</given-names></name> <name><surname>Coffey-Corina</surname> <given-names>S</given-names></name> <name><surname>Hartanto</surname> <given-names>TA</given-names></name> <name><surname>Schweitzer</surname> <given-names>JB</given-names></name></person-group>. <article-title>Differential oscillatory electroencephalogram between attention-deficit/hyperactivity disorder subtypes and typically developing adolescents</article-title>. <source>Biol Psychiatry</source>. (<year>2014</year>) <volume>76</volume>:<fpage>422</fpage>&#x02013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsych.2013.08.023</pub-id><pub-id pub-id-type="pmid">24120092</pub-id></citation></ref>
<ref id="B72">
<label>72.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fassbender</surname> <given-names>C</given-names></name> <name><surname>Zhang</surname> <given-names>H</given-names></name> <name><surname>Buzy</surname> <given-names>WM</given-names></name> <name><surname>Cortes</surname> <given-names>CR</given-names></name> <name><surname>Mizuiri</surname> <given-names>D</given-names></name> <name><surname>Beckett</surname> <given-names>L</given-names></name></person-group>. <article-title>A lack of default network suppression is linked to increased distractibility in ADHD</article-title>. <source>Brain Res</source>. (<year>2009</year>) <volume>1273</volume>:<fpage>114</fpage>&#x02013;<lpage>28</lpage>. <pub-id pub-id-type="doi">10.1016/j.brainres.2009.02.070</pub-id><pub-id pub-id-type="pmid">19281801</pub-id></citation></ref>
<ref id="B73">
<label>73.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kovacevic</surname> <given-names>S</given-names></name> <name><surname>Azma</surname> <given-names>S</given-names></name> <name><surname>Irimia</surname> <given-names>A</given-names></name> <name><surname>Sherfey</surname> <given-names>J</given-names></name> <name><surname>Halgren</surname> <given-names>E</given-names></name></person-group>. <article-title>Theta oscillations are sensitive to both early and late conflict processing stages: effects of alcohol intoxication</article-title>. <source>PLoS ONE.</source> (<year>2012</year>) <volume>7</volume>:<fpage>1</fpage>&#x02013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0043957</pub-id><pub-id pub-id-type="pmid">22952823</pub-id></citation></ref>
<ref id="B74">
<label>74.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tang</surname> <given-names>D</given-names></name> <name><surname>Hu</surname> <given-names>L</given-names></name></person-group>. <article-title>The neural oscillations of conflict adaptation in the human frontal region</article-title>. <source>Biol Psychol.</source> (<year>2013</year>) <volume>93</volume>:<fpage>364</fpage>&#x02013;<lpage>72</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsycho.2013.03.004</pub-id><pub-id pub-id-type="pmid">23570676</pub-id></citation></ref>
<ref id="B75">
<label>75.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jones</surname> <given-names>A</given-names></name> <name><surname>Atkinson</surname> <given-names>J</given-names></name> <name><surname>Marshall</surname> <given-names>C</given-names></name> <name><surname>Botting</surname> <given-names>N</given-names></name> <name><surname>St Clair</surname> <given-names>MC</given-names></name></person-group>. <article-title>Expressive vocabulary predicts nonverbal executive function: a 2-year longitudinal study of deaf and hearing children</article-title>. <source>Child Dev.</source> (<year>2020</year>) <volume>91</volume>:<fpage>400</fpage>&#x02013;<lpage>14</lpage>. <pub-id pub-id-type="doi">10.1111/cdev.13226</pub-id><pub-id pub-id-type="pmid">30740665</pub-id></citation></ref>
<ref id="B76">
<label>76.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>G</given-names></name> <name><surname>Zhang</surname> <given-names>C</given-names></name> <name><surname>Cao</surname> <given-names>S</given-names></name> <name><surname>Xia</surname> <given-names>X</given-names></name> <name><surname>Tan</surname> <given-names>X</given-names></name> <name><surname>Si</surname> <given-names>L</given-names></name> <etal/></person-group>. <article-title>Multi-domain features of the non-phase-locked component of interest extracted from ERP data by tensor decomposition</article-title>. <source>Brain Topogr.</source> (<year>2020</year>) <volume>33</volume>:<fpage>37</fpage>&#x02013;<lpage>47</lpage>. <pub-id pub-id-type="doi">10.1007/s10548-019-00750-8</pub-id><pub-id pub-id-type="pmid">31879854</pub-id></citation></ref>
<ref id="B77">
<label>77.</label>
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schriever</surname> <given-names>VA</given-names></name> <name><surname>Han</surname> <given-names>P</given-names></name> <name><surname>Weise</surname> <given-names>S</given-names></name> <name><surname>H&#x000F6;sel</surname> <given-names>F</given-names></name> <name><surname>Pellegrino</surname> <given-names>R</given-names></name></person-group>. <article-title>Time frequency analysis of olfactory induced EEG-power change</article-title>. <source>PLoS ONE.</source> (<year>2017</year>) <volume>12</volume>:<fpage>1</fpage>&#x02013;<lpage>11</lpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0185596</pub-id><pub-id pub-id-type="pmid">29016623</pub-id></citation></ref>
<ref id="B78">
<label>78.</label>
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Cohen</surname> <given-names>MX</given-names></name></person-group>. <source>Analyzing Neural Time Series Data: Theory and Practice</source>. <publisher-loc>Cambridge, MA</publisher-loc>: <publisher-name>MIT Press</publisher-name> (<year>2014</year>). <pub-id pub-id-type="doi">10.7551/mitpress/9609.001.0001</pub-id></citation>
</ref>
</ref-list>
</back>
</article>