<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Behav. Neurosci.</journal-id>
<journal-title>Frontiers in Behavioral Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Behav. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-5153</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnbeh.2017.00199</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Comparison of Ecological Micro-Expression Recognition in Patients with Depression and Healthy Individuals</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name><surname>Zhu</surname> <given-names>Chuanlin</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/202175/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Chen</surname> <given-names>Xinyun</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Zhang</surname> <given-names>Jianxin</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Liu</surname> <given-names>Zhiying</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Tang</surname> <given-names>Zhen</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
</contrib>
<contrib contrib-type="author">
<name><surname>Xu</surname> <given-names>Yuting</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/485508/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Zhang</surname> <given-names>Didi</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/455598/overview"/>
</contrib> 
<contrib contrib-type="author" corresp="yes">
<name><surname>Liu</surname> <given-names>Dianzhi</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="author-notes" rid="fn001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/455493/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Psychology, School of Education, Soochow University</institution>, <addr-line>Suzhou</addr-line>, <country>China</country></aff>
<aff id="aff2"><sup>2</sup><institution>Suzhou Psychiatric Hospital</institution>, <addr-line>Suzhou</addr-line>, <country>China</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Antonella Gasbarri, University of L&#x02019;Aquila, Italy</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Gennady Knyazev, Institute of Physiology and Basic Medicine, Russia; Andres Antonio Gonzalez-Garrido, University of Guadalajara, Mexico</p></fn>
<fn fn-type="corresp" id="fn001"><p>&#x0002A;Correspondence: Dianzhi Liu <email>dianzhiliu&#x00040;foxmail.com</email></p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>17</day>
<month>10</month>
<year>2017</year>
</pub-date>
<pub-date pub-type="collection">
<year>2017</year>
</pub-date>
<volume>11</volume>
<elocation-id>199</elocation-id>
<history>
<date date-type="received">
<day>28</day>
<month>06</month>
<year>2017</year>
</date>
<date date-type="accepted">
<day>03</day>
<month>10</month>
<year>2017</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2017 Zhu, Chen, Zhang, Liu, Tang, Xu, Zhang and Liu.</copyright-statement>
<copyright-year>2017</copyright-year>
<copyright-holder>Zhu, Chen, Zhang, Liu, Tang, Xu, Zhang and Liu</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract><p>Previous studies have focused on the characteristics of ordinary facial expressions in patients with depression, and have not investigated how these patients process ecological micro-expressions (MEs, i.e., MEs presented against different background expressions). Adopting the ecological MEs recognition paradigm, this study therefore compared facial ME recognition in depressed and healthy individuals. The findings are as follows: (1) background expression: the accuracy (ACC) in the neutral background condition tended to be higher than that in the fear background condition, and the reaction time (RT) in the neutral background condition was significantly longer than that in the other backgrounds. The type of ME and its interaction with the type of background expression affected participants&#x02019; ecological MEs recognition ACC and speed. Depression: there was no significant difference between the ecological MEs recognition ACC of patients with depression and that of healthy individuals, but the patients&#x02019; RT was significantly longer; and (2) patients with depression judged happy MEs presented against different backgrounds as neutral, and judged neutral MEs presented against sad backgrounds as sad. The present study suggests the following: (1) ecological MEs recognition was influenced by background expressions. The ACC of happy MEs was the highest, that of neutral MEs moderate, and those of sad and fear MEs the lowest. The RT for identifying happy MEs was significantly shorter than that for the other MEs. 
It is therefore necessary to conduct research on ecological MEs recognition; (2) patients with depression identified ecological MEs more slowly than healthy individuals, indicating that the patients&#x02019; cognitive function was impaired; and (3) patients with depression showed a negative bias in the ecological MEs recognition task, reflecting a deficient ability to recognize happy MEs and a generalized identification of sad MEs in those patients.</p></abstract>
<kwd-group>
<kwd>micro-expression recognition</kwd>
<kwd>ecological</kwd>
<kwd>depression</kwd>
<kwd>negative bias</kwd>
<kwd>context</kwd>
</kwd-group>
<counts>
<fig-count count="1"/>
<table-count count="3"/>
<equation-count count="0"/>
<ref-count count="42"/>
<page-count count="8"/>
<word-count count="6870"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="introduction" id="s1">
<title>Introduction</title>
<p>Micro-expressions (MEs) are very fast (1/25&#x02013;1/2 s) facial expressions that help reveal emotions individuals attempt to conceal (Ekman, <xref ref-type="bibr" rid="B7">2003</xref>; Matsumoto and Hwang, <xref ref-type="bibr" rid="B24">2011</xref>; Yan et al., <xref ref-type="bibr" rid="B36">2013</xref>). Ekman and Friesen (<xref ref-type="bibr" rid="B8">1974</xref>) developed the first ME recognition test, the Brief Affect Recognition Test (BART). In this test, various ME images (happiness, sadness, fear, anger, disgust and surprise) are presented briefly (1/100&#x02013;1/25 s), participants are asked to complete an emotional classification task, and the corresponding accuracy (ACC) is then analyzed. Although the BART laid the foundations for further research, it also has some shortcomings. First, the BART can hardly measure real ME recognition: because no mask follows the target stimulus, its processing time may be extended and influenced by visual aftereffects. Second, in the BART each ME is presented in isolation, with no preceding or following expressions; participants therefore receive no background information associated with the ME, which has called the ecological validity of the test into question.</p>
<p>In order to overcome these shortcomings, Matsumoto et al. (<xref ref-type="bibr" rid="B25">2000</xref>) developed an improved test based on the BART, the Japanese and Caucasian BART (JACBART). In the JACBART, a non-neutral facial expression image (target stimulus) is embedded in a neutral facial expression video (mask stimuli), with a total duration of 1 s. The identity of the person in the target and mask stimuli is controlled. The participants&#x02019; task is to identify the emotion conveyed by the target stimuli. Numerous studies (Hall and Matsumoto, <xref ref-type="bibr" rid="B14">2004</xref>; Russell et al., <xref ref-type="bibr" rid="B30">2006</xref>; Matsumoto and Hwang, <xref ref-type="bibr" rid="B24">2011</xref>) have reported that the JACBART has good reliability and validity; therefore, the test is widely used. Though the JACBART succeeded in eliminating the influence of visual aftereffects, it only examined ME processing in a neutral (non-emotional) context, not in emotional contexts such as pleasure, sadness and fear. However, in real life, MEs occur within emotional expressions as well. Does ME recognition differ between emotional and non-emotional contexts? That is to say, is ME recognition influenced by the type of context?</p>
<p>Aiming to solve this problem, based on the JACBART, Zhang et al. (<xref ref-type="bibr" rid="B41">2014</xref>) were the first to explore the role of neutral, sad and happy contexts in the ME recognition task. The results showed that the ACC of recognizing all MEs was decreased in the sad context compared to the neutral and happy contexts, which suggests that participants&#x02019; performance in the ME recognition task is influenced by the context. This study was a further refinement of the JACBART. However, the types of context that were adopted in Zhang et al.&#x02019;s (<xref ref-type="bibr" rid="B41">2014</xref>) study were still limited. Soon after that, Zhang et al. (<xref ref-type="bibr" rid="B40">2017</xref>) examined the ME recognition characteristics of college students in fearful, sad, disgusting, angry, surprised and happy contexts. The results showed that the main effects of the fearful, sad, disgusting and angry context were significant, while those of surprise and happiness were not significant; on the basis of these results, Zhang et al. (<xref ref-type="bibr" rid="B40">2017</xref>) established an ecologically valid ME recognition test. Ecological MEs are MEs that occur in real life, rather than MEs only accompanying neutral expressions. The MEs in Zhang et al.&#x02019;s (<xref ref-type="bibr" rid="B40">2017</xref>) study were ecologically valid; therefore, they are considered to be ecological MEs.</p>
<p>Previous studies have shown that, in addition to sex (Hall and Matsumoto, <xref ref-type="bibr" rid="B14">2004</xref>), age (Mill et al., <xref ref-type="bibr" rid="B28">2009</xref>; Hurley et al., <xref ref-type="bibr" rid="B15">2014</xref>) and personality (Hurley et al., <xref ref-type="bibr" rid="B15">2014</xref>), depression (Liu et al., <xref ref-type="bibr" rid="B22">2012</xref>; Gollan et al., <xref ref-type="bibr" rid="B12">2015b</xref>; Kerestes et al., <xref ref-type="bibr" rid="B17">2016</xref>; Milders et al., <xref ref-type="bibr" rid="B27">2016</xref>) can affect one&#x02019;s performance when recognizing ordinary facial expressions. Depression is one of the most common mental illnesses (Bocharov et al., <xref ref-type="bibr" rid="B3">2017</xref>); the official website of the WHO reports that more than 300 million people worldwide suffer from depression. Unlike the usual mood swings or emotional responses to daily challenges, long-term moderate or severe depression can lead to serious health problems, and in the most severe cases to suicide. Every year, about 800,000 people commit suicide due to depression, and suicide is the second leading cause of death in people aged 15&#x02013;29 years. Many researchers have studied the recognition characteristics of ordinary facial expressions in patients with depression; the results showed that the patients exhibit an obvious negative bias when processing ordinary facial expressions (Dai and Feng, <xref ref-type="bibr" rid="B5">2012</xref>; Gollan et al., <xref ref-type="bibr" rid="B11">2015a</xref>; Jaworska et al., <xref ref-type="bibr" rid="B16">2015</xref>; Fonseka et al., <xref ref-type="bibr" rid="B10">2016</xref>). 
Compared with happy and neutral expressions, the patients were more sensitive to sad expressions (Maniglio et al., <xref ref-type="bibr" rid="B23">2014</xref>; Zhang et al., <xref ref-type="bibr" rid="B39">2016</xref>), tended to judge happy expressions as neutral (Bocharov et al., <xref ref-type="bibr" rid="B3">2017</xref>), and judged neutral expressions as sad (Maniglio et al., <xref ref-type="bibr" rid="B23">2014</xref>; Fonseka et al., <xref ref-type="bibr" rid="B10">2016</xref>). In addition, researchers found that patients with depression can accurately identify ordinary facial expressions (Gollan et al., <xref ref-type="bibr" rid="B13">2008</xref>; Robinson et al., <xref ref-type="bibr" rid="B29">2015</xref>), but their reaction time (RT) was longer than that of healthy individuals (Ba&#x0015F;g&#x000F6;ze et al., <xref ref-type="bibr" rid="B1">2015</xref>). However, beyond ordinary facial expressions, many MEs occur in our daily life.</p>
<p>Compared with ordinary expressions, ME recognition has its particularities. First, recognizing MEs requires higher sensitivity and recognition ability (Matsumoto and Hwang, <xref ref-type="bibr" rid="B24">2011</xref>). Therefore, the processing characteristics of ordinary expressions and MEs may differ in patients with depression. Second, ME recognition ability is associated with discerning ability (Hurley et al., <xref ref-type="bibr" rid="B15">2014</xref>; Yin et al., <xref ref-type="bibr" rid="B38">2016</xref>) and social ability (Matsumoto and Hwang, <xref ref-type="bibr" rid="B24">2011</xref>), so deficits in recognizing MEs could, to some degree, reflect deficits in discerning and social skills. However, the negative bias shown in previous studies was based on ordinary expressions as stimuli. To our knowledge, no study has examined the ME recognition characteristics of patients with depression. Based on this discussion, employing the ecological MEs recognition test established by Zhang et al. (<xref ref-type="bibr" rid="B40">2017</xref>), this 4 (context: happy, neutral, sad and fearful) &#x000D7; 4 (ME: happy, neutral, sad and fearful) &#x000D7; 2 (group: depression and control) study aimed to explore the ecological MEs recognition characteristics of patients with depression. Based on previous studies and the characteristics of MEs, we hypothesized that: (1) background expressions would affect participants&#x02019; recognition of MEs; (2) compared with healthy individuals, patients with depression would show lower ACC and longer RT in the ecological MEs recognition task; and (3) patients with depression would also show a negative bias in the ecological MEs task.</p>
</sec>
<sec sec-type="materials and methods" id="s2">
<title>Materials and Methods</title>
<sec id="s2-1">
<title>Subjects</title>
<p>Thirty unmedicated patients (21 females) experiencing a first episode of depression, diagnosed with current depression according to the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV), were recruited from Suzhou Guangji Hospital. The inclusion criteria were: (1) no history of head trauma; (2) no history of drug or other substance abuse; (3) for female subjects, not breast-feeding or pregnant; (4) no anxiety disorder, bipolar disorder or other mental illness; (5) aged 20&#x02013;60 years; and (6) an education level of junior high school or above. Thirty healthy individuals (control group, 21 females) were also enrolled. The two groups were matched in age, education and handedness. All subjects were right-handed and had normal or corrected-to-normal visual acuity. All subjects provided written informed consent before the experiment, in accordance with the Declaration of Helsinki (1991); the study was approved by the Suzhou Psychiatric Hospital Ethics Committee. Participants received 50 RMB for participation.</p>
<p>The depression level of all subjects was measured using the Chinese version of the Beck Depression Inventory II (BDI-II; Beck et al., <xref ref-type="bibr" rid="B2">1997</xref>), revised by Wang et al. (<xref ref-type="bibr" rid="B34">2011</xref>). The revised version&#x02019;s internal consistency coefficient (Cronbach&#x02019;s &#x003B1;) is 0.94. A score of &#x0003C;14 indicates no depression and a score of &#x02265;14 indicates depression; all control group members scored below 14, while all patients scored above 14. The basic information of all subjects is shown in Table <xref ref-type="table" rid="T1">1</xref>.</p>
<table-wrap id="T1" position="float">
<label>Table 1</label>
<caption><p>Characteristics of the patient and control groups (<italic>n</italic> = 30 per group).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center"></th>
<th align="center">Patient (M &#x000B1; SD)</th>
<th align="center">Control (M &#x000B1; SD)</th>
<th align="center" colspan="1"><italic>t</italic></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Mean age</td>
<td align="center">36.93 &#x000B1; 11.86</td>
<td align="center">37.60 &#x000B1; 12.06</td>
<td align="center">1.29</td>
</tr>
<tr>
<td align="left">Education time</td>
<td align="center">12.96 &#x000B1; 3.09</td>
<td align="center">13.33 &#x000B1; 3.08</td>
<td align="center">0.78</td>
</tr>
<tr>
<td align="left">BDI</td>
<td align="center">24.83 &#x000B1; 6.88</td>
<td align="center">7.63 &#x000B1; 3.63</td>
<td align="center">14.56***</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>Note: &#x0201C;***&#x0201D; stands for &#x0201C;<italic>p</italic> &#x0003C; 0.001&#x0201D;. BDI, Beck Depression Inventory; M, mean; SD, standard deviation</italic>.</p>
</table-wrap-foot>
</table-wrap>
</sec>
<sec id="s2-2">
<title>Stimuli and Procedure</title>
<p>Forty grayscale images (338 &#x000D7; 434 pixels) of 10 models (five females) displaying happy, neutral, sad and fearful facial expressions were selected from Ekman&#x02019;s Pictures of Facial Affect (POFA; Ekman and Friesen, <xref ref-type="bibr" rid="B9">1976</xref>). The experimental program was written in E-prime 2.0. The procedure consisted of four blocks of 40 trials each, for a total of 160 trials. As shown in Figure <xref ref-type="fig" rid="F1">1</xref>, each trial started with a white fixation cross (500 ms), followed by a blank screen (500 ms), the context expression (1000 ms), the target expression (133 ms) and the same context again (1000 ms). After that, the labels of the four target expressions (happiness, neutral, fear and sadness) were presented, and subjects were required to discriminate the target expression. They were instructed to press the &#x0201C;D&#x0201D; key with the left middle finger if the target expression was happiness, the &#x0201C;F&#x0201D; key with the left index finger for neutral, the &#x0201C;J&#x0201D; key with the right index finger for sadness, and the &#x0201C;K&#x0201D; key with the right middle finger for fear. The images used in each trial came from the same model. Participants were told to respond as accurately as possible (response window up to 20,000 ms). Finally, a blank screen was presented (1000 ms). A block design was adopted: only one type of context (neutral, sad, happy or fearful) was used in each block. All stimuli were presented in the center of the screen.</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p>Illustration of one experimental trial.</p></caption>
<graphic xlink:href="fnbeh-11-00199-g0001.tif"/>
</fig>
<p>This study was conducted in a sound-attenuated room. Subjects sat 70 cm in front of a 17-inch CRT monitor with a resolution of 1280 &#x000D7; 1024 pixels and a refresh rate of 75 Hz. To ensure that subjects fully understood the procedure, 16 practice trials were provided before the formal test. Feedback was provided for each trial in the practice phase, but not in the formal test. The procedure of the practice trials was the same as that of the formal test. To minimize fatigue effects, all subjects were asked to rest for 2 min after each block.</p>
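The trial structure described above can be summarized numerically. The sketch below is a hypothetical re-creation in Python for illustration only; the original experiment was programmed in E-prime 2.0, and the names here are our own.

```python
# Hypothetical summary of the trial timeline described in the text;
# the original experiment was run in E-prime 2.0, not Python.
TRIAL_TIMELINE_MS = [
    ("fixation cross", 500),
    ("blank", 500),
    ("context expression", 1000),
    ("target micro-expression", 133),
    ("context expression", 1000),
    ("response screen", 20000),  # up to 20 s to pick one of 4 labels
    ("blank", 1000),
]

# Stimulus time preceding the response screen in each trial:
# 500 + 500 + 1000 + 133 + 1000 = 3133 ms.
pre_response_ms = sum(d for _, d in TRIAL_TIMELINE_MS[:-2])
print(pre_response_ms)  # 3133
```

Note that the brief 133 ms target sandwiched between two 1000 ms context presentations is what makes the paradigm "ecological": the ME appears within, and is masked by, a background expression.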
</sec>
<sec id="s2-3">
<title>Data Recording and Analysis</title>
<p>The data in this study were collected with E-prime 2.0. All statistical analyses were performed using SPSS 16.0; <italic>post hoc</italic> tests used the Bonferroni correction, and <italic>p</italic> values were corrected with the Greenhouse-Geisser method.</p>
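The analyses were run in SPSS 16.0. Purely as an illustration of the chance-level test reported in the Results, a minimal sketch in Python/SciPy might look as follows; this is an assumption about the general procedure, not the authors' code, and the accuracy values are synthetic placeholders.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic per-subject accuracies for one condition (30 subjects);
# the real data are not public, so these values are placeholders only.
acc = rng.normal(loc=0.62, scale=0.10, size=30).clip(0, 1)

# One-sample t-test against the 0.25 chance level (four response options).
t, p = stats.ttest_1samp(acc, popmean=0.25)

# Bonferroni correction for k post hoc comparisons (illustrative k = 6).
p_bonf = min(1.0, p * 6)

above_chance = (acc.mean() > 0.25) and (p_bonf < 0.01)
```

With mean accuracy well above 0.25, such a test rejects random guessing, mirroring the logic of the one-sample *t*-tests in the Results section.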
</sec>
</sec>
<sec sec-type="results" id="s3">
<title>Results</title>
<sec id="s3-1">
<title>Indicator 1: The ACC of Recognizing MEs</title>
<p>The results of the one-sample <italic>t</italic>-test showed that the ACC of recognizing all MEs was significantly (<italic>p</italic>s &#x0003C; 0.01) higher than chance (0.25), indicating that the ACC was not the result of random guessing. For the measures of ACC in recognizing MEs, a three-way repeated-measures ANOVA was performed, with the type of context and ME as the within-subject factors and group (patients vs. controls) as the between-subjects factor. The results showed that the main effect of context was significant (<italic>F</italic><sub>(3,174)</sub> = 2.953, <italic>p</italic> = 0.034, &#x003B7;<sup>2</sup> = 0.048): the ACC under the neutral background expression condition tended to be higher than that under the fear background expression condition (<italic>p</italic> = 0.068), and the ACC under any other two background expression conditions showed no significant difference (<italic>p</italic>s &#x0003E; 0.327), indicating that individuals&#x02019; ecological MEs recognition ACC was affected by the type of background expression. The main effect of ME was significant (<italic>F</italic><sub>(3,174)</sub> = 52.795, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.477); the <italic>post hoc</italic> analysis revealed that the ACC under the happy ME condition was higher than that under the neutral, sad and fear ME conditions (<italic>p</italic> &#x0003C; 0.001), while the ACC under the neutral ME condition was higher than that under the sad (<italic>p</italic> = 0.001) and fear (<italic>p</italic> &#x0003C; 0.001) ME conditions, indicating that participants&#x02019; ecological MEs recognition ACC was affected by the type of ME. The main effect of group was not significant (<italic>F</italic><sub>(1,58)</sub> = 0.121, <italic>p</italic> = 0.729, &#x003B7;<sup>2</sup> = 0.002). 
The interaction of context with ME was significant (<italic>F</italic><sub>(9,522)</sub> = 24.062, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.293), the interaction of context with group (<italic>F</italic><sub>(3,174)</sub> = 0.727, <italic>p</italic> = 0.537, &#x003B7;<sup>2</sup> = 0.012), ME with group (<italic>F</italic><sub>(3,174)</sub> = 1.340, <italic>p</italic> = 0.263, &#x003B7;<sup>2</sup> = 0.023), and context with ME with group (<italic>F</italic><sub>(9,522)</sub> = 1.318, <italic>p</italic> = 0.224, &#x003B7;<sup>2</sup> = 0.022) were not significant.</p>
<p>As the interaction effect of context with ME was significant, a simple effect analysis was conducted and the results were as follows. In the neutral background expression condition, the ACC of happy and neutral MEs was higher (<italic>p</italic>s &#x0003C; 0.001) than that of sad and fear MEs. In the happy background expression condition, the ACC of happy MEs was higher (<italic>p</italic>s &#x0003C; 0.001) than that of neutral, sad and fear MEs, while the ACC of fear MEs was higher (<italic>p</italic> = 0.037) than that of sad MEs. In the sad background expression condition, the ACC of happy MEs was higher than that of neutral (<italic>p</italic> &#x0003C; 0.001), sad (<italic>p</italic> = 0.015) and fear (<italic>p</italic> &#x0003C; 0.001) MEs, while the ACC of sad MEs was higher (<italic>p</italic>s &#x0003C; 0.001) than that of neutral and fear MEs. In the fear background expression condition, the ACC of happy MEs was higher than that of neutral (<italic>p</italic> = 0.016), sad (<italic>p</italic> &#x0003C; 0.001) and fear (<italic>p</italic> = 0.001) MEs, while the ACCs of neutral and fear MEs were higher (<italic>p</italic>s &#x0003C; 0.001) than that of sad MEs.</p>
<p>In conclusion, the type of context tended to influence individuals&#x02019; ACC of recognizing MEs; the type of ME significantly influenced the ACC; and depression had no significant influence on it. Additionally, the ACC was significantly influenced by the interaction of context with ME.</p>
</sec>
<sec id="s3-2">
<title>Indicator 2: The RT of Recognizing MEs</title>
<p>For the measures of RT of recognizing MEs, a three-way repeated-measures ANOVA was performed, with the type of context and ME as the within-subject factors and group (patients vs. controls) as the between-subjects factor. The results showed that the main effect of context was significant (<italic>F</italic><sub>(3,174)</sub> = 11.241, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.162); the <italic>post hoc</italic> analysis showed that the RT under the neutral background expression condition was longer than those under the happy (<italic>p</italic> &#x0003C; 0.001), sad (<italic>p</italic> = 0.026), and fear (<italic>p</italic> &#x0003C; 0.001) background expression conditions, while the RTs under any other two background expression conditions showed no significant difference (<italic>p</italic>s &#x0003E; 0.379). This indicated that participants&#x02019; ecological MEs recognition RT was affected by the type of background expression. The main effect of ME was significant (<italic>F</italic><sub>(3,174)</sub> = 5.753, <italic>p</italic> = 0.002, &#x003B7;<sup>2</sup> = 0.090); the <italic>post hoc</italic> analysis revealed that the RT under the happy ME condition was shorter than those under the neutral (<italic>p</italic> = 0.010), sad (<italic>p</italic> &#x0003C; 0.001), and fear (<italic>p</italic> = 0.002) ME conditions, while the RT under any other two ME conditions showed no significant difference (<italic>p</italic>s &#x0003E; 0.05). This indicated that participants&#x02019; ecological MEs recognition RT was affected by the type of ME. The main effect of group was significant (<italic>F</italic><sub>(1,58)</sub> = 9.498, <italic>p</italic> = 0.003, &#x003B7;<sup>2</sup> = 0.141); the patients responded more slowly than healthy individuals did, suggesting deficits in the patients&#x02019; speed of recognizing MEs. 
The interaction of context with ME was significant (<italic>F</italic><sub>(9,522)</sub> = 5.345, <italic>p</italic> &#x0003C; 0.001, &#x003B7;<sup>2</sup> = 0.084). The interaction of context with group (<italic>F</italic><sub>(3,174)</sub> = 1.185, <italic>p</italic> = 0.317, &#x003B7;<sup>2</sup> = 0.020), ME with group (<italic>F</italic><sub>(3,174)</sub> = 0.428, <italic>p</italic> = 0.733, &#x003B7;<sup>2</sup> = 0.007), and context with ME with group (<italic>F</italic><sub>(9,522)</sub> = 1.036, <italic>p</italic> = 0.402, &#x003B7;<sup>2</sup> = 0.018) were not significant.</p>
<p>As the interaction effect of context with ME was significant, a simple effect analysis was conducted and the results are as follows: (1) happy ME: the patients&#x02019; RT under the different background expression conditions showed a significant difference (<italic>F</italic><sub>(3,174)</sub> = 5.41, <italic>p</italic> = 0.001), although a <italic>post hoc</italic> analysis revealed no significant difference between any two background expression conditions (<italic>p</italic>s &#x0003E; 0.05). The healthy individuals&#x02019; RT under the different background expression conditions showed no significant difference (<italic>F</italic><sub>(3,174)</sub> = 1.04, <italic>p</italic> = 0.375); (2) neutral ME: the patients&#x02019; RT under the different background expression conditions showed a significant difference (<italic>F</italic><sub>(3,174)</sub> = 4.31, <italic>p</italic> = 0.006); the RT under the happy background expression condition was longer (<italic>p</italic> = 0.017) than that under the fear background expression condition, suggesting that it was difficult for patients to recognize neutral MEs under the happy background expression condition. The healthy individuals&#x02019; RT under the different background expression conditions also showed a significant difference (<italic>F</italic><sub>(3,174)</sub> = 3.57, <italic>p</italic> = 0.015); the RT under the happy background expression condition was longer (<italic>p</italic> = 0.002) than that under the fear background expression condition, suggesting that happy background expressions slowed the healthy individuals&#x02019; recognition of MEs compared to fear background expressions; (3) sad ME: the patients&#x02019; RT under the different background expression conditions showed no significant difference (<italic>F</italic><sub>(3,174)</sub> = 1.42, <italic>p</italic> = 0.238). 
The healthy individuals&#x02019; RT under the different background expression conditions showed a significant difference (<italic>F</italic><sub>(3,174)</sub> = 4.54, <italic>p</italic> = 0.004); the RT under the sad background expression condition was longer than those under the neutral (<italic>p</italic> = 0.002) and fear (<italic>p</italic> = 0.013) background expression conditions; and (4) fear ME: the patients&#x02019; RT under the different background expressions showed a significant difference (<italic>F</italic><sub>(3,174)</sub> = 8.44, <italic>p</italic> &#x0003C; 0.001); the RT under the happy background expression condition was longer than those under the neutral (<italic>p</italic> = 0.015) and fear (<italic>p</italic> = 0.048) background expression conditions, suggesting that happy background expressions slowed the patients&#x02019; recognition of fear MEs. The healthy individuals&#x02019; RT under the different background expressions showed no significant difference (<italic>F</italic><sub>(3,174)</sub> = 1.94, <italic>p</italic> = 0.124). See Table <xref ref-type="table" rid="T2">2</xref> for details.</p>
<table-wrap id="T2" position="float">
<label>Table 2</label>
<caption><p>The simple effect analysis results of the reaction time (RT) of patients with depression and healthy individuals under different conditions.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">ME</th>
<th align="left">Group</th>
<th align="left">RT under different context</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Neutral</td>
<td align="left">Patients with depression</td>
<td align="left">Happy &#x0003E; fear</td>
</tr>
<tr>
<td/>
<td align="left">Healthy individuals</td>
<td align="left">Happy &#x0003E; fear</td>
</tr>
<tr>
<td align="left">Sad</td>
<td align="left">Healthy individuals</td>
<td align="left">Sad &#x0003E; neutral, fear</td>
</tr>
<tr>
<td align="left">Fear</td>
<td align="left">Patients with depression</td>
<td align="left">Happy &#x0003E; neutral, sad</td>
</tr>
</tbody>
</table>
</table-wrap>
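The simple effect analyses reported above amount to one-way repeated-measures ANOVAs on RT across the four background expression conditions within each group. As an illustration only (this is not the authors' analysis code, and the data below are hypothetical), such a test can be sketched with numpy/scipy:

```python
import numpy as np
from scipy import stats

def rm_anova_1way(X):
    """One-way repeated-measures ANOVA.
    X: array of shape (n_subjects, k_conditions), e.g. one group's RTs
    under the four background expression conditions."""
    n, k = X.shape
    grand = X.mean()
    ss_cond = n * ((X.mean(axis=0) - grand) ** 2).sum()   # condition effect
    ss_subj = k * ((X.mean(axis=1) - grand) ** 2).sum()   # between-subject variance
    ss_total = ((X - grand) ** 2).sum()
    ss_err = ss_total - ss_cond - ss_subj                 # residual
    df_cond, df_err = k - 1, (k - 1) * (n - 1)
    F = (ss_cond / df_cond) / (ss_err / df_err)
    p = stats.f.sf(F, df_cond, df_err)                    # right-tail p-value
    return F, p, (df_cond, df_err)

# Hypothetical RTs (ms) for 30 participants x 4 contexts
# (happy, neutral, sad, fear); NOT the study's data.
rng = np.random.default_rng(0)
rts = 900 + rng.normal(0, 60, size=(30, 4)) + np.array([40.0, 0.0, 0.0, 0.0])

F, p, (df1, df2) = rm_anova_1way(rts)
print(f"F({df1},{df2}) = {F:.2f}, p = {p:.4f}")
```

Removing the subject sum of squares from the error term is what distinguishes this from a between-subjects ANOVA; a significant omnibus F would then be followed by pairwise post hoc comparisons, as in the text above.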
<p>In conclusion, background expressions significantly influenced individuals&#x02019; RTs when recognizing MEs; the type of ME also significantly influenced RTs; and the patients&#x02019; RTs were longer than those of healthy individuals. Additionally, RTs were significantly influenced by the interaction effect of context with ME.</p>
</sec>
<sec id="s3-3">
<title>Indicator 3: Negative Bias</title>
<p>When completing the ecological MEs recognition task, participants chose among four options (happy, neutral, sad and fear); thus, the probability of any one option being selected by chance was 0.25. Misjudgment refers to identifying one ME as another. One-sample <italic>t</italic>-tests (test value = 0.25) were conducted with the probabilities of judging happy MEs as neutral and of judging neutral MEs as sad under the different background expression conditions as the dependent variables; the results are shown in Table <xref ref-type="table" rid="T3">3</xref>.</p>
<table-wrap id="T3" position="float">
<label>Table 3</label>
<caption><p>The patients&#x02019; misjudgment of the happy and neutral micro-expressions (<italic>df</italic> = 29).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left">Misjudgment mode</th>
<th align="center"><italic>M</italic></th>
<th align="center"><italic>SD</italic></th>
<th align="center"><italic>t</italic></th>
<th align="center"><italic>p</italic></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Neutral-happy-neutral</td>
<td align="center">0.080</td>
<td align="center">0.132</td>
<td align="center">&#x02212;7.035</td>
<td align="center">&#x0003C;0.001</td>
</tr>
<tr>
<td align="left">Sad-happy-neutral</td>
<td align="center">0.097</td>
<td align="center">0.161</td>
<td align="center">&#x02212;5.224</td>
<td align="center">&#x0003C;0.001</td>
</tr>
<tr>
<td align="left">Fear-happy-neutral</td>
<td align="center">0.130</td>
<td align="center">0.137</td>
<td align="center">&#x02212;4.803</td>
<td align="center">&#x0003C;0.001</td>
</tr>
<tr>
<td align="left">Sad-neutral-sad</td>
<td align="center">0.157</td>
<td align="center">0.230</td>
<td align="center">2.298</td>
<td align="center">0.029</td>
</tr>
<tr>
<td align="left">Happy-neutral-sad</td>
<td align="center">0.073</td>
<td align="center">0.166</td>
<td align="center">&#x02212;0.180</td>
<td align="center">0.858</td>
</tr>
<tr>
<td align="left">Fear-neutral-sad</td>
<td align="center">0.240</td>
<td align="center">0.222</td>
<td align="center">&#x02212;0.246</td>
<td align="center">0.807</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>Note: &#x0201C;sad-happy-neutral&#x0201D; signifies that participants judged the happy micro-expression (ME) presented under the sad background expression condition as neutral; the same applies to the other misjudgment modes. M, mean; SD, standard deviation</italic>.</p>
</table-wrap-foot>
</table-wrap>
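The one-sample <italic>t</italic>-test against the 0.25 chance level reported in Table 3 can be reproduced in a few lines of Python. The proportions below are illustrative placeholders (the study's raw per-participant data are not available here), chosen only to match the sample size of 30 (df = 29):

```python
import numpy as np
from scipy import stats

# Hypothetical misjudgment proportions for 30 participants: the fraction of
# trials on which a happy ME in a sad context was judged "neutral".
# With four response options, the chance probability is 0.25.
props = np.array([0.05, 0.10, 0.00, 0.15, 0.20, 0.05, 0.10, 0.25, 0.00, 0.10,
                  0.15, 0.05, 0.20, 0.10, 0.05, 0.10, 0.00, 0.15, 0.05, 0.10,
                  0.20, 0.05, 0.10, 0.15, 0.00, 0.05, 0.10, 0.20, 0.05, 0.10])

# One-sample t-test against the chance level of 0.25 (test value in Table 3)
t_stat, p_val = stats.ttest_1samp(props, popmean=0.25)
print(f"t({props.size - 1}) = {t_stat:.3f}, p = {p_val:.4f}")
```

A significantly negative <italic>t</italic> means the misjudgment occurred less often than chance, while a significantly positive <italic>t</italic> (as for the sad-neutral-sad mode in Table 3) means it occurred more often than chance.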
</sec>
</sec>
<sec sec-type="discussion" id="s4">
<title>Discussion</title>
<p>Previous studies have explored the characteristics of facial expression processing of patients with depression using ordinary facial expressions as stimuli. No study has yet reported on those patients&#x02019; ecological MEs processing characteristics, although ecological MEs (vs. ordinary facial expressions) are more consistent with real-life expressions and could better elucidate the facial expression processing characteristics in patients with depression. By adopting an ecological MEs recognition paradigm (Zhang et al., <xref ref-type="bibr" rid="B40">2017</xref>), the present study explored the ecological MEs recognition characteristics of patients with depression for the first time.</p>
<p>Individuals&#x02019; performance in ecological MEs recognition was affected by background expressions. First, in terms of ACC, the results of this study showed that the ACC under the neutral background expression condition tended to be higher than that under the fear background expression condition, indicating that ignoring the influence of background expressions (Ekman and Friesen, <xref ref-type="bibr" rid="B8">1974</xref>) or taking into consideration only the influence of neutral background expressions (Matsumoto et al., <xref ref-type="bibr" rid="B25">2000</xref>) is inappropriate. When studying individuals&#x02019; ME recognition characteristics, the role of different background expressions should be fully considered. Second, in terms of RT, the results showed that individuals&#x02019; RT was longer when recognizing MEs under the neutral background expression condition than under the happy, sad and fear background expression conditions. This may be because neutral background expressions conveyed less emotional information and could not effectively promote individuals&#x02019; processing of MEs compared with the other three background expression conditions, resulting in a decrease in individuals&#x02019; processing speed while recognizing MEs under this condition. In short, both the ACC and RT of individuals&#x02019; recognition of MEs were influenced by background expressions, which confirmed our hypothesis. Therefore, the role of different background expressions should be fully considered when investigating individuals&#x02019; performance in recognizing ecological MEs.</p>
<p>Individuals&#x02019; performance in ecological MEs recognition was affected by the type of ME. In the present study, happy, neutral, sad and fear MEs were studied. In terms of ACC, happy MEs had the highest ACC, neutral MEs a moderate ACC, and sad and fear MEs the lowest, with no significant difference between the ACCs for sad and fear MEs. This indicated that happy MEs are easier to recognize than the other studied MEs, which is consistent with previous findings (Schaefer et al., <xref ref-type="bibr" rid="B32">2010</xref>; Kujawa et al., <xref ref-type="bibr" rid="B19">2014</xref>; Kluczniok et al., <xref ref-type="bibr" rid="B18">2016</xref>). In terms of RT, the RT for recognizing happy MEs was significantly shorter than the RTs for recognizing neutral, sad and fear MEs, while the RTs for the latter three showed no significant difference, suggesting that individuals were more sensitive to happy MEs than to the others.</p>
<p>Individuals&#x02019; ACC and RT were affected not only by the type of background expression and the type of ME separately, but also by the interaction of these two factors. For example, compared with sad and fear MEs, individuals&#x02019; ACC in recognizing happy MEs under the neutral background expression condition was higher. When recognizing neutral MEs, recognition was quicker under the fear background expression condition than under the happy background expression condition. This suggests that the influences of background expression and ME should not be considered in isolation when exploring individuals&#x02019; performance in recognizing ecological MEs. In addition, it should be noted that when the type of background expression and the type of ME are congruent, the task may be considered an ordinary, as opposed to ecological, expression recognition task, and these two tasks should be distinguished. For example, under the neutral background expression condition, individuals&#x02019; ACC in recognizing the neutral expression was higher than the ACCs in recognizing the sad and fear expressions; however, under this condition, the neutral expression is an ordinary expression, while the sad and fear expressions are ecological MEs. The difference between the ACCs of recognizing neutral and sad/fear expressions was therefore not necessarily caused by the interaction between the type of background expression and ME; it was more likely caused by the difference between ordinary expressions and MEs.</p>
<p>Individuals&#x02019; performance in ecological MEs recognition was affected by the presence of depression. In terms of ACC, there was no significant difference between the patients and healthy individuals, which was consistent with findings on patients processing ordinary facial expressions (Lepp&#x000E4;nen et al., <xref ref-type="bibr" rid="B20">2004</xref>; Meyers et al., <xref ref-type="bibr" rid="B26">2015</xref>; Robinson et al., <xref ref-type="bibr" rid="B29">2015</xref>), indicating that individuals&#x02019; ACC in recognizing ecological MEs was not affected by the presence of depression. In addition, a review (Bourke et al., <xref ref-type="bibr" rid="B4">2010</xref>) revealed that, compared with healthy individuals, patients with depression allocate more attentional resources to sad facial expressions and fewer to pleasant ones while processing ordinary facial expressions; nevertheless, numerous studies have shown no significant difference between the two groups in the ACC of recognizing ordinary facial expressions, which supports the results of the present study to some extent. In terms of RT, the patients&#x02019; RT was longer than that of healthy individuals, which was consistent with the results of patients processing ordinary facial expressions (Wu et al., <xref ref-type="bibr" rid="B35">2016</xref>; Zhang et al., <xref ref-type="bibr" rid="B39">2016</xref>). However, Dai et al. (<xref ref-type="bibr" rid="B6">2016</xref>) found no significant difference in RT between patients with depression and healthy individuals when processing neutral expressions, while the patients&#x02019; RT for processing sad expressions was shorter than that of healthy individuals. The discrepancy between Dai et al. (<xref ref-type="bibr" rid="B6">2016</xref>) and the present study may have been caused by the different experimental tasks: the task in the Dai et al. 
(<xref ref-type="bibr" rid="B6">2016</xref>) study was an emotional valence evaluation task with no time limitations, while the participants in this study were asked to complete an emotional labeling task within a limited time. Researchers could replace the experimental task in this study with an emotional valence evaluation task in the future, so as to further compare the processing speed of patients with depression in different types of facial expression processing tasks. The results of this study show that RT can reflect the difference between patients with depression and healthy individuals when comparing their ecological MEs recognition characteristics, indicating that RT is a more sensitive indicator than ACC. Our second hypothesis was therefore partially confirmed. In addition, the present study further confirmed that the presence of clinical depression affects patients&#x02019; RTs while performing cognitive tasks.</p>
<p>A large number of previous studies (Dai and Feng, <xref ref-type="bibr" rid="B5">2012</xref>; Li et al., <xref ref-type="bibr" rid="B21">2015</xref>; Fonseka et al., <xref ref-type="bibr" rid="B10">2016</xref>; Zhang et al., <xref ref-type="bibr" rid="B39">2016</xref>) showed that, when processing ordinary expressions, patients with depression show an obvious negative bias; they tend to judge happy expressions as neutral (Bourke et al., <xref ref-type="bibr" rid="B4">2010</xref>; Bocharov et al., <xref ref-type="bibr" rid="B3">2017</xref>). The present study showed that under three different background expression conditions (neutral, sad and fear), patients with depression tended to judge happy MEs as neutral, which was consistent with these previous findings. These results also show that the patients&#x02019; judging happy MEs as neutral is a stable phenomenon that does not depend on the background expression. Meanwhile, previous studies (Gollan et al., <xref ref-type="bibr" rid="B13">2008</xref>; Fonseka et al., <xref ref-type="bibr" rid="B10">2016</xref>) have shown that patients with depression tend to judge neutral expressions as sad. The present study showed that patients tended to misjudge neutral MEs under the sad background expression condition as sad, and the misjudgment probability reached a statistically significant level. Therefore, our third hypothesis was confirmed. Based on these results, we conclude that there are two types of negative bias when patients with depression recognize ecological MEs. First, the patients&#x02019; ME recognition depends on negative background expressions: judging neutral MEs under the sad background expression condition as sad suggests that the patients have lower emotional self-control and are affected by the negative background (or environment). 
Second, the patients&#x02019; judging happy MEs as neutral under any background expression condition reflects their lack of pleasant experience (Yang and Jones, <xref ref-type="bibr" rid="B37">2008</xref>; Stuhrmann et al., <xref ref-type="bibr" rid="B33">2013</xref>). These two types of negative bias may be markers of depression. Our findings revealed the unique characteristics of ME recognition in patients with depression and demonstrated the value of studying ecological MEs.</p>
<p>Building on previous studies of ordinary facial expression processing in patients with depression, the present study adopted the ecological MEs recognition paradigm to explore the ME recognition characteristics of patients with depression for the first time. Unlike previous studies using ordinary facial expression tasks, this study extended the task to ecological MEs, which may contribute to a fuller understanding of facial expression processing in patients with depression. Compared with the classical ME recognition paradigm, which considers only the role of a neutral context in expression recognition, the ecological MEs recognition paradigm adopted in this study can simultaneously compare the influence of happy, sad and fearful contexts on ME recognition. Our paradigm is closer to MEs as they appear in daily life and thus has superior ecological validity. It both contributes to an in-depth understanding of the role of context in ecological MEs recognition and may serve as an adjunct diagnostic indicator for depression. Although the diagnosis of depression is primarily clinical, resting on solid psychiatric, biochemical and neurofunctional underpinnings, combining the ecological MEs recognition characteristics of patients, as revealed in this study, with existing psychiatric and other indicators could further enhance the objectivity and comprehensiveness of depression diagnosis.</p>
<p>Previous studies indicated that patients with schizophrenia show abnormal emotional experience and facial expression recognition (Sanchez et al., <xref ref-type="bibr" rid="B31">2014</xref>; Zhu et al., <xref ref-type="bibr" rid="B42">2016</xref>); they experience more negative and less positive emotion than healthy individuals. Meanwhile, the ability to recognize positive expressions was weaker in male patients with schizophrenia than in healthy individuals, while the ability to recognize both positive and negative emotions was similar between female patients with schizophrenia and healthy individuals. Based on the above, the ecological MEs recognition characteristics of patients with schizophrenia could be explored and then compared with their processing of ordinary facial expressions; this line of research could contribute to a further understanding of facial expression processing in patients with schizophrenia. Additionally, the ecological MEs recognition characteristics of patients with depression and patients with schizophrenia could be compared.</p>
</sec>
<sec sec-type="conclusion" id="s5">
<title>Conclusion</title>
<p>Adopting the ecological MEs recognition paradigm, the present study revealed the ecological MEs recognition characteristics of patients with depression for the first time and extended the scope of research on facial expression processing in patients with depression. The results showed that: (1) the ecological MEs recognition of patients with depression can be influenced by background expressions, making research on ecological MEs recognition necessary; (2) patients with depression showed deficits in cognitive function when processing ecological MEs, manifested as slower RTs compared with healthy individuals; and (3) patients with depression showed a negative bias when performing the ecological MEs recognition task: they tended to judge happy MEs as neutral under different background expression conditions and to judge neutral MEs under the sad background expression condition as sad, which may reflect their deficit in recognizing positive emotions and their tendency to generalize sad emotions.</p>
</sec>
<sec id="s6">
<title>Author Contributions</title>
<p>CZ, XC, JZ, ZL, ZT, YX, DZ and DL conceived the study and coordinated the experiments. XC, ZL, YX, DZ and ZT performed the experiment. CZ and XC analyzed the data. CZ wrote the manuscript. CZ, JZ and DL revised the manuscript. All authors read and approved the final manuscript.</p>
</sec>
<sec id="s7">
<title>Conflict of Interest Statement</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</body>
<back>
<fn-group>
<fn fn-type="financial-disclosure">
<p><bold>Funding.</bold> The work was supported by the National Natural Science Foundation of China (Grant No. 31271084).</p>
</fn>
</fn-group>
<ref-list>
<title>References</title>
<ref id="B1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ba&#x0015F;g&#x000F6;ze</surname> <given-names>Z.</given-names></name> <name><surname>G&#x000F6;n&#x000FC;l</surname> <given-names>A. S.</given-names></name> <name><surname>Baskak</surname> <given-names>B.</given-names></name> <name><surname>G&#x000F6;k&#x000E7;ay</surname> <given-names>D.</given-names></name></person-group> (<year>2015</year>). <article-title>Valence-based Word-Face Stroop task reveals differential emotional interference in patients with major depression</article-title>. <source>Psychiatry Res.</source> <volume>229</volume>, <fpage>960</fpage>&#x02013;<lpage>967</lpage>. <pub-id pub-id-type="doi">10.1016/j.psychres.2015.05.099</pub-id><pub-id pub-id-type="pmid">26272019</pub-id></citation></ref>
<ref id="B2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Beck</surname> <given-names>A. T.</given-names></name> <name><surname>Brown</surname> <given-names>G. K.</given-names></name> <name><surname>Steer</surname> <given-names>R. A.</given-names></name></person-group> (<year>1997</year>). <article-title>Psychometric characteristics of the Scale for Suicide Ideation with psychiatric outpatients</article-title>. <source>Behav. Res. Ther.</source> <volume>35</volume>, <fpage>1039</fpage>&#x02013;<lpage>1046</lpage>. <pub-id pub-id-type="doi">10.1016/s0005-7967(97)00073-9</pub-id><pub-id pub-id-type="pmid">9431735</pub-id></citation></ref>
<ref id="B3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bocharov</surname> <given-names>A. V.</given-names></name> <name><surname>Knyazev</surname> <given-names>G. G.</given-names></name> <name><surname>Savostyanov</surname> <given-names>A. N.</given-names></name></person-group> (<year>2017</year>). <article-title>Depression and implicit emotion processing: an EEG study</article-title>. <source>Neurophysiol. Clin.</source> <volume>47</volume>, <fpage>225</fpage>&#x02013;<lpage>230</lpage>. <pub-id pub-id-type="doi">10.1016/j.neucli.2017.01.009</pub-id><pub-id pub-id-type="pmid">28215469</pub-id></citation></ref>
<ref id="B4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bourke</surname> <given-names>C.</given-names></name> <name><surname>Douglas</surname> <given-names>K.</given-names></name> <name><surname>Porter</surname> <given-names>R.</given-names></name></person-group> (<year>2010</year>). <article-title>Processing of facial emotion expression in major depression: a review</article-title>. <source>Aust. N Z J. Psychiatry</source> <volume>44</volume>, <fpage>681</fpage>&#x02013;<lpage>696</lpage>. <pub-id pub-id-type="doi">10.3109/00048674.2010.496359</pub-id><pub-id pub-id-type="pmid">20636189</pub-id></citation></ref>
<ref id="B5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dai</surname> <given-names>Q.</given-names></name> <name><surname>Feng</surname> <given-names>Z.</given-names></name></person-group> (<year>2012</year>). <article-title>More excited for negative facial expressions in depression: evidence from an event-related potential study</article-title>. <source>Clin. Neurophysiol.</source> <volume>123</volume>, <fpage>2172</fpage>&#x02013;<lpage>2179</lpage>. <pub-id pub-id-type="doi">10.1016/j.clinph.2012.04.018</pub-id><pub-id pub-id-type="pmid">22727714</pub-id></citation></ref>
<ref id="B6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dai</surname> <given-names>Q.</given-names></name> <name><surname>Wei</surname> <given-names>J.</given-names></name> <name><surname>Shu</surname> <given-names>X.</given-names></name> <name><surname>Feng</surname> <given-names>Z.</given-names></name></person-group> (<year>2016</year>). <article-title>Negativity bias for sad faces in depression: an event-related potential study</article-title>. <source>Clin. Neurophysiol.</source> <volume>127</volume>, <fpage>3552</fpage>&#x02013;<lpage>3560</lpage>. <pub-id pub-id-type="doi">10.1016/j.clinph.2016.10.003</pub-id><pub-id pub-id-type="pmid">27833064</pub-id></citation></ref>
<ref id="B7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ekman</surname> <given-names>P.</given-names></name></person-group> (<year>2003</year>). <article-title>Darwin, deception, and facial expression</article-title>. <source>Ann. N Y Acad. Sci.</source> <volume>1000</volume>, <fpage>205</fpage>&#x02013;<lpage>221</lpage>. <pub-id pub-id-type="doi">10.1196/annals.1280.010</pub-id><pub-id pub-id-type="pmid">14766633</pub-id></citation></ref>
<ref id="B8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ekman</surname> <given-names>P.</given-names></name> <name><surname>Friesen</surname> <given-names>W. V.</given-names></name></person-group> (<year>1974</year>). <article-title>Detecting deception from the body or face</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>29</volume>, <fpage>288</fpage>&#x02013;<lpage>298</lpage>. <pub-id pub-id-type="doi">10.1037/h0036006</pub-id></citation></ref>
<ref id="B9"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Ekman</surname> <given-names>P.</given-names></name> <name><surname>Friesen</surname> <given-names>W. V.</given-names></name></person-group> (<year>1976</year>). <source>Pictures of Facial Affect.</source> <publisher-loc>Palo Alto, CA</publisher-loc>: <publisher-name>Consulting Psychologists Press</publisher-name>.</citation></ref>
<ref id="B10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fonseka</surname> <given-names>B. A.</given-names></name> <name><surname>Jaworska</surname> <given-names>N.</given-names></name> <name><surname>Courtright</surname> <given-names>A.</given-names></name> <name><surname>MacMaster</surname> <given-names>F. P.</given-names></name> <name><surname>MacQueen</surname> <given-names>G. M.</given-names></name></person-group> (<year>2016</year>). <article-title>Cortical thickness and emotion processing in young adults with mild to moderate depression: a preliminary study</article-title>. <source>BMC Psychiatry</source> <volume>16</volume>:<fpage>38</fpage>. <pub-id pub-id-type="doi">10.1186/s12888-016-0750-8</pub-id><pub-id pub-id-type="pmid">26911621</pub-id></citation></ref>
<ref id="B11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gollan</surname> <given-names>J. K.</given-names></name> <name><surname>Connolly</surname> <given-names>M.</given-names></name> <name><surname>Buchanan</surname> <given-names>A.</given-names></name> <name><surname>Hoxha</surname> <given-names>D.</given-names></name> <name><surname>Rosebrock</surname> <given-names>L.</given-names></name> <name><surname>John</surname> <given-names>C.</given-names></name> <etal/></person-group>. (<year>2015a</year>). <article-title>Neural substrates of negativity bias in women with and without major depression</article-title>. <source>Biol. Psychol.</source> <volume>109</volume>, <fpage>184</fpage>&#x02013;<lpage>191</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsycho.2015.06.003</pub-id><pub-id pub-id-type="pmid">26073417</pub-id></citation></ref>
<ref id="B12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gollan</surname> <given-names>J. K.</given-names></name> <name><surname>Hoxha</surname> <given-names>D.</given-names></name> <name><surname>Hunnicutt-Ferguson</surname> <given-names>K.</given-names></name> <name><surname>Norris</surname> <given-names>C. J.</given-names></name> <name><surname>Rosebrock</surname> <given-names>L.</given-names></name> <name><surname>Sankin</surname> <given-names>L.</given-names></name> <etal/></person-group>. (<year>2015b</year>). <article-title>Twice the negativity bias and half the positivity offset: evaluative responses to emotional information in depression</article-title>. <source>J. Behav. Ther. Exp. Psychiatry</source> <volume>52</volume>, <fpage>166</fpage>&#x02013;<lpage>170</lpage>. <pub-id pub-id-type="doi">10.1016/j.jbtep.2015.09.005</pub-id><pub-id pub-id-type="pmid">26434794</pub-id></citation></ref>
<ref id="B13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gollan</surname> <given-names>J. K.</given-names></name> <name><surname>Pane</surname> <given-names>H. T.</given-names></name> <name><surname>McCloskey</surname> <given-names>M. S.</given-names></name> <name><surname>Coccaro</surname> <given-names>E. F.</given-names></name></person-group> (<year>2008</year>). <article-title>Identifying differences in biased affective information processing in major depression</article-title>. <source>Psychiatry Res.</source> <volume>159</volume>, <fpage>18</fpage>&#x02013;<lpage>24</lpage>. <pub-id pub-id-type="doi">10.1016/j.psychres.2007.06.011</pub-id><pub-id pub-id-type="pmid">18342954</pub-id></citation></ref>
<ref id="B14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hall</surname> <given-names>J. A.</given-names></name> <name><surname>Matsumoto</surname> <given-names>D.</given-names></name></person-group> (<year>2004</year>). <article-title>Gender differences in judgments of multiple emotions from facial expressions</article-title>. <source>Emotion</source> <volume>4</volume>, <fpage>201</fpage>&#x02013;<lpage>206</lpage>. <pub-id pub-id-type="doi">10.1037/1528-3542.4.2.201</pub-id><pub-id pub-id-type="pmid">15222856</pub-id></citation></ref>
<ref id="B15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hurley</surname> <given-names>C. M.</given-names></name> <name><surname>Anker</surname> <given-names>A. E.</given-names></name> <name><surname>Frank</surname> <given-names>M. G.</given-names></name> <name><surname>Matsumoto</surname> <given-names>D.</given-names></name> <name><surname>Hwang</surname> <given-names>H. C.</given-names></name></person-group> (<year>2014</year>). <article-title>Background factors predicting accuracy and improvement in micro expression recognition</article-title>. <source>Mot. Emot.</source> <volume>38</volume>, <fpage>700</fpage>&#x02013;<lpage>714</lpage>. <pub-id pub-id-type="doi">10.1007/s11031-014-9410-9</pub-id></citation></ref>
<ref id="B16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jaworska</surname> <given-names>N.</given-names></name> <name><surname>Yang</surname> <given-names>X.</given-names></name> <name><surname>Knott</surname> <given-names>V.</given-names></name> <name><surname>MacQueen</surname> <given-names>G.</given-names></name></person-group> (<year>2015</year>). <article-title>A review of fMRI studies during visual emotive processing in major depressive disorder</article-title>. <source>World J. Biol. Psychiatry</source> <volume>16</volume>, <fpage>448</fpage>&#x02013;<lpage>471</lpage>. <pub-id pub-id-type="doi">10.3109/15622975.2014.885659</pub-id><pub-id pub-id-type="pmid">24635551</pub-id></citation></ref>
<ref id="B17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kerestes</surname> <given-names>R.</given-names></name> <name><surname>Segreti</surname> <given-names>A. M.</given-names></name> <name><surname>Pan</surname> <given-names>L. A.</given-names></name> <name><surname>Phillips</surname> <given-names>M. L.</given-names></name> <name><surname>Birmaher</surname> <given-names>B.</given-names></name> <name><surname>Brent</surname> <given-names>D. A.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>Altered neural function to happy faces in adolescents with and at risk for depression</article-title>. <source>J. Affect. Disord.</source> <volume>192</volume>, <fpage>143</fpage>&#x02013;<lpage>152</lpage>. <pub-id pub-id-type="doi">10.1016/j.jad.2015.12.013</pub-id><pub-id pub-id-type="pmid">26724693</pub-id></citation></ref>
<ref id="B18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kluczniok</surname> <given-names>D.</given-names></name> <name><surname>Attar</surname> <given-names>C. H.</given-names></name> <name><surname>Fydrich</surname> <given-names>T.</given-names></name> <name><surname>Fuehrer</surname> <given-names>D.</given-names></name> <name><surname>Jaite</surname> <given-names>C.</given-names></name> <name><surname>Domes</surname> <given-names>G.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>Transgenerational effects of maternal depression on affect recognition in children</article-title>. <source>J. Affect. Disord.</source> <volume>189</volume>, <fpage>233</fpage>&#x02013;<lpage>239</lpage>. <pub-id pub-id-type="doi">10.1016/j.jad.2015.09.051</pub-id><pub-id pub-id-type="pmid">26451509</pub-id></citation></ref>
<ref id="B19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kujawa</surname> <given-names>A.</given-names></name> <name><surname>Dougherty</surname> <given-names>L.</given-names></name> <name><surname>Durbin</surname> <given-names>C. E.</given-names></name> <name><surname>Laptook</surname> <given-names>R.</given-names></name> <name><surname>Torpey</surname> <given-names>D.</given-names></name> <name><surname>Klein</surname> <given-names>D. N.</given-names></name></person-group> (<year>2014</year>). <article-title>Emotion recognition in preschool children: associations with maternal depression and early parenting</article-title>. <source>Dev. Psychopathol.</source> <volume>26</volume>, <fpage>159</fpage>&#x02013;<lpage>170</lpage>. <pub-id pub-id-type="doi">10.1017/S0954579413000928</pub-id><pub-id pub-id-type="pmid">24444174</pub-id></citation></ref>
<ref id="B20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lepp&#x000E4;nen</surname> <given-names>J. M.</given-names></name> <name><surname>Milders</surname> <given-names>M.</given-names></name> <name><surname>Bell</surname> <given-names>J. S.</given-names></name> <name><surname>Terriere</surname> <given-names>E.</given-names></name> <name><surname>Hietanen</surname> <given-names>J. K.</given-names></name></person-group> (<year>2004</year>). <article-title>Depression biases the recognition of emotionally neutral faces</article-title>. <source>Psychiatry Res.</source> <volume>128</volume>, <fpage>123</fpage>&#x02013;<lpage>133</lpage>. <pub-id pub-id-type="doi">10.1016/j.psychres.2004.05.020</pub-id><pub-id pub-id-type="pmid">15488955</pub-id></citation></ref>
<ref id="B21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>Y.</given-names></name> <name><surname>Cao</surname> <given-names>D.</given-names></name> <name><surname>Wei</surname> <given-names>L.</given-names></name> <name><surname>Tang</surname> <given-names>Y.</given-names></name> <name><surname>Wang</surname> <given-names>J.</given-names></name></person-group> (<year>2015</year>). <article-title>Abnormal functional connectivity of EEG &#x003B3; band in patients with depression during emotional face processing</article-title>. <source>Clin. Neurophysiol.</source> <volume>126</volume>, <fpage>2078</fpage>&#x02013;<lpage>2089</lpage>. <pub-id pub-id-type="doi">10.1016/j.clinph.2014.12.026</pub-id><pub-id pub-id-type="pmid">25766267</pub-id></citation></ref>
<ref id="B22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Liu</surname> <given-names>W.</given-names></name> <name><surname>Huang</surname> <given-names>J.</given-names></name> <name><surname>Wang</surname> <given-names>L.</given-names></name> <name><surname>Gong</surname> <given-names>Q.</given-names></name> <name><surname>Chan</surname> <given-names>R. C.</given-names></name></person-group> (<year>2012</year>). <article-title>Facial perception bias in patients with major depression</article-title>. <source>Psychiatry Res.</source> <volume>197</volume>, <fpage>217</fpage>&#x02013;<lpage>220</lpage>. <pub-id pub-id-type="doi">10.1016/j.psychres.2011.09.021</pub-id><pub-id pub-id-type="pmid">22357354</pub-id></citation></ref>
<ref id="B23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maniglio</surname> <given-names>R.</given-names></name> <name><surname>Gusciglio</surname> <given-names>F.</given-names></name> <name><surname>Lofrese</surname> <given-names>V.</given-names></name> <name><surname>Belvederi Murri</surname> <given-names>M.</given-names></name> <name><surname>Tamburello</surname> <given-names>A.</given-names></name> <name><surname>Innamorati</surname> <given-names>M.</given-names></name></person-group> (<year>2014</year>). <article-title>Biased processing of neutral facial expressions is associated with depressive symptoms and suicide ideation in individuals at risk for major depression due to affective temperaments</article-title>. <source>Compr. Psychiatry</source> <volume>55</volume>, <fpage>518</fpage>&#x02013;<lpage>525</lpage>. <pub-id pub-id-type="doi">10.1016/j.comppsych.2013.10.008</pub-id><pub-id pub-id-type="pmid">24238931</pub-id></citation></ref>
<ref id="B24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Matsumoto</surname> <given-names>D.</given-names></name> <name><surname>Hwang</surname> <given-names>H. S.</given-names></name></person-group> (<year>2011</year>). <article-title>Evidence for training the ability to read microexpressions of emotion</article-title>. <source>Mot. Emot.</source> <volume>35</volume>, <fpage>181</fpage>&#x02013;<lpage>191</lpage>. <pub-id pub-id-type="doi">10.1007/s11031-011-9212-2</pub-id></citation></ref>
<ref id="B25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Matsumoto</surname> <given-names>D.</given-names></name> <name><surname>Leroux</surname> <given-names>J.</given-names></name> <name><surname>Wilson-Cohn</surname> <given-names>C.</given-names></name> <name><surname>Raroque</surname> <given-names>J.</given-names></name> <name><surname>Kooken</surname> <given-names>K.</given-names></name> <name><surname>Ekman</surname> <given-names>P.</given-names></name> <etal/></person-group>. (<year>2000</year>). <article-title>A new test to measure emotion recognition ability: Matsumoto and Ekman&#x02019;s Japanese and Caucasian Brief Affect Recognition Test (JACBART)</article-title>. <source>J. Nonverbal Behav.</source> <volume>24</volume>, <fpage>179</fpage>&#x02013;<lpage>209</lpage>. <pub-id pub-id-type="doi">10.1023/A:1006668120583</pub-id></citation></ref>
<ref id="B26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Meyers</surname> <given-names>K.</given-names></name> <name><surname>Crane</surname> <given-names>N. A.</given-names></name> <name><surname>O&#x02019;Day</surname> <given-names>R.</given-names></name> <name><surname>Zubieta</surname> <given-names>J.-K.</given-names></name> <name><surname>Giordani</surname> <given-names>B.</given-names></name> <name><surname>Pomerleau</surname> <given-names>C. S.</given-names></name> <etal/></person-group>. (<year>2015</year>). <article-title>Smoking history, and not depression, is related to deficits in detection of happy and sad faces</article-title>. <source>Addict. Behav.</source> <volume>41</volume>, <fpage>210</fpage>&#x02013;<lpage>217</lpage>. <pub-id pub-id-type="doi">10.1016/j.addbeh.2014.10.012</pub-id><pub-id pub-id-type="pmid">25452067</pub-id></citation></ref>
<ref id="B27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Milders</surname> <given-names>M.</given-names></name> <name><surname>Bell</surname> <given-names>S.</given-names></name> <name><surname>Boyd</surname> <given-names>E.</given-names></name> <name><surname>Thomson</surname> <given-names>L.</given-names></name> <name><surname>Mutha</surname> <given-names>R.</given-names></name> <name><surname>Hay</surname> <given-names>S.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>Reduced detection of positive expressions in major depression</article-title>. <source>Psychiatry Res.</source> <volume>240</volume>, <fpage>284</fpage>&#x02013;<lpage>287</lpage>. <pub-id pub-id-type="doi">10.1016/j.psychres.2016.04.075</pub-id><pub-id pub-id-type="pmid">27138819</pub-id></citation></ref>
<ref id="B28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mill</surname> <given-names>A.</given-names></name> <name><surname>Allik</surname> <given-names>J.</given-names></name> <name><surname>Realo</surname> <given-names>A.</given-names></name> <name><surname>Valk</surname> <given-names>R.</given-names></name></person-group> (<year>2009</year>). <article-title>Age-related differences in emotion recognition ability: a cross-sectional study</article-title>. <source>Emotion</source> <volume>9</volume>, <fpage>619</fpage>&#x02013;<lpage>630</lpage>. <pub-id pub-id-type="doi">10.1037/a0016562</pub-id><pub-id pub-id-type="pmid">19803584</pub-id></citation></ref>
<ref id="B29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Robinson</surname> <given-names>L. J.</given-names></name> <name><surname>Gray</surname> <given-names>J. M.</given-names></name> <name><surname>Burt</surname> <given-names>M.</given-names></name> <name><surname>Ferrier</surname> <given-names>I. N.</given-names></name> <name><surname>Gallagher</surname> <given-names>P.</given-names></name></person-group> (<year>2015</year>). <article-title>Processing of facial emotion in bipolar depression and euthymia</article-title>. <source>J. Int. Neuropsychol. Soc.</source> <volume>21</volume>, <fpage>709</fpage>&#x02013;<lpage>721</lpage>. <pub-id pub-id-type="doi">10.1017/S1355617715000909</pub-id><pub-id pub-id-type="pmid">26477679</pub-id></citation></ref>
<ref id="B30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Russell</surname> <given-names>T. A.</given-names></name> <name><surname>Chu</surname> <given-names>E.</given-names></name> <name><surname>Phillips</surname> <given-names>M. L.</given-names></name></person-group> (<year>2006</year>). <article-title>A pilot study to investigate the effectiveness of emotion recognition remediation in schizophrenia using the micro-expression training tool</article-title>. <source>Br. J. Clin. Psychol.</source> <volume>45</volume>, <fpage>579</fpage>&#x02013;<lpage>583</lpage>. <pub-id pub-id-type="doi">10.1348/014466505x90866</pub-id><pub-id pub-id-type="pmid">17076965</pub-id></citation></ref>
<ref id="B31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sanchez</surname> <given-names>A. H.</given-names></name> <name><surname>Lavaysse</surname> <given-names>L. M.</given-names></name> <name><surname>Starr</surname> <given-names>J. N.</given-names></name> <name><surname>Gard</surname> <given-names>D. E.</given-names></name></person-group> (<year>2014</year>). <article-title>Daily life evidence of environment-incongruent emotion in schizophrenia</article-title>. <source>Psychiatry Res.</source> <volume>220</volume>, <fpage>89</fpage>&#x02013;<lpage>95</lpage>. <pub-id pub-id-type="doi">10.1016/j.psychres.2014.07.041</pub-id><pub-id pub-id-type="pmid">25124684</pub-id></citation></ref>
<ref id="B32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schaefer</surname> <given-names>K. L.</given-names></name> <name><surname>Baumann</surname> <given-names>J.</given-names></name> <name><surname>Rich</surname> <given-names>B. A.</given-names></name> <name><surname>Luckenbaugh</surname> <given-names>D. A.</given-names></name> <name><surname>Zarate</surname> <given-names>C. A.</given-names></name></person-group> (<year>2010</year>). <article-title>Perception of facial emotion in adults with bipolar or unipolar depression and controls</article-title>. <source>J. Psychiatr. Res.</source> <volume>44</volume>, <fpage>1229</fpage>&#x02013;<lpage>1235</lpage>. <pub-id pub-id-type="doi">10.1016/j.jpsychires.2010.04.024</pub-id><pub-id pub-id-type="pmid">20510425</pub-id></citation></ref>
<ref id="B33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Stuhrmann</surname> <given-names>A.</given-names></name> <name><surname>Dohm</surname> <given-names>K.</given-names></name> <name><surname>Kugel</surname> <given-names>H.</given-names></name> <name><surname>Zwanzger</surname> <given-names>P.</given-names></name> <name><surname>Redlich</surname> <given-names>R.</given-names></name> <name><surname>Grotegerd</surname> <given-names>D.</given-names></name> <etal/></person-group>. (<year>2013</year>). <article-title>Mood-congruent amygdala responses to subliminally presented facial expressions in major depression: associations with anhedonia</article-title>. <source>J. Psychiatry Neurosci.</source> <volume>38</volume>, <fpage>249</fpage>&#x02013;<lpage>258</lpage>. <pub-id pub-id-type="doi">10.1503/jpn.120060</pub-id><pub-id pub-id-type="pmid">23171695</pub-id></citation></ref>
<ref id="B34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>Z.</given-names></name> <name><surname>Yuan</surname> <given-names>C.</given-names></name> <name><surname>Huang</surname> <given-names>J.</given-names></name> <name><surname>Li</surname> <given-names>Z.</given-names></name> <name><surname>Chen</surname> <given-names>J.</given-names></name> <name><surname>Zhang</surname> <given-names>H.</given-names></name> <etal/></person-group>. (<year>2011</year>). <article-title>Reliability and validity of the Chinese version of the Beck Depression Inventory-II among depression patients</article-title>. <source>Chin. Ment. Health J.</source> <volume>25</volume>, <fpage>476</fpage>&#x02013;<lpage>480</lpage>. <pub-id pub-id-type="doi">10.3969/j.issn.1000-6729.2011.06.014</pub-id></citation></ref>
<ref id="B35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wu</surname> <given-names>X.</given-names></name> <name><surname>Chen</surname> <given-names>J.</given-names></name> <name><surname>Jia</surname> <given-names>T.</given-names></name> <name><surname>Ma</surname> <given-names>W.</given-names></name> <name><surname>Zhang</surname> <given-names>Y.</given-names></name> <name><surname>Deng</surname> <given-names>Z.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>Cognitive bias by gender interaction on N170 response to emotional facial expressions in major and minor depression</article-title>. <source>Brain Topogr.</source> <volume>29</volume>, <fpage>232</fpage>&#x02013;<lpage>242</lpage>. <pub-id pub-id-type="doi">10.1007/s10548-015-0444-4</pub-id><pub-id pub-id-type="pmid">26239020</pub-id></citation></ref>
<ref id="B36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yan</surname> <given-names>W.</given-names></name> <name><surname>Wu</surname> <given-names>Q.</given-names></name> <name><surname>Liang</surname> <given-names>J.</given-names></name> <name><surname>Chen</surname> <given-names>Y.-H.</given-names></name> <name><surname>Fu</surname> <given-names>X.</given-names></name></person-group> (<year>2013</year>). <article-title>How fast are the leaked facial expressions: the duration of micro-expressions</article-title>. <source>J. Nonverbal Behav.</source> <volume>37</volume>, <fpage>217</fpage>&#x02013;<lpage>230</lpage>. <pub-id pub-id-type="doi">10.1007/s10919-013-0159-8</pub-id></citation></ref>
<ref id="B37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yang</surname> <given-names>F. M.</given-names></name> <name><surname>Jones</surname> <given-names>R. N.</given-names></name></person-group> (<year>2008</year>). <article-title>Measurement differences in depression: chronic health-related and sociodemographic effects in older Americans</article-title>. <source>Psychosom. Med.</source> <volume>70</volume>, <fpage>993</fpage>&#x02013;<lpage>1004</lpage>. <pub-id pub-id-type="doi">10.1097/psy.0b013e31818ce4fa</pub-id><pub-id pub-id-type="pmid">18981269</pub-id></citation></ref>
<ref id="B38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yin</surname> <given-names>M.</given-names></name> <name><surname>Zhang</surname> <given-names>J.</given-names></name> <name><surname>Shi</surname> <given-names>A.</given-names></name> <name><surname>Liu</surname> <given-names>D.</given-names></name></person-group> (<year>2016</year>). <article-title>Characteristics, recognition, training of microexpressions and their influence factors</article-title>. <source>Adv. Psychol. Sci.</source> <volume>24</volume>, <fpage>1723</fpage>&#x02013;<lpage>1736</lpage>. <pub-id pub-id-type="doi">10.3724/sp.j.1042.2016.01723</pub-id></citation></ref>
<ref id="B39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>D.</given-names></name> <name><surname>He</surname> <given-names>Z.</given-names></name> <name><surname>Chen</surname> <given-names>Y.</given-names></name> <name><surname>Zhao</surname> <given-names>W.</given-names></name></person-group> (<year>2016</year>). <article-title>Deficits of unconscious emotional processing in patients with major depression: an ERP study</article-title>. <source>J. Affect. Disord.</source> <volume>199</volume>, <fpage>13</fpage>&#x02013;<lpage>20</lpage>. <pub-id pub-id-type="doi">10.1016/j.jad.2016.03.056</pub-id><pub-id pub-id-type="pmid">27057648</pub-id></citation></ref>
<ref id="B40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>J.</given-names></name> <name><surname>Lu</surname> <given-names>L.</given-names></name> <name><surname>Yin</surname> <given-names>M.</given-names></name> <name><surname>Zhu</surname> <given-names>C.</given-names></name> <name><surname>Huang</surname> <given-names>C.</given-names></name> <name><surname>Liu</surname> <given-names>D.</given-names></name></person-group> (<year>2017</year>). <article-title>The establishment of ecological microexpressions recognition test: an improvement on JACBART microexpressions recognition test</article-title>. <source>Acta Psychol. Sin.</source> <volume>49</volume>, <fpage>886</fpage>&#x02013;<lpage>896</lpage>. <pub-id pub-id-type="doi">10.3724/sp.j.1041.2017.00886</pub-id></citation></ref>
<ref id="B41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>M.</given-names></name> <name><surname>Fu</surname> <given-names>Q.</given-names></name> <name><surname>Chen</surname> <given-names>Y.-H.</given-names></name> <name><surname>Fu</surname> <given-names>X.</given-names></name></person-group> (<year>2014</year>). <article-title>Emotional context influences micro-expression recognition</article-title>. <source>PLoS One</source> <volume>9</volume>:<fpage>e95018</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pone.0095018</pub-id><pub-id pub-id-type="pmid">24736491</pub-id></citation></ref>
<ref id="B42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhu</surname> <given-names>C.</given-names></name> <name><surname>Li</surname> <given-names>P.</given-names></name> <name><surname>Luo</surname> <given-names>W.</given-names></name> <name><surname>Qi</surname> <given-names>Z.</given-names></name> <name><surname>He</surname> <given-names>W.</given-names></name></person-group> (<year>2016</year>). <article-title>Emotion regulation in schizophrenia</article-title>. <source>Adv. Psychol. Sci.</source> <volume>24</volume>, <fpage>556</fpage>&#x02013;<lpage>572</lpage>. <pub-id pub-id-type="doi">10.3724/SP.J.1042.2016.00556</pub-id></citation></ref>
</ref-list>
<glossary>
<def-list>
<title>Abbreviations</title>
<def-item><term>ACC</term><def><p>accuracy</p></def></def-item>
<def-item><term>ME</term><def><p>micro-expression</p></def></def-item>
<def-item><term>RT</term><def><p>reaction time</p></def></def-item>
</def-list>
</glossary>
</back>
</article>
