<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="research-article" dtd-version="2.3">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2021.638398</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>The Effect of the Intensity of Happy Expression on Social Perception of Chinese Faces</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Li</surname>
<given-names>Yaning</given-names>
</name>
<xref rid="aff1" ref-type="aff"><sup>1</sup></xref>
<xref rid="aff2" ref-type="aff"><sup>2</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1122347/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Jiang</surname>
<given-names>Zhongqing</given-names>
</name>
<xref rid="aff2" ref-type="aff"><sup>2</sup></xref>
<xref rid="c001" ref-type="corresp"><sup>&#x002A;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/368490/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Yang</surname>
<given-names>Yisheng</given-names>
</name>
<xref rid="aff1" ref-type="aff"><sup>1</sup></xref>
<xref rid="c002" ref-type="corresp"><sup>&#x002A;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1355803/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Leng</surname>
<given-names>Haizhou</given-names>
</name>
<xref rid="aff3" ref-type="aff"><sup>3</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/609892/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Pei</surname>
<given-names>Fuhua</given-names>
</name>
<xref rid="aff1" ref-type="aff"><sup>1</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1363311/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Wu</surname>
<given-names>Qi</given-names>
</name>
<xref rid="aff2" ref-type="aff"><sup>2</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1355691/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>School of Psychology, Inner Mongolia Normal University</institution>, <addr-line>Hohhot</addr-line>, <country>China</country></aff>
<aff id="aff2"><sup>2</sup><institution>College of Psychology, Liaoning Normal University</institution>, <addr-line>Dalian</addr-line>, <country>China</country></aff>
<aff id="aff3"><sup>3</sup><institution>Department of Elementary Education, Hebei Normal University</institution>, <addr-line>Shijiazhuang</addr-line>, <country>China</country></aff>
<author-notes>
<fn id="fn1" fn-type="edited-by"><p>Edited by: Fernando Barbosa, University of Porto, Portugal</p></fn>
<fn id="fn2" fn-type="edited-by"><p>Reviewed by: Patrice Rusconi, University of Messina, Italy; Beatrice Biancardi, TELECOM ParisTech, France</p></fn>
<corresp id="c001">&#x002A;Correspondence: Zhongqing Jiang, <email>jzqcjj@hotmail.com</email></corresp>
<corresp id="c002">Yisheng Yang, <email>yangys1965@163.com</email></corresp>
<fn id="fn3" fn-type="other"><p>This article was submitted to Personality and Social Psychology, a section of the journal Frontiers in Psychology</p></fn>
</author-notes>
<pub-date pub-type="epub">
<day>14</day>
<month>06</month>
<year>2021</year>
</pub-date>
<pub-date pub-type="collection">
<year>2021</year>
</pub-date>
<volume>12</volume>
<elocation-id>638398</elocation-id>
<history>
<date date-type="received">
<day>06</day>
<month>12</month>
<year>2020</year>
</date>
<date date-type="accepted">
<day>17</day>
<month>05</month>
<year>2021</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2021 Li, Jiang, Yang, Leng, Pei and Wu.</copyright-statement>
<copyright-year>2021</copyright-year>
<copyright-holder>Li, Jiang, Yang, Leng, Pei and Wu</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract>
<p>Numerous studies have shown that facial expressions influence trait impressions in the Western context. There are cultural differences in the perception and recognition rules of different intensities of happy expressions, and researchers have only explored the influence of the intensity of happy expressions on a few facial traits (warmth, trustworthiness, and competence). Therefore, we examined the effect of different intensities of Chinese happy expressions on the social perception of faces across 11 traits, namely trustworthiness, responsibility, attractiveness, sociability, confidence, intelligence, aggressiveness, dominance, competence, warmth, and tenacity. In this study, participants were asked to view a series of photographs of faces with high-intensity or low-intensity happy expressions and rate the 11 traits on a 7-point Likert scale (1 = &#x201C;<italic>not very &#x00D7;&#x00D7;</italic>,&#x201D; 7 = &#x201C;<italic>very &#x00D7;&#x00D7;</italic>&#x201D;). The results indicated that the high-intensity happy expression received higher ratings for sociability and warmth but lower ratings for dominance, aggressiveness, intelligence, and competence than the low-intensity happy expression; there was no significant difference between the two intensities in the ratings for trustworthiness, attractiveness, responsibility, confidence, and tenacity. These results suggest that, compared to the low-intensity happy expression, the high-intensity happy expression enhances the perception of traits related to approachability, reduces the perception of traits related to capability, and has no significant effect on trustworthiness, attractiveness, responsibility, confidence, and tenacity.</p>
</abstract>
<kwd-group>
<kwd>happy expression</kwd>
<kwd>social perception</kwd>
<kwd>intensity</kwd>
<kwd>Chinese faces</kwd>
<kwd>trait impression</kwd>
</kwd-group>
<contract-num rid="cn1">19YJA850014</contract-num>
<contract-sponsor id="cn1">Ministry of Education<named-content content-type="fundref-id">10.13039/501100002701</named-content>
</contract-sponsor>
<counts>
<fig-count count="1"/>
<table-count count="9"/>
<equation-count count="0"/>
<ref-count count="72"/>
<page-count count="13"/>
<word-count count="10820"/>
</counts>
</article-meta>
</front>
<body>
<sec id="sec1" sec-type="intro">
<title>Introduction</title>
<p>Cultural wisdom warns us not to judge a book by its cover. This warning suggests that the natural inclination is to judge people by their appearance. Indeed, when meeting strangers for the first time, people infer many characteristics about them from facial information (e.g., facial expressions), even within 34 ms (<xref ref-type="bibr" rid="ref67">Willis and Todorov, 2006</xref>; <xref ref-type="bibr" rid="ref57">Todorov et al., 2015</xref>). This inference process is called &#x201C;social perception of faces&#x201D; (<xref ref-type="bibr" rid="ref48">Oosterhof and Todorov, 2008</xref>), and its results can affect people&#x2019;s decisions, such as mate selection (<xref ref-type="bibr" rid="ref47">Olivola et al., 2014</xref>; <xref ref-type="bibr" rid="ref60">Valentine et al., 2014</xref>), trial outcomes (<xref ref-type="bibr" rid="ref68">Wilson and Rule, 2015</xref>; <xref ref-type="bibr" rid="ref32">Jaeger et al., 2020</xref>), and election outcomes (<xref ref-type="bibr" rid="ref44">Na and Huh, 2016</xref>; <xref ref-type="bibr" rid="ref69">Wong and Zeng, 2017</xref>).</p>
<sec id="sec2">
<title>Cultural Similarity and Difference in the Social Perception of Faces</title>
<p>Recently, researchers have started to model the structure underlying the social perception of faces. <xref ref-type="bibr" rid="ref48">Oosterhof and Todorov (2008)</xref> used the trait assessment task to identify two evaluative dimensions: (1) valence, related to approach-avoidance, and (2) dominance, related to physical strength-weakness. Based on principal component analysis, the trustworthiness score can serve as a representative of the valence dimension, which refers to the behavioral intention of the target face to benefit or harm others, whereas the dominance dimension refers to the ability of the target face to harm others (<xref ref-type="bibr" rid="ref48">Oosterhof and Todorov, 2008</xref>). <xref ref-type="bibr" rid="ref70">Wu et al. (2020)</xref> recruited local Chinese participants and used the trait assessment task to identify an approach-avoidance dimension, which holds cross-culturally, as well as a broader &#x201C;capability&#x201D; dimension that included dominance and tenacity, related to physical and intellectual strength. The &#x201C;capability&#x201D; dimension is crucial for individuals to survive, obtain resources, and attain high social status, and it might be considered more typical of collectivist societies such as China.</p>
<p>Additionally, the top-down stereotype content model has established that perceived warmth and competence are the two universal dimensions of human social cognition at both the individual and group levels. The warmth dimension includes traits that relate to perceived intent, which aligns with the approach-avoidance dimension that includes trustworthiness (<xref ref-type="bibr" rid="ref21">Fiske et al., 2007</xref>). However, some researchers have proposed the &#x201C;morality differentiation hypothesis,&#x201D; which holds that trustworthiness and warmth are separate dimensions. These researchers define trustworthiness, which relates to morality, as the behavioral intention to categorize others as either enemies or friends. Conversely, warmth, considered unrelated to morality, has been defined as the proficiency of an individual in recruiting support for their intentions (<xref ref-type="bibr" rid="ref24">Goodwin et al., 2014</xref>; <xref ref-type="bibr" rid="ref38">Landy et al., 2016</xref>; <xref ref-type="bibr" rid="ref46">Oliveira et al., 2020</xref>). Others do not argue strongly for such a distinction and instead treat trustworthiness and sociability as subcomponents of the warmth dimension; even so, they view morality-related trustworthiness as distinct from, and primary relative to, sociability. Sociability implies being benevolent to people in ways that facilitate affectionate relations with them, whereas trustworthiness refers to being benevolent to people in ways that facilitate correct and principled relations with them (<xref ref-type="bibr" rid="ref5">Brambilla et al., 2011</xref>, <xref ref-type="bibr" rid="ref6">2012</xref>; <xref ref-type="bibr" rid="ref4">Brambilla and Leach, 2014</xref>). However, because the stereotype content model (a two-dimensional theory) agglomerates moral and amoral traits within a single dimension, it does not predict that the moral relevance of traits (as opposed to their warmth relevance) has any special importance for person perception, and the omission of this information from two-dimensional models may therefore lead to a loss of predictive power (<xref ref-type="bibr" rid="ref24">Goodwin et al., 2014</xref>).</p>
<p>Similar to the warmth-competence stereotype content model, the approach-avoidance dimension in the social perception of faces also agglomerates moral and amoral traits (e.g., trustworthiness and sociability) within a single dimension (<xref ref-type="bibr" rid="ref48">Oosterhof and Todorov, 2008</xref>; <xref ref-type="bibr" rid="ref57">Todorov et al., 2015</xref>; <xref ref-type="bibr" rid="ref56">Todorov and Oh, 2021</xref>); this might likewise obscure the moral relevance and warmth relevance of traits and fail to reflect their special importance for person perception. Therefore, in the present study, we used the trait assessment task to rate multiple traits rather than dimensions and explored the effects of happy expression intensity on the social perception of Chinese faces, which provides richer information about the social perception of Chinese faces. This offers a novel perspective for exploring the &#x201C;morality differentiation hypothesis&#x201D; in the study of first impressions of unfamiliar faces in the Chinese context.</p>
</sec>
<sec id="sec3">
<title>The Effect of Happy Expressions on the Social Perception of Faces</title>
<p>Facial cues in the social perception of faces include immutable (e.g., identity, gender, and race) and variable (e.g., expressions) cues (<xref ref-type="bibr" rid="ref26">Haxby et al., 2000</xref>). In contrast to immutable cues, variable facial expressions provide critical cues as the social perception of faces is formed (<xref ref-type="bibr" rid="ref54">Sutherland et al., 2017</xref>). In daily life, happy and neutral expressions are the expressions most frequently seen on people&#x2019;s faces. Compared to neutral expressions, happy expressions increase face value in interpersonal communication, resulting in a halo effect: the tendency for an individual&#x2019;s positive traits to &#x201C;overflow&#x201D; into additional trait areas in others&#x2019; perceptions of them (<xref ref-type="bibr" rid="ref55">Thompson and Meltzer, 1964</xref>). Smiling faces have been rated as more trustworthy, attractive, and popular (<xref ref-type="bibr" rid="ref27">Hehman et al., 2019</xref>; <xref ref-type="bibr" rid="ref40">Li et al., 2020</xref>), and less aggressive (<xref ref-type="bibr" rid="ref48">Oosterhof and Todorov, 2008</xref>). Previous research indicates that facial expressions influence the perception of single specific dimensions such as trustworthiness (<xref ref-type="bibr" rid="ref9">Caulfield et al., 2016</xref>; <xref ref-type="bibr" rid="ref52">Sandy et al., 2017</xref>), dominance (<xref ref-type="bibr" rid="ref35">Kim et al., 2016</xref>; <xref ref-type="bibr" rid="ref59">Ueda and Yoshikawa, 2018</xref>), warmth (<xref ref-type="bibr" rid="ref65">Wang et al., 2017</xref>), and capability (<xref ref-type="bibr" rid="ref1">Beall, 2007</xref>; <xref ref-type="bibr" rid="ref22">Gao et al., 2016</xref>). However, few studies have directly evaluated expression effects on multiple traits. Referring to the research of <xref ref-type="bibr" rid="ref70">Wu et al. (2020)</xref>, <xref ref-type="bibr" rid="ref40">Li et al. (2020)</xref> were the first to directly compare the effects of happy expressions on multiple traits of Chinese faces. The results indicated that the evaluation scores of trustworthiness and warmth for happy facial expressions differed, which supported the &#x201C;morality differentiation hypothesis.&#x201D; These results indicate that it is necessary to explore the effects of happy expressions on multiple traits rather than only on single dimensions of the social perception of faces.</p>
<p>In the context of Western culture, mounting evidence indicates that happy expressions of different intensities convey different types of social information. Researchers believe that the intensity of an expression corresponds to the intensity of the associated behavioral tendency (<xref ref-type="bibr" rid="ref18">Ekman et al., 1980</xref>). Studies have reported that, compared with neutral facial expressions, happy facial expressions at different intensities (25 and 50%) increase the trustworthiness scores given by children older than 10 years, and that the degree of influence increases proportionally with emotional intensity (<xref ref-type="bibr" rid="ref29">Hess et al., 2000</xref>; <xref ref-type="bibr" rid="ref10">Caulfield et al., 2014</xref>, <xref ref-type="bibr" rid="ref9">2016</xref>). Furthermore, compared to a low-intensity smile, a high-intensity, teeth-showing smile signals a stronger friendly and approachable behavioral tendency, enhancing the perceived affability of the individual. When people are eager to build cooperative relationships with others (<xref ref-type="bibr" rid="ref42">Mehu et al., 2008</xref>; <xref ref-type="bibr" rid="ref2">Bell et al., 2017</xref>) or are in search of harmonious interpersonal relationships (<xref ref-type="bibr" rid="ref28">Hennig-Thurau et al., 2006</xref>), they tend to display a wider smile. Rhesus monkeys also display a toothy smile in subordinate contexts, a defensive gesture signaling friendly intent (<xref ref-type="bibr" rid="ref13">de Waal and Luttrell, 1985</xref>), whereas the bared-teeth display of chimpanzees communicates a benign and non-aggressive intent in affiliative contexts (<xref ref-type="bibr" rid="ref49">Parr and Waller, 2006</xref>). Thus, positive traits associated with sociality (e.g., trustworthiness, submissiveness, and warmth) are positively correlated with the intensity of happy expressions. However, broad grins are also considered to signal incapability. For example, professional fighters who smile broadly in pre-match photos are perceived to be less aggressive, less dominant, and more likely to lose than fighters with low-intensity smiles (<xref ref-type="bibr" rid="ref36">Kraus and Chen, 2013</xref>).</p>
<p>While many studies have examined the influence of the intensity of happy expressions on the social perception of faces in Western culture, there are several compelling reasons to study how expression intensity influences face perception in Eastern cultures. First, cultural differences exist in the frequency and display rules of happy expressions. For example, when comparing photos of Western and Eastern leaders before and after elections, it was found that, regardless of the election results, Western leaders presented high-intensity smiles, while Eastern leaders presented calm, weak smiles (<xref ref-type="bibr" rid="ref58">Tsai et al., 2016</xref>; <xref ref-type="bibr" rid="ref19">Fang et al., 2019</xref>). Furthermore, when conveying happy expressions <italic>via</italic> texting, Westerners often use a colon and parenthesis, as in :-) or :), which exaggerate the mouth and minimize the eyes. In contrast, Easterners often use emoticons such as (^.^) or (^_^), in which the mouth is simplified but the eyes are expressive (<xref ref-type="bibr" rid="ref41">Liu et al., 2010</xref>). Second, cultural differences also exist in the interpretation of happy expressions. For example, Chinese people believe that a smiling face signals emotional instability, while Americans do not (<xref ref-type="bibr" rid="ref61">Walker et al., 2011</xref>). Third, although the &#x201C;approachability&#x201D; dimension displays cross-cultural consistency (<xref ref-type="bibr" rid="ref53">Sutherland et al., 2018</xref>; <xref ref-type="bibr" rid="ref70">Wu et al., 2020</xref>; <xref ref-type="bibr" rid="ref34">Jones et al., 2021</xref>), contradictory perspectives exist regarding how the meanings of trustworthiness and warmth within the &#x201C;approachability&#x201D; dimension should be interpreted. In the context of Western culture, some researchers believed that the meanings of these two traits are similar (<xref ref-type="bibr" rid="ref21">Fiske et al., 2007</xref>; <xref ref-type="bibr" rid="ref63">Wang et al., 2019</xref>), while others supported the &#x201C;morality differentiation hypothesis&#x201D; (<xref ref-type="bibr" rid="ref24">Goodwin et al., 2014</xref>; <xref ref-type="bibr" rid="ref38">Landy et al., 2016</xref>). Trustworthiness focuses on morality, while warmth focuses on social interaction (<xref ref-type="bibr" rid="ref62">Wang and Cui, 2003</xref>), a distinction supported by comparing trait scores for happy and neutral expressions (<xref ref-type="bibr" rid="ref40">Li et al., 2020</xref>). This suggests that displaying happy expressions might be a way to separate the two traits, but this possibility has not yet been explored in depth. Additionally, content differences exist between the &#x201C;capability&#x201D; dimension of the social perception of Chinese faces model and the &#x201C;dominance&#x201D; dimension of the valence-dominance model. Fourth, in previous studies, researchers used compositing software that blended images of neutral and happy facial expressions in different proportions to form experimental materials of two different physical intensities (25 and 50%; <xref ref-type="bibr" rid="ref10">Caulfield et al., 2014</xref>, <xref ref-type="bibr" rid="ref9">2016</xref>). For example, a 25% happy expression was a 75/25 combination of neutral and happy expressions. The researchers then used these materials to investigate how the intensity of happy expressions affected the social perception of faces. However, the physical intensity of a happy expression does not strictly correspond to its perceived emotional intensity (<xref ref-type="bibr" rid="ref29">Hess et al., 2000</xref>; <xref ref-type="bibr" rid="ref9">Caulfield et al., 2016</xref>). Moreover, the composite images likely differed from the natural faces that participants encounter in daily life and therefore might not have matched the mental representations of the participants (<xref ref-type="bibr" rid="ref30">Hu et al., 2018</xref>). It is thus necessary to compare the influence of different intensities of happy expressions on the social perception of faces in the Chinese context using more natural photos.</p>
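<p>The compositing procedure used in the earlier studies (e.g., a 25% happy expression formed as a 75/25 combination of neutral and happy images) amounts to a per-pixel weighted average. The following Python sketch illustrates that idea only; it is not the software used in the cited studies, and the image arrays and function name are hypothetical.</p>

```python
import numpy as np

def blend_expressions(neutral, happy, happy_ratio):
    """Linearly blend two aligned face images of the same shape.

    happy_ratio = 0.25 reproduces the 75/25 neutral/happy mix
    described in the text. Real morphing software additionally
    warps facial landmarks; this sketch blends pixel values only.
    """
    if not 0.0 <= happy_ratio <= 1.0:
        raise ValueError("happy_ratio must be in [0, 1]")
    return (1.0 - happy_ratio) * neutral + happy_ratio * happy

# Hypothetical 2 x 2 grayscale "images"
neutral_img = np.zeros((2, 2))
happy_img = np.full((2, 2), 100.0)
low_intensity = blend_expressions(neutral_img, happy_img, 0.25)
```

As the text notes, such pixel-blended composites may not match the natural faces that participants encounter in daily life, which motivates the use of posed photographs in the present study.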
</sec>
<sec id="sec4">
<title>The Present Study</title>
<p>In the present study, we investigated the effect of different intensities of happy expressions on the social perception of Chinese faces, which has not been previously addressed. We selected a series of high- and low-intensity happy face images. Participants were asked to rate these face images according to the traits of trustworthiness, responsibility, attractiveness, sociability, confidence, intelligence, aggressiveness, dominance, competence, warmth, and tenacity. These traits were derived from the study by <xref ref-type="bibr" rid="ref70">Wu et al. (2020)</xref>, and we chose 11 of them instead of 15 for the following reasons: these 11 traits had high internal consistency and overlapped with the traits included in the studies of <xref ref-type="bibr" rid="ref21">Fiske et al. (2007)</xref> and <xref ref-type="bibr" rid="ref48">Oosterhof and Todorov (2008)</xref>, so they could be used as representative traits in the study of the social perception of faces. The four traits of masculinity, femininity, emotional stability, and likeability were not included. The traits of masculinity and femininity were excluded because <xref ref-type="bibr" rid="ref70">Wu et al. (2020)</xref> performed the principal component analysis of traits without the femininity and masculinity ratings. The trait of emotional stability was excluded due to low internal consistency (<xref ref-type="bibr" rid="ref40">Li et al., 2020</xref>). Likeability was excluded to avoid overlap with sociability (<xref ref-type="bibr" rid="ref40">Li et al., 2020</xref>; <xref ref-type="bibr" rid="ref7">Brambilla et al., 2021</xref>). Despite the cross-cultural consensus regarding the meaning of the &#x201C;approachability&#x201D; dimension, based on the &#x201C;morality differentiation hypothesis&#x201D; we hypothesized that the intensity of happy expressions would have different effects on trustworthiness-related traits and warmth-related traits (Hypothesis 1). 
In addition, since Chinese leaders presented calm, weak smiles in political elections, we hypothesized that low-intensity happy expressions would be rated as more capable than high-intensity happy expressions (Hypothesis 2).</p>
</sec>
</sec>
<sec id="sec5" sec-type="materials|methods">
<title>Materials and Methods</title>
<sec id="sec6">
<title>Participants</title>
<p>A total of 32 Chinese college students aged 18&#x2013;25 years (16 males and 16 females, mean age 22.06 &#x00B1; 2.17 years) from Liaoning Normal University participated in the face photo trait-rating experiment. All participants reported normal or corrected-to-normal visual acuity and normal color vision, claimed to be free of current and previous neurological and psychiatric disorders, and were not using psychotropic medication at the time of the study. All participants were right-handed according to a self-report questionnaire. According to <xref ref-type="bibr" rid="ref8">Brysbaert (2019)</xref>, a sample size of <italic>N</italic> = 27 is appropriate for a 2 &#x00D7; 2 repeated-measures ANOVA when the focus is on the main effect of only one variable. Since the present study focused only on the main effect of happy expression intensity, a <italic>post hoc</italic> power analysis was performed using the G&#x2217;Power software. The analysis indicated that the sample of the study (<italic>N</italic> = 32) was sufficient to detect an effect size of <italic>f</italic> = 0.40 (medium effect) with a power of 1 &#x2212; &#x03B2; = 0.8 (<xref ref-type="bibr" rid="ref8">Brysbaert, 2019</xref>). All participants provided written informed consent and were paid CHN&#x00A5;40 for their participation in the 1 h experiment. The study was approved in advance by the Academic Ethics Committee of Liaoning Normal University.</p>
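<p>For readers who wish to reproduce this kind of power calculation programmatically, the fixed-effects ANOVA computation implemented by G&#x2217;Power-style tools can be sketched with the noncentral <italic>F</italic> distribution. This is an illustrative between-subjects sketch only, not the exact repeated-measures calculation reported above: repeated-measures power additionally depends on the correlation among repeated measures, and the function name here is hypothetical.</p>

```python
from scipy.stats import f as f_dist, ncf

def anova_power(effect_f, n_total, k_groups, alpha=0.05):
    """Power of a fixed-effects one-way ANOVA for Cohen's f.

    Uses the noncentral F distribution with noncentrality
    parameter lambda = f**2 * N, as in the standard fixed-effects
    ANOVA power formula. Repeated-measures designs (as in this
    study) gain additional power from the correlation among
    repeated measures, which this between-subjects sketch ignores.
    """
    df1 = k_groups - 1
    df2 = n_total - k_groups
    ncp = effect_f ** 2 * n_total
    crit = f_dist.ppf(1.0 - alpha, df1, df2)  # critical F under H0
    return float(1.0 - ncf.cdf(crit, df1, df2, ncp))

power_32 = anova_power(0.40, 32, 2)  # power at the study's N
power_64 = anova_power(0.40, 64, 2)  # power grows with N
```

Because power is monotone in the noncentrality parameter, doubling the sample at a fixed effect size always increases the computed power.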
</sec>
<sec id="sec7">
<title>Stimuli</title>
<sec id="sec8">
<title>Stimuli Development</title>
<p>A total of 76 smiling face photos were randomly selected from the Taiwan Facial Expression Image Database (TFEID; <xref ref-type="bibr" rid="ref11">Chen and Yen, 2007</xref>). These photos depicted 38 Chinese individuals (19 males and 19 females), with two photographs of each individual: one depicting a high-intensity smile and the other a low-intensity smile.</p>
<p>To obtain enough stimuli for the formal experiment, another 56 smiling face images were collected by photographing 28 additional Chinese college students (14 males and 14 females, mean age = 24.46 &#x00B1; 1.45 years) using the procedure defined by <xref ref-type="bibr" rid="ref11">Chen and Yen (2007)</xref> for the TFEID. Before their photos were taken, the participants were shown sample photos of happy expressions of different intensities (high-intensity and low-intensity; selected from the TFEID, facial recognition rate &#x003E;90%). The sample photos of happy expressions were produced according to the instructions of the Facial Action Coding System (FACS; <xref ref-type="bibr" rid="ref16">Ekman et al., 2002</xref>). The facial movements of a happy expression include (a) pushed-up cheeks, in which the skin gathers under the eye and the eye aperture narrows, and (b) pulled-up lip corners. Prior literature has established that, at a muscular level, smile intensity is indicated by the amplitude of the zygomatic major movement (the muscle group responsible for pulling the lips upwards; <xref ref-type="bibr" rid="ref17">Ekman, 1993</xref>). Happy expressions of different intensities differ mainly in their levels of zygomatic major movement: a low-intensity happy expression displays a slight contraction of the zygomatic major, which is not enough to show the teeth, whereas a high-intensity happy expression involves an intense contraction of the zygomatic major, which leads to a toothy smile. Participants relaxed their facial muscles and produced the corresponding expressions by imitating the facial muscle movements of the happy expressions of different intensities depicted in the sample photos. The location and lighting were identical, the participants wore the same clothes (a white lab coat), the camera parameters were fixed (ISO 1600, 1/100 s, F/4.5), the camera was parallel to the participants&#x2019; faces, and the distance from the camera to the participant was 150 cm. 
After taking the photos, the image standardization process was also conducted according to the criteria of TFEID using Adobe Photoshop (Adobe, 2018) to remove the hair, ears, neck, accessories, and other external features, leaving only facial information. The unified image size was 480 pixels &#x00D7; 600 pixels, and a 4.05 cm &#x00D7; 5.85 cm black circle was applied around each face so that each face only displayed the internal information of the face, such as the eyebrows, eyes, nose, and mouth (example shown in <xref rid="fig1" ref-type="fig">Figure 1</xref>).</p>
<fig position="float" id="fig1">
<label>Figure 1</label>
<caption><p>Example of the experimental picture. <bold>(A)</bold> High-intensity happy expression. <bold>(B)</bold> Low-intensity happy expression.</p></caption>
<graphic xlink:href="fpsyg-12-638398-g001.tif"/>
</fig>
</sec>
<sec id="sec9">
<title>Stimuli Validation</title>
<p>To ensure that the photos presented happy expressions with a high- or low-intensity smile, a screening assessment was conducted. An additional 30 college students were recruited (15 males and 15 females, mean age = 22.17 &#x00B1; 2.45 years) to assess all 132 photos, i.e., those selected from the TFEID and those taken by the researchers&#x2019; lab. All participants were unfamiliar with the faces in the photos.</p>
<p>The assessment program was built and presented in E-Prime 2.0 (PST, 2013) and comprised two phases: a practice phase (eight trials) and a formal phase (132 trials). The procedure was identical in both phases. For the practice phase, eight additional photos of Chinese people were selected from the TFEID; these were not used in the formal phase. Each of the eight photos corresponded to one of eight expression types: anger, sadness, fear, happiness, disgust, surprise, contempt, and neutral. For the formal phase, the stimuli were those selected from the TFEID and those taken in the lab. The participants were tasked with judging the expression type and rating the intensity level of each face. Once participants reached 90% accuracy in judging the facial expression type during the practice phase, they proceeded to the formal phase; participants who failed to reach the 90% threshold repeated the practice phase. The average number of practice trials was eight.</p>
<p>In each trial, a fixation point was first presented for 1,000 ms, after which the facial expression photos were presented in random order. For each face photo, the participants were first requested to classify the emotion type of the face by pressing one of eight emotion-labeled keys on a numeric keypad (1 for &#x201C;angry,&#x201D; 2 for &#x201C;sad,&#x201D; 3 for &#x201C;fear,&#x201D; 4 for &#x201C;happy,&#x201D; 5 for &#x201C;disgust,&#x201D; 6 for &#x201C;surprise,&#x201D; 7 for &#x201C;contempt,&#x201D; and 8 for &#x201C;neutral&#x201D;). They were then asked to rate the emotional intensity of the face on a 9-point Likert scale ranging from 0 (<italic>no emotion</italic>) to 8 (<italic>very strong emotion</italic>) using the alphanumeric keys. The image disappeared after the participant pressed the button, followed by a blank screen for 1,000 ms. The participants were given a 30-s break after completing 50 trials before they continued the experiment.</p>
<p>After the assessment data were collected, the recognition accuracy and rated intensity of the happy expressions were calculated for each participant. Recognition accuracy denotes the proportion of photos classified as the happy expression type out of the total number of photos; happy emotional intensity denotes the mean of the emotional intensity scores across all photos. Normality and homogeneity-of-variance tests were conducted on the recognition accuracy and emotional intensity for the TFEID images as well as the newly collected images. The data were normally distributed (Kolmogorov&#x2013;Smirnov: <italic>p</italic> &#x003E; 0.05) and showed homogeneous variance (Levene&#x2019;s statistic: <italic>p</italic> &#x003E; 0.05). SPSS 24.0 (IBM, 2018) was used to perform paired <italic>t</italic>-tests, within the same participants (<italic>N</italic> = 30), on the mean recognition accuracy and intensity of happy expressions across image sources and across facial gender. No significant differences emerged for image source (i.e., the TFEID facial expressions vs. those photographed by the researchers&#x2019; lab; as shown in <xref rid="tab1" ref-type="table">Table 1</xref>) or facial gender (as shown in <xref rid="tab2" ref-type="table">Table 2</xref>), indicating that the images taken by the authors&#x2019; lab were equivalent to the TFEID images.</p>
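<p>The paired comparison above can be sketched as follows. This is a minimal illustration with simulated ratings; the variable names and simulated values are hypothetical stand-ins for the study&#x2019;s per-participant means, not the actual data.</p>

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant mean recognition accuracy (N = 30) for the
# same participants rating TFEID photos vs. photos taken by the lab.
acc_tfeid = rng.normal(0.94, 0.09, 30).clip(0, 1)
acc_lab = (acc_tfeid + rng.normal(-0.01, 0.02, 30)).clip(0, 1)

# Paired t-test across the two image sources (df = N - 1 = 29).
t, p = stats.ttest_rel(acc_tfeid, acc_lab)

# Cohen's d for paired samples: mean difference / SD of the differences.
diff = acc_tfeid - acc_lab
d = diff.mean() / diff.std(ddof=1)

print(f"t(29) = {t:.2f}, p = {p:.3f}, d = {d:.2f}")
```

A non-significant <italic>p</italic> here is what licenses treating the two image sources as equivalent stimulus pools.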
<table-wrap position="float" id="tab1">
<label>Table 1</label>
<caption><p>Evaluation scores of stimuli from different sources (M &#x00B1; SD).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center" valign="top"/>
<th align="center" valign="top">Taiwan emotional faces</th>
<th align="center" valign="top">Emotional faces photographed</th>
<th align="center" valign="top"><italic>df</italic></th>
<th align="center" valign="top"><italic>t</italic></th>
<th align="center" valign="top"><italic>p</italic></th>
<th align="center" valign="top">Cohen&#x2019;s <italic>d</italic></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Recognition accuracy</td>
<td align="left" valign="top">0.94 &#x00B1; 0.09</td>
<td align="left" valign="top">0.93 &#x00B1; 0.10</td>
<td align="center" valign="top">29</td>
<td align="left" valign="top">1.96</td>
<td align="left" valign="top">0.060</td>
<td align="left" valign="top">0.36</td>
</tr>
<tr>
<td align="left" valign="top">Intensity</td>
<td align="left" valign="top">3.96 &#x00B1; 0.89</td>
<td align="left" valign="top">3.88 &#x00B1; 0.91</td>
<td align="center" valign="top">29</td>
<td align="left" valign="top">1.95</td>
<td align="left" valign="top">0.061</td>
<td align="left" valign="top">0.36</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>p &#x003C; 0.05 was considered statistically significant</italic>.</p>
</table-wrap-foot>
</table-wrap>
<table-wrap position="float" id="tab2">
<label>Table 2</label>
<caption><p>Evaluation scores of stimuli from different facial gender (M &#x00B1; SD).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="center" valign="top"/>
<th align="center" valign="top">Male faces</th>
<th align="center" valign="top">Female faces</th>
<th align="center" valign="top"><italic>df</italic></th>
<th align="center" valign="top"><italic>t</italic></th>
<th align="center" valign="top"><italic>p</italic></th>
<th align="center" valign="top">Cohen&#x2019;s <italic>d</italic></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Recognition accuracy</td>
<td align="left" valign="top">0.93 &#x00B1; 0.10</td>
<td align="left" valign="top">0.95 &#x00B1; 0.09</td>
<td align="center" valign="top">29</td>
<td align="left" valign="top">&#x2212;1.69</td>
<td align="left" valign="top">0.101</td>
<td align="left" valign="top">0.31</td>
</tr>
<tr>
<td align="left" valign="top">Intensity</td>
<td align="left" valign="top">3.94 &#x00B1; 0.87</td>
<td align="left" valign="top">3.90 &#x00B1; 0.92</td>
<td align="center" valign="top">29</td>
<td align="left" valign="top">0.99</td>
<td align="left" valign="top">0.332</td>
<td align="left" valign="top">0.18</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>p &#x003C; 0.05 was considered statistically significant</italic>.</p>
</table-wrap-foot>
</table-wrap>
<p>Additionally, the 132 face photos of the 66 people (one photo with a high-intensity smile and one with a low-intensity smile per person) were divided into two equal groups. Within each group, no person appeared in more than one photo, so participants never viewed two photos of the same person. These two groups of face photos were used separately to compile two versions of the trait rating program, to prevent identity information from interfering with the trait ratings. Version 1 thus included the high-intensity happy photos of 19 people from the TFEID and the high-intensity happy photos of 14 people photographed by the lab; the low-intensity happy photos of these 33 people were assigned to Version 2. The remaining photos were distributed between Versions 1 and 2 in the same way.</p>
<p>After the photos were divided into two versions, paired <italic>t</italic>-tests were conducted on the mean recognition accuracy and intensity of the happy expressions for the two versions. Smile intensity differed significantly between the high- and low-intensity faces within each version [Version 1: (5.33 &#x00B1; 1.01) vs. (2.47 &#x00B1; 0.93), <italic>t</italic>(29) = 20.88, <italic>p</italic> &#x003C; 0.001, Cohen&#x2019;s <italic>d</italic> = 3.81; Version 2: (5.36 &#x00B1; 1.01) vs. (2.55 &#x00B1; 0.94), <italic>t</italic>(29) = 19.67, <italic>p</italic> &#x003C; 0.001, Cohen&#x2019;s <italic>d</italic> = 3.59]. The two versions were well matched: smile intensity did not differ significantly between the high-intensity faces across the two versions [(5.33 &#x00B1; 1.01) vs. (5.36 &#x00B1; 1.01), <italic>t</italic>(29) = &#x2212;0.66, <italic>p</italic> = 0.516, Cohen&#x2019;s <italic>d</italic> = 0.12], nor between the low-intensity faces [(2.47 &#x00B1; 0.93) vs. (2.55 &#x00B1; 0.94), <italic>t</italic>(29) = &#x2212;1.69, <italic>p</italic> = 0.102, Cohen&#x2019;s <italic>d</italic> = 0.31].</p>
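<p>The counterbalancing logic of the two program versions can be sketched as follows. The identity indices and the 33/33 split point below are hypothetical placeholders for illustration; only the invariants (one photo per identity per version, intensity swapped across versions) reflect the design described above.</p>

```python
# Hypothetical counterbalanced split: each of 66 identities contributes one
# high- and one low-intensity smile photo, and each program version receives
# exactly one photo per identity, with the intensity swapped across versions.
ids = range(66)
version1 = {i: ("high" if i < 33 else "low") for i in ids}
version2 = {i: ("low" if i < 33 else "high") for i in ids}

# Design invariants: no identity repeats within a version, the two versions
# assign opposite intensities to every identity, and each version contains
# equal numbers of high- and low-intensity photos.
assert all(version1[i] != version2[i] for i in ids)
assert sum(v == "high" for v in version1.values()) == 33
print("counterbalance holds for", len(version1), "identities")
```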
</sec>
</sec>
<sec id="sec10">
<title>Trait Assessment Task Procedure</title>
<p>The participants were tested in a quiet, comfortable laboratory with good sound insulation. They were introduced to the trait assessment task and informed that their task would be to rate a series of face photos using a list of trait adjectives. As previously described, two groups of face photos were used to create two equivalent versions of the trait rating program, and one of the two versions was randomly selected for each participant. Each version comprised eight trials in the practice stage and 726 trials in the formal experimental stage. The practice stimuli were selected from the Compound Facial Expressions of Emotion (CFEE) Database (<xref ref-type="bibr" rid="ref15">Du et al., 2014</xref>): eight randomly selected Asian faces, four with neutral expressions and four with happy expressions. The stimuli in the formal experimental stage were the face photos of the 66 people (varying in expression intensity and gender). The 726 trials comprised 11 blocks; in each block, the face photos of the 66 people were presented, so each photo was presented 11 times in total. Each block was assigned one of the 11 traits, namely trustworthiness, responsibility, attractiveness, sociability, confidence, intelligence, aggressiveness, dominance, competence, warmth, and tenacity, which participants rated on a 7-point Likert scale ranging from 1 (<italic>not very &#x00D7;&#x00D7;</italic>) to 7 (<italic>very &#x00D7;&#x00D7;</italic>).</p>
<p>The blocks were presented in a random order across participants. After completing each block, the participants rested for at least 60 s before proceeding to the next block. The entire experiment lasted approximately 1 h.</p>
</sec>
</sec>
<sec id="sec11" sec-type="results">
<title>Results</title>
<p>The present study used a 2 (expression intensity: high and low) &#x00D7; 2 (face gender: male and female) within-subjects design; the dependent variables were the evaluation scores for 11 traits: trustworthiness, responsibility, sociability, attractiveness, confidence, intelligence, aggressiveness, dominance, competence, warmth, and tenacity. SPSS 24.0 (IBM, 2018) was used for data processing and analysis. First, Cronbach&#x2019;s <italic>&#x03B1;</italic> coefficients were calculated to test the stability and consistency of the different participants&#x2019; evaluations of each trait; all alphas were above 0.77 (as shown in <xref rid="tab3" ref-type="table">Table 3</xref>). This indicated that the evaluation scores for the 11 traits had good internal consistency, even though participants were judging natural photographs of differing intensities (<xref ref-type="bibr" rid="ref45">Nunnally, 1978</xref>).</p>
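<p>Cronbach&#x2019;s <italic>&#x03B1;</italic> here treats the raters as &#x201C;items&#x201D; and the faces as observations. A minimal sketch, assuming simulated ratings (the matrix dimensions and noise levels below are hypothetical, not the study&#x2019;s data):</p>

```python
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """Cronbach's alpha for a faces x raters matrix (raters as 'items')."""
    k = ratings.shape[1]
    item_var = ratings.var(axis=0, ddof=1).sum()   # sum of per-rater variances
    total_var = ratings.sum(axis=1).var(ddof=1)    # variance of summed scores
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(1)
true_level = rng.normal(4, 1, 66)                          # per-face trait level
ratings = true_level[:, None] + rng.normal(0, 0.8, (66, 30))  # 30 noisy raters

print(round(cronbach_alpha(ratings), 2))
```

Raters who track the same underlying trait level produce an alpha near 1; values above roughly 0.7&#x2013;0.8 are conventionally read as acceptable internal consistency.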
<table-wrap position="float" id="tab3">
<label>Table 3</label>
<caption><p>The Cronbach alphas of 11 trait rating scores.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top">Type of traits</th>
<th align="center" valign="top">High-intensity happy</th>
<th align="center" valign="top">Low-intensity happy</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="middle">Trustworthiness</td>
<td align="left" valign="middle">0.88</td>
<td align="left" valign="middle">0.82</td>
</tr>
<tr>
<td align="left" valign="middle">Responsibility</td>
<td align="left" valign="middle">0.91</td>
<td align="left" valign="middle">0.80</td>
</tr>
<tr>
<td align="left" valign="middle">Sociability</td>
<td align="left" valign="middle">0.88</td>
<td align="left" valign="middle">0.77</td>
</tr>
<tr>
<td align="left" valign="middle">Attractiveness</td>
<td align="left" valign="middle">0.84</td>
<td align="left" valign="middle">0.82</td>
</tr>
<tr>
<td align="left" valign="middle">Confidence</td>
<td align="left" valign="middle">0.87</td>
<td align="left" valign="middle">0.85</td>
</tr>
<tr>
<td align="left" valign="middle">Intelligence</td>
<td align="left" valign="middle">0.88</td>
<td align="left" valign="middle">0.82</td>
</tr>
<tr>
<td align="left" valign="middle">Aggressiveness</td>
<td align="left" valign="middle">0.93</td>
<td align="left" valign="middle">0.88</td>
</tr>
<tr>
<td align="left" valign="middle">Dominance</td>
<td align="left" valign="middle">0.91</td>
<td align="left" valign="middle">0.87</td>
</tr>
<tr>
<td align="left" valign="middle">Competence</td>
<td align="left" valign="middle">0.83</td>
<td align="left" valign="middle">0.80</td>
</tr>
<tr>
<td align="left" valign="middle">Warmth</td>
<td align="left" valign="middle">0.93</td>
<td align="left" valign="middle">0.87</td>
</tr>
<tr>
<td align="left" valign="middle">Tenacity</td>
<td align="left" valign="middle">0.86</td>
<td align="left" valign="middle">0.87</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Second, to verify that the data satisfied the assumptions of the ANOVA, normality and homogeneity-of-variance tests were conducted on the rating scores for each trait. The data were normally distributed (Kolmogorov&#x2013;Smirnov: <italic>p</italic> &#x003E; 0.05) and passed Bartlett&#x2019;s test of sphericity, suggesting that the ANOVA assumptions were met. The rating scores for each trait were therefore analyzed separately in a 2 (happy expression intensity: high or low) &#x00D7; 2 (facial gender: male or female) repeated measures ANOVA. Because the study included many dependent variables, multiple comparisons could inflate the probability of a type I error. To control this, the significance threshold was adjusted with the Bonferroni correction, <italic>&#x03B1;</italic>&#x2032; = <italic>&#x03B1;</italic>/<italic>k</italic> = 0.05/11 &#x2248; 0.0045, so differences were considered statistically significant at <italic>p</italic> &#x003C; 0.0045 (as shown in <xref rid="tab4" ref-type="table">Tables 4</xref>&#x2013;<xref rid="tab6" ref-type="table">6</xref>; <xref ref-type="bibr" rid="ref50">Rezlescu et al., 2015</xref>).</p>
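<p>The 2 &#x00D7; 2 repeated measures ANOVA with a Bonferroni-adjusted threshold can be sketched as follows. This is an illustration with simulated scores for a single trait; the effect size, noise level, and data frame layout are hypothetical assumptions, and <code>AnovaRM</code> is used here as a stand-in for the SPSS procedure reported above.</p>

```python
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(2)
rows = []
for subj in range(30):                       # hypothetical N = 30 raters
    for intensity in ("high", "low"):
        for gender in ("male", "female"):
            # Simulated trait score: a small intensity effect plus noise.
            score = 4 + (0.4 if intensity == "high" else 0) + rng.normal(0, 0.5)
            rows.append({"subject": subj, "intensity": intensity,
                         "gender": gender, "score": score})
df = pd.DataFrame(rows)

# Balanced within-subjects design: one observation per subject per cell.
res = AnovaRM(df, depvar="score", subject="subject",
              within=["intensity", "gender"]).fit()

# Bonferroni correction over the 11 trait scales analyzed separately.
bonferroni_alpha = 0.05 / 11                 # ~0.0045
print(res.anova_table)
```

Each of the 11 traits would be run through the same model, with every resulting <italic>p</italic>-value compared against <code>bonferroni_alpha</code> rather than 0.05.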
<table-wrap position="float" id="tab4">
<label>Table 4</label>
<caption><p>Evaluation scores for the different social traits under high- and low-intensity happy expressions (M &#x00B1; SD).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top" rowspan="2">Type of traits</th>
<th align="center" valign="top" colspan="2">Happy expression intensity</th>
<th align="center" valign="top" rowspan="2"><italic>F</italic></th>
<th align="center" valign="top" rowspan="2"><italic>p</italic></th>
<th align="center" valign="top" rowspan="2"><italic>&#x03B7;<sub>p</sub></italic><sup>2</sup></th>
</tr>
<tr>
<th align="center" valign="top">High</th>
<th align="center" valign="top">Low</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Trustworthiness</td>
<td align="left" valign="top">4.54 &#x00B1; 0.63</td>
<td align="left" valign="top">4.39 &#x00B1; 0.51</td>
<td align="left" valign="top">1.28</td>
<td align="left" valign="top">0.2670</td>
<td align="left" valign="top">0.04</td>
</tr>
<tr>
<td align="left" valign="top">Responsibility</td>
<td align="left" valign="top">4.44 &#x00B1; 0.74</td>
<td align="left" valign="top">4.83 &#x00B1; 0.46</td>
<td align="left" valign="top">8.67</td>
<td align="left" valign="top">0.0060</td>
<td align="left" valign="top">0.22</td>
</tr>
<tr>
<td align="left" valign="top">Sociability</td>
<td align="left" valign="top">4.78 &#x00B1; 0.64</td>
<td align="left" valign="top">4.06 &#x00B1; 0.45</td>
<td align="left" valign="top">39.61</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.56</td>
</tr>
<tr>
<td align="left" valign="top">Attractiveness</td>
<td align="left" valign="top">3.70 &#x00B1; 0.88</td>
<td align="left" valign="top">3.97 &#x00B1; 0.64</td>
<td align="left" valign="top">3.97</td>
<td align="left" valign="top">0.0550</td>
<td align="left" valign="top">0.11</td>
</tr>
<tr>
<td align="left" valign="top">Confidence</td>
<td align="left" valign="top">4.83 &#x00B1; 0.58</td>
<td align="left" valign="top">4.46 &#x00B1; 0.55</td>
<td align="left" valign="top">8.52</td>
<td align="left" valign="top">0.0060</td>
<td align="left" valign="top">0.22</td>
</tr>
<tr>
<td align="left" valign="top">Intelligence</td>
<td align="left" valign="top">4.01 &#x00B1; 0.66</td>
<td align="left" valign="top">4.45 &#x00B1; 0.52</td>
<td align="left" valign="top">19.63</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.39</td>
</tr>
<tr>
<td align="left" valign="top">Aggressiveness</td>
<td align="left" valign="top">2.99 &#x00B1; 0.82</td>
<td align="left" valign="top">3.69 &#x00B1; 0.68</td>
<td align="left" valign="top">23.23</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.43</td>
</tr>
<tr>
<td align="left" valign="top">Dominance</td>
<td align="left" valign="top">3.40 &#x00B1; 0.71</td>
<td align="left" valign="top">4.18 &#x00B1; 0.68</td>
<td align="left" valign="top">21.53</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.41</td>
</tr>
<tr>
<td align="left" valign="top">Competence</td>
<td align="left" valign="top">4.21 &#x00B1; 0.53</td>
<td align="left" valign="top">4.60 &#x00B1; 0.47</td>
<td align="left" valign="top">14.76</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.32</td>
</tr>
<tr>
<td align="left" valign="top">Warmth</td>
<td align="left" valign="top">4.75 &#x00B1; 0.80</td>
<td align="left" valign="top">4.05 &#x00B1; 0.59</td>
<td align="left" valign="top">30.64</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.50</td>
</tr>
<tr>
<td align="left" valign="top">Tenacity</td>
<td align="left" valign="top">4.53 &#x00B1; 0.61</td>
<td align="left" valign="top">4.71 &#x00B1; 0.61</td>
<td align="left" valign="top">1.43</td>
<td align="left" valign="top">0.2410</td>
<td align="left" valign="top">0.04</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>p &#x003C; 0.0045 was considered statistically significant</italic>.</p>
</table-wrap-foot>
</table-wrap>
<table-wrap position="float" id="tab5">
<label>Table 5</label>
<caption><p>Evaluation scores for the different social traits under facial gender (M &#x00B1; SD).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top" rowspan="2">Type of traits</th>
<th align="center" valign="top" colspan="2">Facial gender</th>
<th align="center" valign="top" rowspan="2"><italic>F</italic></th>
<th align="center" valign="top" rowspan="2"><italic>p</italic></th>
<th align="center" valign="top" rowspan="2"><italic>&#x03B7;<sub>p</sub></italic><sup>2</sup></th>
</tr>
<tr>
<th align="center" valign="top">Male</th>
<th align="center" valign="top">Female</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Trustworthiness</td>
<td align="left" valign="top">4.13 &#x00B1; 0.55</td>
<td align="left" valign="top">4.80 &#x00B1; 0.50</td>
<td align="left" valign="top">36.15</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.54</td>
</tr>
<tr>
<td align="left" valign="top">Responsibility</td>
<td align="left" valign="top">4.39 &#x00B1; 0.53</td>
<td align="left" valign="top">4.88 &#x00B1; 0.56</td>
<td align="left" valign="top">27.83</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.47</td>
</tr>
<tr>
<td align="left" valign="top">Sociability</td>
<td align="left" valign="top">4.48 &#x00B1; 0.40</td>
<td align="left" valign="top">4.37 &#x00B1; 0.60</td>
<td align="left" valign="top">1.50</td>
<td align="left" valign="top">0.2290</td>
<td align="left" valign="top">0.05</td>
</tr>
<tr>
<td align="left" valign="top">Attractiveness</td>
<td align="left" valign="top">3.62 &#x00B1; 0.72</td>
<td align="left" valign="top">4.05 &#x00B1; 0.70</td>
<td align="left" valign="top">20.91</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.40</td>
</tr>
<tr>
<td align="left" valign="top">Confidence</td>
<td align="left" valign="top">4.68 &#x00B1; 0.43</td>
<td align="left" valign="top">4.61 &#x00B1; 0.53</td>
<td align="left" valign="top">0.67</td>
<td align="left" valign="top">0.4180</td>
<td align="left" valign="top">0.02</td>
</tr>
<tr>
<td align="left" valign="top">Intelligence</td>
<td align="left" valign="top">4.18 &#x00B1; 0.60</td>
<td align="left" valign="top">4.28 &#x00B1; 0.55</td>
<td align="left" valign="top">1.22</td>
<td align="left" valign="top">0.2770</td>
<td align="left" valign="top">0.04</td>
</tr>
<tr>
<td align="left" valign="top">Aggressiveness</td>
<td align="left" valign="top">3.83 &#x00B1; 0.79</td>
<td align="left" valign="top">2.85 &#x00B1; 0.71</td>
<td align="left" valign="top">46.03</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.60</td>
</tr>
<tr>
<td align="left" valign="top">Dominance</td>
<td align="left" valign="top">4.09 &#x00B1; 0.62</td>
<td align="left" valign="top">3.49 &#x00B1; 0.53</td>
<td align="left" valign="top">36.33</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.54</td>
</tr>
<tr>
<td align="left" valign="top">Competence</td>
<td align="left" valign="top">4.26 &#x00B1; 0.50</td>
<td align="left" valign="top">4.56 &#x00B1; 0.46</td>
<td align="left" valign="top">12.04</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.28</td>
</tr>
<tr>
<td align="left" valign="top">Warmth</td>
<td align="left" valign="top">4.10 &#x00B1; 0.69</td>
<td align="left" valign="top">4.70 &#x00B1; 0.60</td>
<td align="left" valign="top">47.92</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">0.61</td>
</tr>
<tr>
<td align="left" valign="top">Tenacity</td>
<td align="left" valign="top">4.49 &#x00B1; 0.52</td>
<td align="left" valign="top">4.75 &#x00B1; 0.66</td>
<td align="left" valign="top">3.64</td>
<td align="left" valign="top">0.0660</td>
<td align="left" valign="top">0.11</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>p &#x003C; 0.0045 was considered statistically significant</italic>.</p>
</table-wrap-foot>
</table-wrap>
<table-wrap position="float" id="tab6">
<label>Table 6</label>
<caption><p>Evaluation scores for the different social traits under happy expression intensity and facial gender (M &#x00B1; SD).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top" rowspan="3">Type of traits</th>
<th align="center" valign="top" colspan="4">Happy expression intensity</th>
<th align="center" valign="top" rowspan="3"><italic>F</italic></th>
<th align="center" valign="top" rowspan="3"><italic>p</italic></th>
<th align="center" valign="top" rowspan="3"><italic>&#x03B7;<sub>p</sub></italic><sup>2</sup></th>
</tr>
<tr>
<th align="center" valign="top" colspan="2">High</th>
<th align="center" valign="top" colspan="2">Low</th>
</tr>
<tr>
<th align="center" valign="top">Male</th>
<th align="center" valign="top">Female</th>
<th align="center" valign="top">Male</th>
<th align="center" valign="top">Female</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Trustworthiness</td>
<td align="left" valign="top">4.21 &#x00B1; 0.78</td>
<td align="left" valign="top">4.88 &#x00B1; 0.65</td>
<td align="left" valign="top">4.50 &#x00B1; 0.65</td>
<td align="left" valign="top">4.73 &#x00B1; 0.61</td>
<td align="left" valign="top">0.03</td>
<td align="left" valign="top">0.8740</td>
<td align="left" valign="top">0.001</td>
</tr>
<tr>
<td align="left" valign="top">Responsibility</td>
<td align="left" valign="top">4.16 &#x00B1; 0.75</td>
<td align="left" valign="top">4.72 &#x00B1; 0.84</td>
<td align="left" valign="top">4.63 &#x00B1; 0.55</td>
<td align="left" valign="top">5.03 &#x00B1; 0.56</td>
<td align="left" valign="top">2.42</td>
<td align="left" valign="top">0.1300</td>
<td align="left" valign="top">0.072</td>
</tr>
<tr>
<td align="left" valign="top">Sociability</td>
<td align="left" valign="top">4.83 &#x00B1; 0.59</td>
<td align="left" valign="top">4.74 &#x00B1; 0.77</td>
<td align="left" valign="top">4.12 &#x00B1; 0.49</td>
<td align="left" valign="top">4.00 &#x00B1; 0.59</td>
<td align="left" valign="top">0.03</td>
<td align="left" valign="top">0.8600</td>
<td align="left" valign="top">0.001</td>
</tr>
<tr>
<td align="left" valign="top">Attractiveness</td>
<td align="left" valign="top">3.46 &#x00B1; 0.92</td>
<td align="left" valign="top">3.94 &#x00B1; 0.96</td>
<td align="left" valign="top">3.78 &#x00B1; 0.73</td>
<td align="left" valign="top">4.16 &#x00B1; 0.66</td>
<td align="left" valign="top">0.77</td>
<td align="left" valign="top">0.3870</td>
<td align="left" valign="top">0.024</td>
</tr>
<tr>
<td align="left" valign="top">Confidence</td>
<td align="left" valign="top">4.85 &#x00B1; 0.65</td>
<td align="left" valign="top">4.82 &#x00B1; 0.63</td>
<td align="left" valign="top">4.51 &#x00B1; 0.52</td>
<td align="left" valign="top">4.41 &#x00B1; 0.66</td>
<td align="left" valign="top">0.70</td>
<td align="left" valign="top">0.4080</td>
<td align="left" valign="top">0.022</td>
</tr>
<tr>
<td align="left" valign="top">Intelligence</td>
<td align="left" valign="top">3.93 &#x00B1; 0.76</td>
<td align="left" valign="top">4.08 &#x00B1; 0.66</td>
<td align="left" valign="top">4.43 &#x00B1; 0.61</td>
<td align="left" valign="top">4.47 &#x00B1; 0.60</td>
<td align="left" valign="top">1.03</td>
<td align="left" valign="top">0.3180</td>
<td align="left" valign="top">0.032</td>
</tr>
<tr>
<td align="left" valign="top">Aggressiveness</td>
<td align="left" valign="top">3.44 &#x00B1; 1.04</td>
<td align="left" valign="top">2.55 &#x00B1; 0.83</td>
<td align="left" valign="top">4.22 &#x00B1; 0.78</td>
<td align="left" valign="top">3.16 &#x00B1; 0.84</td>
<td align="left" valign="top">1.23</td>
<td align="left" valign="top">0.2770</td>
<td align="left" valign="top">0.038</td>
</tr>
<tr>
<td align="left" valign="top">Dominance</td>
<td align="left" valign="top">3.68 &#x00B1; 0.89</td>
<td align="left" valign="top">3.12 &#x00B1; 0.67</td>
<td align="left" valign="top">4.49 &#x00B1; 0.80</td>
<td align="left" valign="top">3.86 &#x00B1; 0.71</td>
<td align="left" valign="top">0.24</td>
<td align="left" valign="top">0.6300</td>
<td align="left" valign="top">0.008</td>
</tr>
<tr>
<td align="left" valign="top">Competence</td>
<td align="left" valign="top">4.01 &#x00B1; 0.60</td>
<td align="left" valign="top">4.42 &#x00B1; 0.56</td>
<td align="left" valign="top">4.50 &#x00B1; 0.57</td>
<td align="left" valign="top">4.70 &#x00B1; 0.58</td>
<td align="left" valign="top">4.07</td>
<td align="left" valign="top">0.0530</td>
<td align="left" valign="top">0.116</td>
</tr>
<tr>
<td align="left" valign="top">Warmth</td>
<td align="left" valign="top">4.45 &#x00B1; 0.86</td>
<td align="left" valign="top">5.06 &#x00B1; 0.82</td>
<td align="left" valign="top">3.76 &#x00B1; 0.71</td>
<td align="left" valign="top">4.34 &#x00B1; 0.61</td>
<td align="left" valign="top">0.07</td>
<td align="left" valign="top">0.7980</td>
<td align="left" valign="top">0.002</td>
</tr>
<tr>
<td align="left" valign="top">Tenacity</td>
<td align="left" valign="top">4.38 &#x00B1; 0.70</td>
<td align="left" valign="top">4.69 &#x00B1; 0.81</td>
<td align="left" valign="top">4.60 &#x00B1; 0.66</td>
<td align="left" valign="top">4.81 &#x00B1; 0.78</td>
<td align="left" valign="top">1.08</td>
<td align="left" valign="top">0.3080</td>
<td align="left" valign="top">0.034</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>p &#x003C; 0.0045 was considered statistically significant</italic>.</p>
</table-wrap-foot>
</table-wrap>
<p>For the judgments of trustworthiness, responsibility, and attractiveness, the main effects of happy expression intensity were not significant, nor was the interaction between happy expression intensity and facial gender. However, the main effects of facial gender were significant: female faces were rated higher than male faces on trustworthiness, responsibility, and attractiveness.</p>
<p>For sociability and intelligence, the main effects of happy expression intensity were significant: sociability ratings were higher for high-intensity than for low-intensity happy faces, whereas intelligence ratings were lower for high-intensity than for low-intensity happy faces. The main effects of facial gender and the interaction between happy expression intensity and facial gender were not significant.</p>
<p>For warmth, aggressiveness, dominance, and competence, the main effects of both happy expression intensity and facial gender were significant. For expression intensity, warmth ratings were higher for high-intensity than for low-intensity happy faces, whereas ratings of aggressiveness, dominance, and competence were lower for high-intensity than for low-intensity happy faces. For facial gender, female faces were rated higher than male faces on warmth and competence but lower on aggressiveness and dominance. The interaction between happy expression intensity and facial gender was not significant.</p>
<p>For confidence and tenacity, neither the main effects of happy expression intensity and facial gender nor their interaction was significant.</p>
<p>Further, in addition to the relative comparisons between low- and high-intensity happy facial expressions, one-sample <italic>t</italic>-tests were conducted against the scale midpoint (4) for each trait (as shown in <xref rid="tab7" ref-type="table">Tables 7</xref> and <xref rid="tab8" ref-type="table">8</xref>).</p>
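<p>A one-sample <italic>t</italic>-test against the midpoint asks whether a trait rating differs from the neutral point of the 7-point scale, not merely between conditions. A minimal sketch with simulated ratings (the trait, mean, and SD below are hypothetical placeholders, not the tabled values):</p>

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
midpoint = 4.0                                # midpoint of the 7-point scale

# Hypothetical per-participant mean ratings for one trait (N = 30).
ratings = rng.normal(4.7, 0.8, 30)

t, p = stats.ttest_1samp(ratings, midpoint)   # df = N - 1 = 29
d = (ratings.mean() - midpoint) / ratings.std(ddof=1)  # Cohen's d vs. midpoint

print(f"t(29) = {t:.2f}, p = {p:.4f}, d = {d:.2f}")
```

A positive significant <italic>t</italic> indicates the trait was rated above the neutral midpoint in absolute terms; a negative one, below it.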
<table-wrap position="float" id="tab7">
<label>Table 7</label>
<caption><p>Evaluation scores for the traits under high-intensity happy expression and scale midpoints (M &#x00B1; SD).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top" rowspan="2">Type of traits</th>
<th align="center" valign="top" colspan="2">Facial expression intensity</th>
<th align="center" valign="top" rowspan="2"><italic>t</italic></th>
<th align="center" valign="top" rowspan="2"><italic>p</italic></th>
<th align="center" valign="top" rowspan="2">Cohen&#x2019;s <italic>d</italic></th>
</tr>
<tr>
<th align="center" valign="top">High</th>
<th align="center" valign="top">The scale midpoints</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Trustworthiness</td>
<td align="left" valign="top">4.54 &#x00B1; 0.63</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">4.91</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">1.76</td>
</tr>
<tr>
<td align="left" valign="top">Responsibility</td>
<td align="left" valign="top">4.44 &#x00B1; 0.74</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">3.37</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">1.21</td>
</tr>
<tr>
<td align="left" valign="top">Sociability</td>
<td align="left" valign="top">4.78 &#x00B1; 0.64</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">6.69</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">2.40</td>
</tr>
<tr>
<td align="left" valign="top">Attractiveness</td>
<td align="left" valign="top">3.70 &#x00B1; 0.88</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">&#x2212;1.95</td>
<td align="left" valign="top">0.0610</td>
<td align="left" valign="top">0.70</td>
</tr>
<tr>
<td align="left" valign="top">Confidence</td>
<td align="left" valign="top">4.83 &#x00B1; 0.58</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">8.15</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">2.93</td>
</tr>
<tr>
<td align="left" valign="top">Intelligence</td>
<td align="left" valign="top">4.01 &#x00B1; 0.66</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">0.05</td>
<td align="left" valign="top">0.9580</td>
<td align="left" valign="top">0.02</td>
</tr>
<tr>
<td align="left" valign="top">Aggressiveness</td>
<td align="left" valign="top">2.99 &#x00B1; 0.82</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">&#x2212;6.96</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">2.50</td>
</tr>
<tr>
<td align="left" valign="top">Dominance</td>
<td align="left" valign="top">3.39 &#x00B1; 0.71</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">&#x2212;4.78</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">1.72</td>
</tr>
<tr>
<td align="left" valign="top">Competence</td>
<td align="left" valign="top">4.21 &#x00B1; 0.53</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">2.29</td>
<td align="left" valign="top">0.0290</td>
<td align="left" valign="top">0.82</td>
</tr>
<tr>
<td align="left" valign="top">Warmth</td>
<td align="left" valign="top">4.75 &#x00B1; 0.80</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">5.34</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">1.92</td>
</tr>
<tr>
<td align="left" valign="top">Tenacity</td>
<td align="left" valign="top">4.53 &#x00B1; 0.61</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">4.94</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">1.77</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>p &#x003C; 0.0045 was considered statistically significant</italic>.</p>
</table-wrap-foot>
</table-wrap>
<table-wrap position="float" id="tab8">
<label>Table 8</label>
<caption><p>Evaluation scores for the traits under low-intensity happy expression and scale midpoints (M &#x00B1; SD).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top" rowspan="2">Type of traits</th>
<th align="center" valign="top" colspan="2">Facial expressions intensity</th>
<th align="center" valign="top" rowspan="2"><italic>t</italic></th>
<th align="center" valign="top" rowspan="2"><italic>p</italic></th>
<th align="center" valign="top" rowspan="2">Cohen&#x2019;s <italic>d</italic></th>
</tr>
<tr>
<th align="center" valign="top">Low</th>
<th align="center" valign="top">The scale midpoints</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Trustworthiness</td>
<td align="left" valign="top">4.39 &#x00B1; 0.51</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">4.26</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">1.53</td>
</tr>
<tr>
<td align="left" valign="top">Responsibility</td>
<td align="left" valign="top">4.83 &#x00B1; 0.46</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">10.32</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">3.70</td>
</tr>
<tr>
<td align="left" valign="top">Sociability</td>
<td align="left" valign="top">4.06 &#x00B1; 0.45</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">0.82</td>
<td align="left" valign="top">0.421</td>
<td align="left" valign="top">0.29</td>
</tr>
<tr>
<td align="left" valign="top">Attractiveness</td>
<td align="left" valign="top">3.97 &#x00B1; 0.64</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">&#x2212;0.23</td>
<td align="left" valign="top">0.818</td>
<td align="left" valign="top">0.08</td>
</tr>
<tr>
<td align="left" valign="top">Confidence</td>
<td align="left" valign="top">4.46 &#x00B1; 0.55</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">4.71</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">1.69</td>
</tr>
<tr>
<td align="left" valign="top">Intelligence</td>
<td align="left" valign="top">4.45 &#x00B1; 0.52</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">4.89</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">1.76</td>
</tr>
<tr>
<td align="left" valign="top">Aggressiveness</td>
<td align="left" valign="top">3.69 &#x00B1; 0.68</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">&#x2212;2.61</td>
<td align="left" valign="top">0.014</td>
<td align="left" valign="top">0.94</td>
</tr>
<tr>
<td align="left" valign="top">Dominance</td>
<td align="left" valign="top">4.18 &#x00B1; 0.68</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">1.48</td>
<td align="left" valign="top">0.150</td>
<td align="left" valign="top">0.53</td>
</tr>
<tr>
<td align="left" valign="top">Competence</td>
<td align="left" valign="top">4.60 &#x00B1; 0.47</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">7.18</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">2.58</td>
</tr>
<tr>
<td align="left" valign="top">Warmth</td>
<td align="left" valign="top">4.05 &#x00B1; 0.59</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">0.46</td>
<td align="left" valign="top">0.646</td>
<td align="left" valign="top">0.17</td>
</tr>
<tr>
<td align="left" valign="top">Tenacity</td>
<td align="left" valign="top">4.71 &#x00B1; 0.61</td>
<td align="left" valign="top">4.00 &#x00B1; 0.00</td>
<td align="left" valign="top">6.54</td>
<td align="left" valign="top">&#x003C;0.0045</td>
<td align="left" valign="top">2.35</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>p &#x003C; 0.0045 was considered statistically significant</italic>.</p>
</table-wrap-foot>
</table-wrap>
<p>Finally, normality and homogeneity of variance tests were conducted on the rating scores of each trait in both versions. The results indicated that the data were normally distributed (Kolmogorov&#x2013;Smirnov: <italic>p</italic> &#x003E; 0.05) and that variances were homogeneous (Levene&#x2019;s statistic: <italic>p</italic> &#x003E; 0.05). SPSS 24.0 (IBM, 2018) was used to perform independent-samples <italic>t</italic>-tests on the rating scores of each trait between the two versions. The results showed no significant differences between Version 1 and Version 2 (as shown in <xref rid="tab9" ref-type="table">Table 9</xref>), indicating that the evaluation scores of the traits in the two versions were generally homogeneous.</p>
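The screening pipeline described above (normality check, homogeneity check, then an independent-samples <italic>t</italic>-test with a Bonferroni-corrected threshold of 0.05/11 &#x2248; 0.0045) can be sketched as follows. This is a minimal illustration using SciPy and synthetic ratings, not the authors&#x2019; SPSS analysis; the group sizes and values are hypothetical.

```python
# Sketch of the version-homogeneity screening described in the text
# (synthetic data; the Bonferroni threshold 0.05/11 follows the paper).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
version1 = rng.normal(4.5, 0.5, size=15)  # hypothetical per-rater trait means
version2 = rng.normal(4.4, 0.5, size=15)

# Normality (Kolmogorov-Smirnov against a fitted normal) and homogeneity (Levene)
ks_p = stats.kstest(
    version1, "norm", args=(version1.mean(), version1.std(ddof=1))
).pvalue
lev_p = stats.levene(version1, version2).pvalue

# Independent-samples t-test between the two versions, plus Cohen's d
t, p = stats.ttest_ind(version1, version2)
pooled_sd = np.sqrt((version1.var(ddof=1) + version2.var(ddof=1)) / 2)
d = abs(version1.mean() - version2.mean()) / pooled_sd

alpha = 0.05 / 11  # Bonferroni correction across the 11 traits
print(f"KS p={ks_p:.3f}, Levene p={lev_p:.3f}, t-test p={p:.3f}, d={d:.2f}")
```

In the paper, the one-sample comparisons against the fixed scale midpoint of 4.00 (Tables 7 and 8) would instead use a one-sample test of each trait&#x2019;s ratings against that constant.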
<table-wrap position="float" id="tab9">
<label>Table 9</label>
<caption><p>Evaluation scores for the different social traits under Version 1 and Version 2 (M &#x00B1; SD).</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top" rowspan="2">Type of traits</th>
<th align="center" valign="top" colspan="2">Versions</th>
<th align="center" valign="top" rowspan="2"><italic>t</italic></th>
<th align="center" valign="top" rowspan="2"><italic>p</italic></th>
<th align="center" valign="top" rowspan="2">Cohen&#x2019;s <italic>d</italic></th>
</tr>
<tr>
<th align="center" valign="top">1</th>
<th align="center" valign="top">2</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Trustworthiness</td>
<td align="left" valign="top">4.49 &#x00B1; 0.44</td>
<td align="left" valign="top">4.44 &#x00B1; 0.41</td>
<td align="left" valign="top">0.35</td>
<td align="left" valign="top">0.7320</td>
<td align="left" valign="top">0.12</td>
</tr>
<tr>
<td align="left" valign="top">Responsibility</td>
<td align="left" valign="top">4.61 &#x00B1; 0.57</td>
<td align="left" valign="top">4.67 &#x00B1; 0.40</td>
<td align="left" valign="top">&#x2212;0.33</td>
<td align="left" valign="top">0.7430</td>
<td align="left" valign="top">0.26</td>
</tr>
<tr>
<td align="left" valign="top">Sociability</td>
<td align="left" valign="top">4.43 &#x00B1; 0.45</td>
<td align="left" valign="top">4.41 &#x00B1; 0.46</td>
<td align="left" valign="top">0.13</td>
<td align="left" valign="top">0.8970</td>
<td align="left" valign="top">0.04</td>
</tr>
<tr>
<td align="left" valign="top">Attractiveness</td>
<td align="left" valign="top">3.75 &#x00B1; 0.67</td>
<td align="left" valign="top">3.92 &#x00B1; 0.66</td>
<td align="left" valign="top">&#x2212;0.73</td>
<td align="left" valign="top">0.4710</td>
<td align="left" valign="top">0.04</td>
</tr>
<tr>
<td align="left" valign="top">Confidence</td>
<td align="left" valign="top">4.72 &#x00B1; 0.44</td>
<td align="left" valign="top">4.57 &#x00B1; 0.43</td>
<td align="left" valign="top">1.04</td>
<td align="left" valign="top">0.3080</td>
<td align="left" valign="top">0.12</td>
</tr>
<tr>
<td align="left" valign="top">Intelligence</td>
<td align="left" valign="top">4.30 &#x00B1; 0.53</td>
<td align="left" valign="top">4.16 &#x00B1; 0.52</td>
<td align="left" valign="top">0.77</td>
<td align="left" valign="top">0.4500</td>
<td align="left" valign="top">0.21</td>
</tr>
<tr>
<td align="left" valign="top">Aggressiveness</td>
<td align="left" valign="top">3.28 &#x00B1; 0.68</td>
<td align="left" valign="top">3.41 &#x00B1; 0.59</td>
<td align="left" valign="top">&#x2212;0.57</td>
<td align="left" valign="top">0.5720</td>
<td align="left" valign="top">0.20</td>
</tr>
<tr>
<td align="left" valign="top">Dominance</td>
<td align="left" valign="top">3.73 &#x00B1; 0.61</td>
<td align="left" valign="top">3.84 &#x00B1; 0.39</td>
<td align="left" valign="top">&#x2212;0.66</td>
<td align="left" valign="top">0.5120</td>
<td align="left" valign="top">0.17</td>
</tr>
<tr>
<td align="left" valign="top">Competence</td>
<td align="left" valign="top">4.43 &#x00B1; 0.49</td>
<td align="left" valign="top">4.39 &#x00B1; 0.34</td>
<td align="left" valign="top">0.24</td>
<td align="left" valign="top">0.8150</td>
<td align="left" valign="top">0.27</td>
</tr>
<tr>
<td align="left" valign="top">Warmth</td>
<td align="left" valign="top">4.45 &#x00B1; 0.70</td>
<td align="left" valign="top">4.35 &#x00B1; 0.49</td>
<td align="left" valign="top">0.44</td>
<td align="left" valign="top">0.6630</td>
<td align="left" valign="top">0.09</td>
</tr>
<tr>
<td align="left" valign="top">Tenacity</td>
<td align="left" valign="top">4.63 &#x00B1; 0.54</td>
<td align="left" valign="top">4.61 &#x00B1; 0.36</td>
<td align="left" valign="top">0.18</td>
<td align="left" valign="top">0.8620</td>
<td align="left" valign="top">0.34</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>p &#x003C; 0.0045 was considered statistically significant</italic>.</p>
</table-wrap-foot>
</table-wrap>
</sec>
<sec id="sec12" sec-type="discussions">
<title>Discussion</title>
<p>Based on the trait assessment task, this study manipulated the intensity of the happy expressions (high, low) of target faces to explore the effect of happy emotional intensity on the social perception of Chinese faces. The results indicated that, compared to the low-intensity happy expression, the high-intensity happy expression led to higher ratings of traits related to approachability, such as sociability and warmth, but not trustworthiness. Furthermore, compared to the low-intensity happy expression, the high-intensity happy expression led to lower ratings of traits related to capability.</p>
<sec id="sec13">
<title>The Effect of the Intensities of Happy Expressions on Approachability</title>
<p>The &#x201C;approachability&#x201D; dimension represents a welcoming behavioral tendency of the target face. Happy expressions not only indicate positive emotional states but also convey friendly behavioral tendencies (<xref ref-type="bibr" rid="ref43">Montepare and Dobish, 2003</xref>). Researchers have suggested that the intensity of expression corresponds to the intensity of behavioral tendencies (<xref ref-type="bibr" rid="ref18">Ekman et al., 1980</xref>). Similar to the &#x201C;morality differentiation hypothesis&#x201D; (<xref ref-type="bibr" rid="ref24">Goodwin et al., 2014</xref>; <xref ref-type="bibr" rid="ref38">Landy et al., 2016</xref>), the &#x201C;approachability&#x201D; dimension of Chinese faces also includes two sub-dimensions: warmth and trustworthiness.</p>
<p>The results of this study showed that the sociability and warmth of high-intensity happy faces were rated higher than those of low-intensity happy faces, supporting the results of previous studies (<xref ref-type="bibr" rid="ref25">Harker and Keltner, 2001</xref>; <xref ref-type="bibr" rid="ref42">Mehu et al., 2008</xref>). Toothy smiles convey an expresser&#x2019;s tendency to build social ties and higher social intentions (<xref ref-type="bibr" rid="ref42">Mehu et al., 2008</xref>; <xref ref-type="bibr" rid="ref2">Bell et al., 2017</xref>) and increase the perceived friendliness, approachability, and warmth of the individual. Thus, positive traits associated with social skills (e.g., sociability and warmth) tend to be rated higher as the intensity of happy expressions increases. Some researchers attribute the positive effects of happy expressions of different intensities on the social perception of faces to the baby-face overgeneralization effect, which holds that adults with baby-like facial features are assumed to share the traits of infants, such as meekness, innocence, and enthusiasm. The intensity of happy expressions is associated with zygomatic muscle activity (<xref ref-type="bibr" rid="ref64">Wang et al., 2015</xref>). The typical facial features of high-intensity happy expressions (i.e., a widened nose, upturned mouth, shortened chin, and round face) resemble those of a baby&#x2019;s face (e.g., small and round, with a small jaw; <xref ref-type="bibr" rid="ref14">Dou et al., 2014</xref>). As the intensity of a happy expression increases, the facial features become more baby-like (<xref ref-type="bibr" rid="ref61">Walker et al., 2011</xref>; <xref ref-type="bibr" rid="ref64">Wang et al., 2015</xref>), and the baby-face overgeneralization effect becomes more pronounced. Therefore, high-intensity happy expressions were rated higher than low-intensity happy expressions on sociability and warmth.</p>
<p>In addition to the &#x201C;warmth&#x201D; subdimension, the &#x201C;approachability&#x201D; dimension of Chinese faces includes the subdimension of trustworthiness, which is a representative trait of valence and encompasses responsibility, attractiveness, and confidence (<xref ref-type="bibr" rid="ref48">Oosterhof and Todorov, 2008</xref>). The intensity of a happy expression did not affect the rated scores for trustworthiness, attractiveness, confidence, or responsibility. According to the perceptual fluency hypothesis (<xref ref-type="bibr" rid="ref66">Westerman et al., 2015</xref>), happy expressions of different intensities (positive emotional valence) are congruent in valence with trustworthiness, responsibility, attractiveness, and confidence (positive traits); the perceptual process is therefore simple and does not vary with the intensity of happy expressions. However, the result for trustworthiness was inconsistent with previous studies, which suggested that children could perceive different levels of facial trustworthiness based on cues from happy expressions of different intensities (25 and 50%) and that the influence of happy expressions on trustworthiness perception would be enhanced with an increase in emotional intensity (<xref ref-type="bibr" rid="ref29">Hess et al., 2000</xref>; <xref ref-type="bibr" rid="ref10">Caulfield et al., 2014</xref>, <xref ref-type="bibr" rid="ref9">2016</xref>). There may be several reasons for this conflicting result. First, the experimental materials used in previous research comprised combinations of neutral and happy facial images (<xref ref-type="bibr" rid="ref29">Hess et al., 2000</xref>; <xref ref-type="bibr" rid="ref9">Caulfield et al., 2016</xref>); this could have made the happy faces appear less natural, and the less intense happy faces even less so, thus decreasing their trustworthiness ratings. Second, differences in how trustworthiness was interpreted, compared with previous studies, might explain the inconsistency. Some researchers held that trustworthiness, warmth, and sociability have similar meanings, are used to evaluate the friendly behavioral intentions of the target face, and are associated with communality (<xref ref-type="bibr" rid="ref29">Hess et al., 2000</xref>; <xref ref-type="bibr" rid="ref21">Fiske et al., 2007</xref>; <xref ref-type="bibr" rid="ref48">Oosterhof and Todorov, 2008</xref>; <xref ref-type="bibr" rid="ref10">Caulfield et al., 2014</xref>, <xref ref-type="bibr" rid="ref9">2016</xref>). However, in Chinese culture, trustworthiness refers to a moral norm associated with correctness rather than with the development of interpersonal skills (<xref ref-type="bibr" rid="ref33">Shu et al., 2017</xref>), which supports the &#x201C;morality differentiation hypothesis&#x201D; (<xref ref-type="bibr" rid="ref24">Goodwin et al., 2014</xref>; <xref ref-type="bibr" rid="ref38">Landy et al., 2016</xref>). A highly sociable individual may not be perceived as more trustworthy.</p>
<p>The result of this study regarding attractiveness was also inconsistent with previous studies, which reported a positive correlation between the intensity of natural smiles and ratings of physical attractiveness (<xref ref-type="bibr" rid="ref23">Golle et al., 2014</xref>). The main reason for this inconsistency might lie in the technique of stimulus creation. According to the &#x201C;averageness hypothesis,&#x201D; the degree of facial averageness is the main factor affecting facial attractiveness: the more average the face, the higher its attractiveness (<xref ref-type="bibr" rid="ref39">Li and Cheng, 2010</xref>). Previous studies used average faces generated with the Psychomorph software rather than natural faces to explore the effect of smiling intensity on attractiveness. Such a design confounded averageness with smiling intensity, making it impossible to distinguish their respective effects on facial attractiveness (<xref ref-type="bibr" rid="ref23">Golle et al., 2014</xref>). When facial averageness was controlled in the present study, smiling intensity did not influence facial attractiveness.</p>
<p>The non-significant effect on the trait of responsibility might be due to its uniqueness. Responsibility refers to a positive trait characterized by effort, self-discipline, carefulness, and conscientiousness (<xref ref-type="bibr" rid="ref48">Oosterhof and Todorov, 2008</xref>; <xref ref-type="bibr" rid="ref31">Huang et al., 2014</xref>). Thus, its rating scores mainly reflect the executive power of the target individual&#x2019;s behavior rather than a behavioral tendency, and might therefore be less related to smiling.</p>
</sec>
<sec id="sec14">
<title>The Effect of the Intensities of Happy Expressions on Capability</title>
<p>The &#x201C;capability&#x201D; dimension of face evaluation represents a judgment of the target face&#x2019;s ability to carry out behavioral intentions. <xref ref-type="bibr" rid="ref70">Wu et al. (2020)</xref> found that the &#x201C;capability&#x201D; dimension comprised the traits of dominance and tenacity, which included physical and intellectual strength.</p>
<p>The results of this study showed that low-intensity smiling faces were rated as more dominant, aggressive, competent, and intelligent than high-intensity smiling faces. In general, the scores of physical strength (dominance and aggressiveness) and intellectual strength (competence and intelligence) decreased with increasing intensity of the happy expression. This may be because the &#x201C;capability&#x201D; dimension is usually related to the attainment of military/political status, and its score reflects an individual&#x2019;s competitiveness and control in a particular field (<xref ref-type="bibr" rid="ref12">Cheng et al., 2013</xref>). Compared to low-intensity smiling faces, high-intensity smiling faces have more baby-face features, and such faces signal weaker control (<xref ref-type="bibr" rid="ref36">Kraus and Chen, 2013</xref>) as well as weaker competitiveness and competence (<xref ref-type="bibr" rid="ref22">Gao et al., 2016</xref>). Additionally, regarding cultural differences, compared with Western leaders, Chinese leaders typically present a calm, slight smile (<xref ref-type="bibr" rid="ref58">Tsai et al., 2016</xref>), with more emphasis on &#x201C;smiling without showing teeth&#x201D; (<xref ref-type="bibr" rid="ref19">Fang et al., 2019</xref>). In China, smiling without showing teeth is more likely to serve as a facial cue of high competence and dominance.</p>
<p>However, although the overall trend is the same, because dominance and competence differ in meaning, the two traits were not evaluated consistently across intensities of happy expression. Dominance and aggressiveness represent physical strength and imply a threatening ability to carry out the intention to hurt others, and thus correlate negatively with valence. Therefore, the dominance and aggressiveness scores for happy faces were lower than or equal to the scale midpoints. Compared to low-intensity happy faces, high-intensity happy faces signal a stronger propensity for submissive behaviors. When people desire to build cooperative relationships with others (<xref ref-type="bibr" rid="ref42">Mehu et al., 2008</xref>; <xref ref-type="bibr" rid="ref2">Bell et al., 2017</xref>) or are in search of rapport (<xref ref-type="bibr" rid="ref28">Hennig-Thurau et al., 2006</xref>), they tend to smile more intensely. This submissive motivation is incompatible with the threatening character of the dominance trait. Therefore, the scores of dominance and aggressiveness decreased with increasing intensity of the happy expression. On the other hand, people with high competence gain social status through a high level of ability or generosity, and competence correlates positively with valence. Therefore, the competence and intelligence scores for happy faces were greater than or equal to the scale midpoints. However, high-intensity smiling faces are often taken to show that people are carefree, satisfied with the status quo, and less motivated to change and improve (<xref ref-type="bibr" rid="ref3">Bodenhausen et al., 1994</xref>). This is inconsistent with the intentions conveyed by the components of competence (e.g., high creativity and high efficiency; <xref ref-type="bibr" rid="ref21">Fiske et al., 2007</xref>); thus, high-intensity happy expressions might be facial cues for a lack of competence. In addition, target faces were found to be affected by a stronger positivity effect in the competence domain for moderate levels of behavior (<xref ref-type="bibr" rid="ref51">Rusconi et al., 2020</xref>). Therefore, compared to high-intensity happy expressions, low-intensity happy expressions, which are attributed to moderate levels of behavior, might serve as facial cues for competence.</p>
<p>However, the present study demonstrated that the intensity of the happy expression did not affect the evaluation score of tenacity, even though tenacity is usually subsumed under the &#x201C;capability&#x201D; dimension. As tenacity refers to a trait exhibited to protect oneself from harm under stress (<xref ref-type="bibr" rid="ref72">Zhang and Wang, 2011</xref>), a person with strong tenacity is more inclined toward problem-focused coping strategies than emotion-focused coping strategies (<xref ref-type="bibr" rid="ref20">Nicholls et al., 2008</xref>). Therefore, the score of tenacity might be unrelated to the intensity of happy expressions.</p>
<p>Taken together, the present study makes a worthwhile contribution to the existing literature. First, it explored how different intensities of happy expressions influenced the social perception of faces in the Chinese context. The results supported the &#x201C;morality differentiation hypothesis&#x201D; in that trustworthiness and warmth/sociability have different meanings in China. Sociability in the context of Chinese culture focuses on the development of interpersonal skills (e.g., emotional management skills and conflict resolution strategies) and is associated with communality (<xref ref-type="bibr" rid="ref71">Zhang et al., 2012</xref>). Trustworthiness, however, is considered a moral code uniquely associated with correctness (<xref ref-type="bibr" rid="ref33">Shu et al., 2017</xref>). Therefore, the intensity of happy expressions had different effects on these two traits: compared with low-intensity happy expressions, high-intensity happy expressions improved the evaluation score of sociability but did not affect the evaluation score of trustworthiness. Second, previous researchers have studied the &#x201C;morality differentiation hypothesis&#x201D; as it applies to top-down stereotype content processing and the processing of familiar groups (<xref ref-type="bibr" rid="ref24">Goodwin et al., 2014</xref>; <xref ref-type="bibr" rid="ref38">Landy et al., 2016</xref>), and have highlighted the distinct role of trustworthiness in face perception from a bottom-up perspective (<xref ref-type="bibr" rid="ref37">Krumhuber et al., 2007</xref>; <xref ref-type="bibr" rid="ref57">Todorov et al., 2015</xref>; <xref ref-type="bibr" rid="ref56">Todorov and Oh, 2021</xref>). Compared with those studies, the present study distinguished trustworthiness from sociability through trait assessment tasks in first impressions of strangers smiling at different intensities, adding further supporting evidence for the &#x201C;morality differentiation hypothesis.&#x201D; Third, the present study used natural face photographs, which afford greater ecological validity than the computer-generated faces and composite images used in previous studies, and this revealed the novel finding that trustworthiness and attractiveness ratings were not affected by the intensity of happiness. Fourth, the present study showed differences between physical and intellectual strength: the physical strength rating for low-intensity happy expressions was equal to the scale midpoint, whereas the intellectual strength rating was higher than the scale midpoint; similarly, for high-intensity happy expressions, the physical strength rating was lower than the scale midpoint, whereas the intellectual strength rating did not differ significantly from it. Fifth, the present study described how the intensity of happy expressions influenced 11 traits: trustworthiness, responsibility, attractiveness, sociability, confidence, intelligence, aggressiveness, dominance, competence, warmth, and tenacity. This provides practical suggestions for people&#x2019;s daily communication, as well as hints for researchers interested in further research on one or several of these traits.</p>
<p>Although the present study produced several interesting findings, it has several limitations. First, this study selected only happy expressions and thus lacked negative and neutral expressions as comparison groups. Further research should compare the effects of positive, negative, and neutral expressions on personality trait assessment. Second, this study adopted a within-subjects design similar to the studies of <xref ref-type="bibr" rid="ref61">Walker et al. (2011)</xref> and <xref ref-type="bibr" rid="ref63">Wang et al. (2019)</xref>, in which perceiver bias in the social perception of faces can be controlled; however, participants&#x2019; evaluation of one trait has been found to affect their judgment of another. To control for this, the 11 traits in this study were divided into 11 blocks presented to participants in random order, with a mandatory rest of 60 s between blocks in addition to freely regulated rest time. This controlled the carry-over influence of evaluating one trait on the judgment of another and also reduced participant fatigue. Future studies should adopt a between-subjects design to verify the stability of these results. Third, because the dependent variables in this study had many levels, multiple statistical analyses were conducted. Although corrections were applied, this does not eliminate the possibility that effects were overstated or understated as a result of the multiple comparisons; future studies should conduct further targeted tests of these effects. Fourth, this study addressed a gap in previous research by examining how different intensities (low vs. high) of happy facial expressions affected the ascription of 11 traits to Chinese faces. However, the current study lacked a direct comparison between Chinese and Western faces and participants; therefore, further research that directly compares the underlying cultural differences in how different intensities of happy expressions affect the social perception of faces is necessary. Fifth, this was an exploratory experiment, and future research needs to recruit more participants to replicate these results.</p>
</sec>
</sec>
<sec id="sec15" sec-type="conclusions">
<title>Conclusion</title>
<p>In summary, the present study revealed that happy expressions of different intensities (high or low) had different effects on the social perception of Chinese faces among Chinese participants. This was mainly manifested in high-intensity happy expressions receiving higher scores than low-intensity happy expressions for sociability and warmth in the &#x201C;approachability&#x201D; dimension, and lower scores in the &#x201C;capability&#x201D; dimension (e.g., dominance, competence, and intelligence).</p>
</sec>
<sec id="sec16">
<title>Data Availability Statement</title>
<p>The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding authors.</p>
</sec>
<sec id="sec17">
<title>Ethics Statement</title>
<p>The studies involving human participants were reviewed and approved by the Ethics Committee of Liaoning Normal University. The patients/participants provided their written informed consent to participate in this study. Written informed consent was obtained from the individual(s) for the publication of any potentially identifiable images or data included in this article.</p>
</sec>
<sec id="sec18">
<title>Author Contributions</title>
<p>ZJ conceived this study. YL and YY participated in writing and revising the manuscript. YL and HL participated in performing the study. FP and QW participated in modifying the manuscript. All authors contributed to the article and approved the submitted version.</p>
<sec id="conf1" sec-type="COI-statement">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
</sec>
</body>
<back>
<ack>
<p>We thank all the students who participated in the research. We would also like to thank Editage (<ext-link xlink:href="http://www.editage.cn" ext-link-type="uri">www.editage.cn</ext-link>) for English language editing.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="ref1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Beall</surname> <given-names>A. E.</given-names></name></person-group> (<year>2007</year>). <article-title>Can a new smile make you look more intelligent and successful?</article-title> <source>Dent. Clin. N. Am.</source> <volume>51</volume>, <fpage>289</fpage>&#x2013;<lpage>297</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.cden.2007.02.002</pub-id>, PMID: <pub-id pub-id-type="pmid">17532913</pub-id></citation></ref>
<ref id="ref2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bell</surname> <given-names>R.</given-names></name> <name><surname>Koranyi</surname> <given-names>N.</given-names></name> <name><surname>Buchner</surname> <given-names>A.</given-names></name> <name><surname>Rothermund</surname> <given-names>K.</given-names></name></person-group> (<year>2017</year>). <article-title>The implicit cognition of reciprocal exchange: automatic retrieval of positive and negative experiences with partners in a prisoner&#x2019;s dilemma game</article-title>. <source>Cognit. Emot.</source> <volume>31</volume>, <fpage>657</fpage>&#x2013;<lpage>670</lpage>. doi: <pub-id pub-id-type="doi">10.1080/02699931.2016.1147423</pub-id>, PMID: <pub-id pub-id-type="pmid">26934367</pub-id></citation></ref>
<ref id="ref3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bodenhausen</surname> <given-names>G. V.</given-names></name> <name><surname>Kramer</surname> <given-names>G. P.</given-names></name> <name><surname>S&#x00FC;sser</surname> <given-names>K.</given-names></name></person-group> (<year>1994</year>). <article-title>Happiness and stereotypic thinking in social judgment</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>66</volume>, <fpage>621</fpage>&#x2013;<lpage>632</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0022-3514.66.4.621</pub-id></citation></ref>
<ref id="ref4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brambilla</surname> <given-names>M.</given-names></name> <name><surname>Leach</surname> <given-names>C. W.</given-names></name></person-group> (<year>2014</year>). <article-title>On the importance of being moral: the distinctive role of morality in social judgment</article-title>. <source>Soc. Cogn.</source> <volume>32</volume>, <fpage>397</fpage>&#x2013;<lpage>408</lpage>. doi: <pub-id pub-id-type="doi">10.1521/soco.2014.32.4.397</pub-id></citation></ref>
<ref id="ref5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brambilla</surname> <given-names>M.</given-names></name> <name><surname>Rusconi</surname> <given-names>P.</given-names></name> <name><surname>Sacchi</surname> <given-names>S.</given-names></name> <name><surname>Cherubini</surname> <given-names>P.</given-names></name></person-group> (<year>2011</year>). <article-title>Looking for honesty: the primary role of morality (vs. sociability and competence) in information gathering</article-title>. <source>Eur. J. Soc. Psychol.</source> <volume>41</volume>, <fpage>135</fpage>&#x2013;<lpage>143</lpage>. doi: <pub-id pub-id-type="doi">10.1002/ejsp.744</pub-id></citation></ref>
<ref id="ref6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brambilla</surname> <given-names>M.</given-names></name> <name><surname>Sacchi</surname> <given-names>S.</given-names></name> <name><surname>Rusconi</surname> <given-names>P.</given-names></name> <name><surname>Cherubini</surname> <given-names>P.</given-names></name> <name><surname>Yzerbyt</surname> <given-names>V. Y.</given-names></name></person-group> (<year>2012</year>). <article-title>You want to give a good impression? Be honest! Moral traits dominate group impression formation</article-title>. <source>Br. J. Soc. Psychol.</source> <volume>51</volume>, <fpage>149</fpage>&#x2013;<lpage>166</lpage>. doi: <pub-id pub-id-type="doi">10.1111/j.2044-8309.2010.02011.x</pub-id>, PMID: <pub-id pub-id-type="pmid">22435848</pub-id></citation></ref>
<ref id="ref7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brambilla</surname> <given-names>M.</given-names></name> <name><surname>Sacchi</surname> <given-names>S.</given-names></name> <name><surname>Rusconi</surname> <given-names>P.</given-names></name> <name><surname>Goodwin</surname> <given-names>G.</given-names></name></person-group> (<year>2021</year>). <article-title>The primacy of morality in impression development: theory, research, and future directions</article-title>. <source>Adv. Exp. Soc. Psychol.</source> Advance online publication. doi: <pub-id pub-id-type="doi">10.1016/bs.aesp.2021.03.001</pub-id></citation></ref>
<ref id="ref8"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Brysbaert</surname> <given-names>M.</given-names></name></person-group> (<year>2019</year>). <article-title>How many participants do we have to include in properly powered experiments? A tutorial of power analysis with reference tables</article-title>. <source>J. Cogn.</source> <volume>2</volume>, <fpage>16</fpage>&#x2013;<lpage>38</lpage>. doi: <pub-id pub-id-type="doi">10.5334/joc.72</pub-id>, PMID: <pub-id pub-id-type="pmid">31517234</pub-id></citation></ref>
<ref id="ref9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Caulfield</surname> <given-names>F.</given-names></name> <name><surname>Ewing</surname> <given-names>L.</given-names></name> <name><surname>Bank</surname> <given-names>S.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>2016</year>). <article-title>Judging trustworthiness from faces: emotion cues modulate trustworthiness judgments in young children</article-title>. <source>Br. J. Psychol.</source> <volume>107</volume>, <fpage>503</fpage>&#x2013;<lpage>518</lpage>. doi: <pub-id pub-id-type="doi">10.1111/bjop.12156</pub-id>, PMID: <pub-id pub-id-type="pmid">26493772</pub-id></citation></ref>
<ref id="ref10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Caulfield</surname> <given-names>F.</given-names></name> <name><surname>Ewing</surname> <given-names>L.</given-names></name> <name><surname>Burton</surname> <given-names>N.</given-names></name> <name><surname>Avard</surname> <given-names>E.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>2014</year>). <article-title>Facial trustworthiness judgments in children with ASD are modulated by happy and angry emotional cues</article-title>. <source>PLoS One</source> <volume>9</volume>:<fpage>e97644</fpage>. doi: <pub-id pub-id-type="doi">10.1371/journal.pone.0097644</pub-id>, PMID: <pub-id pub-id-type="pmid">24878763</pub-id></citation></ref>
<ref id="ref11"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>L. F.</given-names></name> <name><surname>Yen</surname> <given-names>Y. S.</given-names></name></person-group> (<year>2007</year>). <source>Taiwanese Facial Expression Image Database.</source> <publisher-loc>Brain Mapping Laboratory</publisher-loc>, <publisher-name>Institute of Brain Science, National Yang-Ming University, Taipei, Taiwan</publisher-name>.</citation></ref>
<ref id="ref12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cheng</surname> <given-names>J. T.</given-names></name> <name><surname>Tracy</surname> <given-names>J. L.</given-names></name> <name><surname>Foulsham</surname> <given-names>T.</given-names></name> <name><surname>Kingstone</surname> <given-names>A.</given-names></name> <name><surname>Henrich</surname> <given-names>J.</given-names></name></person-group> (<year>2013</year>). <article-title>Two ways to the top: evidence that dominance and prestige are distinct yet viable avenues to social rank and influence</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>104</volume>, <fpage>103</fpage>&#x2013;<lpage>125</lpage>. doi: <pub-id pub-id-type="doi">10.1037/a0030398</pub-id>, PMID: <pub-id pub-id-type="pmid">23163747</pub-id></citation></ref>
<ref id="ref13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Waal</surname> <given-names>F. B.</given-names></name> <name><surname>Luttrell</surname> <given-names>L. M.</given-names></name></person-group> (<year>1985</year>). <article-title>The formal hierarchy of rhesus macaques: an investigation of the bared-teeth display</article-title>. <source>Am. J. Primatol.</source> <volume>9</volume>, <fpage>73</fpage>&#x2013;<lpage>85</lpage>. doi: <pub-id pub-id-type="doi">10.1002/ajp.1350090202</pub-id>, PMID: <pub-id pub-id-type="pmid">32102494</pub-id></citation></ref>
<ref id="ref14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dou</surname> <given-names>D. H.</given-names></name> <name><surname>Liu</surname> <given-names>X. C.</given-names></name> <name><surname>Zhang</surname> <given-names>Y. J.</given-names></name></person-group> (<year>2014</year>). <article-title>Babyface effect: babyface preference and overgeneralization</article-title>. <source>Adv. Psychol. Sci.</source> <volume>22</volume>, <fpage>760</fpage>&#x2013;<lpage>771</lpage>. doi: <pub-id pub-id-type="doi">10.3724/SP.J.1042.2014.00760</pub-id></citation></ref>
<ref id="ref15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Du</surname> <given-names>S.</given-names></name> <name><surname>Tao</surname> <given-names>Y.</given-names></name> <name><surname>Martinez</surname> <given-names>A. M.</given-names></name></person-group> (<year>2014</year>). <article-title>Compound facial expressions of emotion</article-title>. <source>Proc. Natl. Acad. Sci.</source> <volume>111</volume>, <fpage>E1454</fpage>&#x2013;<lpage>E1462</lpage>. doi: <pub-id pub-id-type="doi">10.1073/pnas.1322355111</pub-id>, PMID: <pub-id pub-id-type="pmid">24706770</pub-id></citation></ref>
<ref id="ref17"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ekman</surname> <given-names>P.</given-names></name></person-group> (<year>1993</year>). <article-title>Facial expression and emotion</article-title>. <source>Am. Psychol.</source> <volume>48</volume>, <fpage>384</fpage>&#x2013;<lpage>392</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0003-066X.48.4.384</pub-id></citation></ref>
<ref id="ref18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ekman</surname> <given-names>P.</given-names></name> <name><surname>Friesen</surname> <given-names>W. V.</given-names></name> <name><surname>Ancoli</surname> <given-names>S.</given-names></name></person-group> (<year>1980</year>). <article-title>Facial signs of emotional experience</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>39</volume>, <fpage>1125</fpage>&#x2013;<lpage>1134</lpage>. doi: <pub-id pub-id-type="doi">10.1037/h0077722</pub-id></citation></ref>
<ref id="ref16"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Ekman</surname> <given-names>P.</given-names></name> <name><surname>Friesen</surname> <given-names>W. V.</given-names></name> <name><surname>Hager</surname> <given-names>J.</given-names></name></person-group> (<year>2002</year>). <article-title>Emotional facial action coding system</article-title>. Manual and Investigators Guide. CD-ROM.</citation></ref>
<ref id="ref19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fang</surname> <given-names>X.</given-names></name> <name><surname>Sauter</surname> <given-names>D. A.</given-names></name> <name><surname>van Kleef</surname> <given-names>G. A.</given-names></name></person-group> (<year>2019</year>). <article-title>Unmasking smiles: the influence of culture and intensity on interpretations of smiling expressions</article-title>. <source>J. Cult. Cogn. Sci.</source> <volume>4</volume>, <fpage>293</fpage>&#x2013;<lpage>308</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s41809-019-00053-1</pub-id></citation></ref>
<ref id="ref21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fiske</surname> <given-names>S. T.</given-names></name> <name><surname>Cuddy</surname> <given-names>A. J.</given-names></name> <name><surname>Glick</surname> <given-names>P.</given-names></name></person-group> (<year>2007</year>). <article-title>Universal dimensions of social cognition: warmth and competence</article-title>. <source>Trends Cogn. Sci.</source> <volume>11</volume>, <fpage>77</fpage>&#x2013;<lpage>83</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.tics.2006.11.005</pub-id>, PMID: <pub-id pub-id-type="pmid">17188552</pub-id></citation></ref>
<ref id="ref22"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gao</surname> <given-names>L.</given-names></name> <name><surname>Ye</surname> <given-names>M. L.</given-names></name> <name><surname>Peng</surname> <given-names>J.</given-names></name> <name><surname>Chen</surname> <given-names>Y. S.</given-names></name></person-group> (<year>2016</year>). <article-title>How important is facial appearance to leadership? A literature review of leaders&#x2019; facial appearance</article-title>. <source>Psychol. Sci.</source> <volume>39</volume>, <fpage>992</fpage>&#x2013;<lpage>997</lpage>. doi: <pub-id pub-id-type="doi">10.16719/j.cnki.1671-6981.20160434</pub-id></citation></ref>
<ref id="ref23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Golle</surname> <given-names>J.</given-names></name> <name><surname>Mast</surname> <given-names>F. W.</given-names></name> <name><surname>Lobmaier</surname> <given-names>J. S.</given-names></name></person-group> (<year>2014</year>). <article-title>Something to smile about: the interrelationship between attractiveness and emotional expression</article-title>. <source>Cognit. Emot.</source> <volume>28</volume>, <fpage>298</fpage>&#x2013;<lpage>310</lpage>. doi: <pub-id pub-id-type="doi">10.1080/02699931.2013.817383</pub-id>, PMID: <pub-id pub-id-type="pmid">23875865</pub-id></citation></ref>
<ref id="ref24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Goodwin</surname> <given-names>G. P.</given-names></name> <name><surname>Piazza</surname> <given-names>J.</given-names></name> <name><surname>Rozin</surname> <given-names>P.</given-names></name></person-group> (<year>2014</year>). <article-title>Moral character predominates in person perception and evaluation</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>106</volume>, <fpage>148</fpage>&#x2013;<lpage>168</lpage>. doi: <pub-id pub-id-type="doi">10.1037/a0034726</pub-id>, PMID: <pub-id pub-id-type="pmid">24274087</pub-id></citation></ref>
<ref id="ref25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Harker</surname> <given-names>L.</given-names></name> <name><surname>Keltner</surname> <given-names>D.</given-names></name></person-group> (<year>2001</year>). <article-title>Expressions of positive emotion in women&#x2019;s college yearbook pictures and their relationship to personality and life outcomes across adulthood</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>80</volume>, <fpage>112</fpage>&#x2013;<lpage>124</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0022-3514.80.1.112</pub-id>, PMID: <pub-id pub-id-type="pmid">11195884</pub-id></citation></ref>
<ref id="ref26"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haxby</surname> <given-names>J. V.</given-names></name> <name><surname>Hoffman</surname> <given-names>E. A.</given-names></name> <name><surname>Gobbini</surname> <given-names>M. I.</given-names></name></person-group> (<year>2000</year>). <article-title>The distributed human neural system for face perception</article-title>. <source>Trends Cogn. Sci.</source> <volume>4</volume>, <fpage>223</fpage>&#x2013;<lpage>233</lpage>. doi: <pub-id pub-id-type="doi">10.1016/S1364-6613(00)01482-0</pub-id>, PMID: <pub-id pub-id-type="pmid">10827445</pub-id></citation></ref>
<ref id="ref27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hehman</surname> <given-names>E.</given-names></name> <name><surname>Stolier</surname> <given-names>R. M.</given-names></name> <name><surname>Freeman</surname> <given-names>J. B.</given-names></name> <name><surname>Flake</surname> <given-names>J. K.</given-names></name> <name><surname>Xie</surname> <given-names>S. Y.</given-names></name></person-group> (<year>2019</year>). <article-title>Toward a comprehensive model of face impressions: what we know, what we do not, and paths forward</article-title>. <source>Soc. Personal. Psychol. Compass</source> <volume>13</volume>:<fpage>e12431</fpage>. doi: <pub-id pub-id-type="doi">10.1111/spc3.12431</pub-id></citation></ref>
<ref id="ref28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hennig-Thurau</surname> <given-names>T.</given-names></name> <name><surname>Groth</surname> <given-names>M.</given-names></name> <name><surname>Paul</surname> <given-names>M.</given-names></name> <name><surname>Gremler</surname> <given-names>D. D.</given-names></name></person-group> (<year>2006</year>). <article-title>Are all smiles created equal? How emotional contagion and emotional labor affect service relationships</article-title>. <source>J. Mark.</source> <volume>70</volume>, <fpage>58</fpage>&#x2013;<lpage>73</lpage>. doi: <pub-id pub-id-type="doi">10.1509/jmkg.70.3.58</pub-id></citation></ref>
<ref id="ref29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hess</surname> <given-names>U.</given-names></name> <name><surname>Blairy</surname> <given-names>S.</given-names></name> <name><surname>Kleck</surname> <given-names>R. E.</given-names></name></person-group> (<year>2000</year>). <article-title>The influence of facial emotion displays, gender, and ethnicity on judgments of dominance and affiliation</article-title>. <source>J. Nonverbal Behav.</source> <volume>24</volume>, <fpage>265</fpage>&#x2013;<lpage>283</lpage>. doi: <pub-id pub-id-type="doi">10.1023/A:1006623213355</pub-id></citation></ref>
<ref id="ref30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hu</surname> <given-names>Y.</given-names></name> <name><surname>Zhang</surname> <given-names>Y.</given-names></name> <name><surname>Chen</surname> <given-names>H.</given-names></name></person-group> (<year>2018</year>). <article-title>The effect of target sex, sexual dimorphism, and facial attractiveness on perceptions of target attractiveness and trustworthiness</article-title>. <source>Front. Psychol.</source> <volume>9</volume>:<fpage>942</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2018.00942</pub-id>, PMID: <pub-id pub-id-type="pmid">29937750</pub-id></citation></ref>
<ref id="ref31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Huang</surname> <given-names>Z. H.</given-names></name> <name><surname>Bai</surname> <given-names>X. W.</given-names></name> <name><surname>Lin</surname> <given-names>L.</given-names></name> <name><surname>Song</surname> <given-names>Y.</given-names></name></person-group> (<year>2014</year>). <article-title>The mechanisms through which conscientiousness and neuroticism influence procrastination</article-title>. <source>Chin. J. Clin. Psych.</source> <volume>22</volume>, <fpage>140</fpage>&#x2013;<lpage>144</lpage>. doi: <pub-id pub-id-type="doi">10.16128/j.cnki.1005-3611.2014.01.006</pub-id></citation></ref>
<ref id="ref32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jaeger</surname> <given-names>B.</given-names></name> <name><surname>Todorov</surname> <given-names>A. T.</given-names></name> <name><surname>Evans</surname> <given-names>A. M.</given-names></name> <name><surname>Beest</surname> <given-names>I. V.</given-names></name></person-group> (<year>2020</year>). <article-title>Can we reduce facial biases? Persistent effects of facial trustworthiness on sentencing decisions</article-title>. <source>J. Exp. Soc. Psychol.</source> <volume>90</volume>, <fpage>104004</fpage>&#x2013;<lpage>104012</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jesp.2020.104004</pub-id></citation></ref>
<ref id="ref34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jaeger</surname> <given-names>B.</given-names></name> <name><surname>Todorov</surname> <given-names>A. T.</given-names></name> <name><surname>Evans</surname> <given-names>A. M.</given-names></name> <name><surname>van Beest</surname> <given-names>I.</given-names></name></person-group> (<year>2020</year>). <article-title>Can we reduce facial biases? Persistent effects of facial trustworthiness on sentencing decisions</article-title>. <source>J. Exp. Soc. Psychol.</source> <volume>90</volume>, <fpage>104004</fpage>&#x2013;<lpage>104012</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.jesp.2020.104004</pub-id></citation></ref>
<ref id="ref35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kim</surname> <given-names>S. H.</given-names></name> <name><surname>Ryu</surname> <given-names>V.</given-names></name> <name><surname>Ha</surname> <given-names>R. Y.</given-names></name> <name><surname>Lee</surname> <given-names>S. J.</given-names></name> <name><surname>Cho</surname> <given-names>H. S.</given-names></name></person-group> (<year>2016</year>). <article-title>Perceptions of social dominance through facial emotion expressions in euthymic patients with bipolar I disorder</article-title>. <source>Compr. Psychiatry</source> <volume>66</volume>, <fpage>193</fpage>&#x2013;<lpage>200</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.comppsych.2016.01.012</pub-id>, PMID: <pub-id pub-id-type="pmid">26995253</pub-id></citation></ref>
<ref id="ref36"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kraus</surname> <given-names>M. W.</given-names></name> <name><surname>Chen</surname> <given-names>T. W. D.</given-names></name></person-group> (<year>2013</year>). <article-title>A winning smile? Smile intensity, physical dominance, and fighter performance</article-title>. <source>Emotion</source> <volume>13</volume>, <fpage>270</fpage>&#x2013;<lpage>279</lpage>. doi: <pub-id pub-id-type="doi">10.1037/a0030745</pub-id>, PMID: <pub-id pub-id-type="pmid">23356564</pub-id></citation></ref>
<ref id="ref37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Krumhuber</surname> <given-names>E.</given-names></name> <name><surname>Manstead</surname> <given-names>A.</given-names></name> <name><surname>Cosker</surname> <given-names>D.</given-names></name> <name><surname>Marshall</surname> <given-names>D.</given-names></name> <name><surname>Rosin</surname> <given-names>P. L.</given-names></name> <name><surname>Kappas</surname> <given-names>A.</given-names></name></person-group> (<year>2007</year>). <article-title>Facial dynamics as indicators of trustworthiness and cooperative behavior</article-title>. <source>Emotion</source> <volume>7</volume>, <fpage>730</fpage>&#x2013;<lpage>735</lpage>. doi: <pub-id pub-id-type="doi">10.1037/1528-3542.7.4.730</pub-id>, PMID: <pub-id pub-id-type="pmid">18039040</pub-id></citation></ref>
<ref id="ref38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Landy</surname> <given-names>J. F.</given-names></name> <name><surname>Piazza</surname> <given-names>J.</given-names></name> <name><surname>Goodwin</surname> <given-names>G. P.</given-names></name></person-group> (<year>2016</year>). <article-title>When it&#x2019;s bad to be friendly and smart: the desirability of sociability and competence depends on morality</article-title>. <source>Personal. Soc. Psychol. Bull.</source> <volume>42</volume>, <fpage>1272</fpage>&#x2013;<lpage>1290</lpage>. doi: <pub-id pub-id-type="doi">10.1177/0146167216655984</pub-id>, PMID: <pub-id pub-id-type="pmid">27407101</pub-id></citation></ref>
<ref id="ref39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>O.</given-names></name> <name><surname>Cheng</surname> <given-names>H.</given-names></name></person-group> (<year>2010</year>). <article-title>The retrospect and prospect of facial attractiveness</article-title>. <source>Adv. Psychol. Sci.</source> <volume>30</volume>, <fpage>521</fpage>&#x2013;<lpage>528</lpage>. doi: <pub-id pub-id-type="doi">10.3724/SP.J.1142.2010.40521</pub-id></citation></ref>
<ref id="ref40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>Y. N.</given-names></name> <name><surname>Jiang</surname> <given-names>Z. Q.</given-names></name> <name><surname>Wu</surname> <given-names>Q.</given-names></name> <name><surname>Leng</surname> <given-names>H. Z.</given-names></name> <name><surname>Li</surname> <given-names>D.</given-names></name></person-group> (<year>2020</year>). <article-title>Effect of happy and neutral expression on social perceptions of faces in young adults</article-title>. <source>Chin. Ment. Health J.</source> <volume>34</volume>, <fpage>613</fpage>&#x2013;<lpage>619</lpage>. doi: <pub-id pub-id-type="doi">10.3969/j.issn.1000-6729.2020.7.011</pub-id></citation></ref>
<ref id="ref41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Liu</surname> <given-names>C.</given-names></name> <name><surname>Ge</surname> <given-names>Y.</given-names></name> <name><surname>Luo</surname> <given-names>W. B.</given-names></name> <name><surname>Luo</surname> <given-names>Y. J.</given-names></name></person-group> (<year>2010</year>). <article-title>Show your teeth or not: the role of the mouth and eyes in smiles and its cross-cultural variations</article-title>. <source>Behav. Brain Sci.</source> <volume>33</volume>, <fpage>450</fpage>&#x2013;<lpage>452</lpage>. doi: <pub-id pub-id-type="doi">10.1017/S0140525X10001263</pub-id></citation></ref>
<ref id="ref42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mehu</surname> <given-names>M.</given-names></name> <name><surname>Little</surname> <given-names>A. C.</given-names></name> <name><surname>Dunbar</surname> <given-names>R. I. M.</given-names></name></person-group> (<year>2008</year>). <article-title>Sex differences in the effect of smiling on social judgments: an evolutionary approach</article-title>. <source>J. Soc. Evol. Cult. Psychol.</source> <volume>2</volume>, <fpage>103</fpage>&#x2013;<lpage>121</lpage>. doi: <pub-id pub-id-type="doi">10.1037/h0099351</pub-id></citation></ref>
<ref id="ref43"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Montepare</surname> <given-names>J. M.</given-names></name> <name><surname>Dobish</surname> <given-names>H.</given-names></name></person-group> (<year>2003</year>). <article-title>The contribution of emotion perceptions and their overgeneralizations to trait impressions</article-title>. <source>J. Nonverbal Behav.</source> <volume>27</volume>, <fpage>237</fpage>&#x2013;<lpage>254</lpage>. doi: <pub-id pub-id-type="doi">10.1023/A:1027332800296</pub-id>, PMID: <pub-id pub-id-type="pmid">20085393</pub-id></citation></ref>
<ref id="ref44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Na</surname> <given-names>J.</given-names></name> <name><surname>Huh</surname> <given-names>J.</given-names></name></person-group> (<year>2016</year>). <article-title>Facial inferences of social relations predicted Korean elections better than did facial inferences of competence</article-title>. <source>Korean J. Soc. Person. Psychol.</source> <volume>30</volume>, <fpage>37</fpage>&#x2013;<lpage>49</lpage>. doi: <pub-id pub-id-type="doi">10.21193/kjspp.2016.30.4.003</pub-id></citation></ref>
<ref id="ref20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nicholls</surname> <given-names>A. R.</given-names></name> <name><surname>Polman</surname> <given-names>R.</given-names></name> <name><surname>Levy</surname> <given-names>A. R.</given-names></name> <name><surname>Backhouse</surname> <given-names>S. H.</given-names></name></person-group> (<year>2008</year>). <article-title>Mental toughness, optimism, pessimism, and coping among athletes</article-title>. <source>Pers. Individ. Dif.</source> <volume>44</volume>, <fpage>1182</fpage>&#x2013;<lpage>1192</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.paid.2007.11.011</pub-id></citation></ref>
<ref id="ref45"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Nunnally</surname> <given-names>J. C.</given-names></name></person-group> (<year>1978</year>). <source>Psychometric Theory.</source> <publisher-loc>New York</publisher-loc>: <publisher-name>McGraw&#x2013;Hill</publisher-name>.</citation></ref>
<ref id="ref46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Oliveira</surname> <given-names>M.</given-names></name> <name><surname>Garcia-Marques</surname> <given-names>T.</given-names></name> <name><surname>Garcia-Marques</surname> <given-names>L.</given-names></name> <name><surname>Dotsch</surname> <given-names>R.</given-names></name></person-group> (<year>2020</year>). <article-title>Good to bad or bad to bad? What is the relationship between valence and the trait content of the big two?</article-title> <source>Eur. J. Soc. Psychol.</source> <volume>50</volume>, <fpage>463</fpage>&#x2013;<lpage>483</lpage>. doi: <pub-id pub-id-type="doi">10.1002/ejsp.2618</pub-id></citation></ref>
<ref id="ref47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Olivola</surname> <given-names>C. Y.</given-names></name> <name><surname>Funk</surname> <given-names>F.</given-names></name> <name><surname>Todorov</surname> <given-names>A.</given-names></name></person-group> (<year>2014</year>). <article-title>Social attributions from faces bias human choices</article-title>. <source>Trends Cogn. Sci.</source> <volume>18</volume>, <fpage>566</fpage>&#x2013;<lpage>570</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.tics.2014.09.007</pub-id>, PMID: <pub-id pub-id-type="pmid">25344029</pub-id></citation></ref>
<ref id="ref48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Oosterhof</surname> <given-names>N. N.</given-names></name> <name><surname>Todorov</surname> <given-names>A.</given-names></name></person-group> (<year>2008</year>). <article-title>The functional basis of face evaluation</article-title>. <source>Proc. Natl. Acad. Sci. U. S. A.</source> <volume>105</volume>, <fpage>11087</fpage>&#x2013;<lpage>11092</lpage>. doi: <pub-id pub-id-type="doi">10.1073/pnas.0805664105</pub-id>, PMID: <pub-id pub-id-type="pmid">18685089</pub-id></citation></ref>
<ref id="ref49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Parr</surname> <given-names>L. A.</given-names></name> <name><surname>Waller</surname> <given-names>B. M.</given-names></name></person-group> (<year>2006</year>). <article-title>Understanding chimpanzee facial expression: insights into the evolution of communication</article-title>. <source>Soc. Cogn. Affect. Neurosci.</source> <volume>1</volume>, <fpage>221</fpage>&#x2013;<lpage>228</lpage>. doi: <pub-id pub-id-type="doi">10.1093/scan/nsl031</pub-id>, PMID: <pub-id pub-id-type="pmid">18985109</pub-id></citation></ref>
<ref id="ref50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rezlescu</surname> <given-names>C.</given-names></name> <name><surname>Penton</surname> <given-names>T.</given-names></name> <name><surname>Walsh</surname> <given-names>V.</given-names></name> <name><surname>Tsujimura</surname> <given-names>H.</given-names></name> <name><surname>Scott</surname> <given-names>S. K.</given-names></name> <name><surname>Banissy</surname> <given-names>M. J.</given-names></name></person-group> (<year>2015</year>). <article-title>Dominant voices and attractive faces: the contribution of visual and auditory information to integrated person impressions</article-title>. <source>J. Nonverbal Behav.</source> <volume>39</volume>, <fpage>355</fpage>&#x2013;<lpage>370</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s10919-015-0214-8</pub-id></citation></ref>
<ref id="ref51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rusconi</surname> <given-names>P.</given-names></name> <name><surname>Sacchi</surname> <given-names>S.</given-names></name> <name><surname>Brambilla</surname> <given-names>M.</given-names></name> <name><surname>Capellini</surname> <given-names>R.</given-names></name> <name><surname>Cherubini</surname> <given-names>P.</given-names></name></person-group> (<year>2020</year>). <article-title>Being honest and acting consistently: boundary conditions of the negativity effect in the attribution of morality</article-title>. <source>Soc. Cogn.</source> <volume>38</volume>, <fpage>146</fpage>&#x2013;<lpage>178</lpage>. doi: <pub-id pub-id-type="doi">10.1521/soco.2020.38.2.146</pub-id></citation></ref>
<ref id="ref52"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Sandy</surname> <given-names>C.</given-names></name> <name><surname>Rusconi</surname> <given-names>P.</given-names></name> <name><surname>Li</surname> <given-names>S.</given-names></name></person-group> (<year>2017</year>). &#x201C;<article-title>Can humans detect the authenticity of social media accounts? On the impact of verbal and non-verbal cues on credibility judgements of Twitter profiles</article-title>,&#x201D; in <source>IEEE International Conference on Cybernetics</source>; June 2017; IEEE.</citation></ref>
<ref id="ref33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shu</surname> <given-names>D.</given-names></name> <name><surname>Shen</surname> <given-names>S.</given-names></name> <name><surname>Huang</surname> <given-names>Y.</given-names></name></person-group> (<year>2017</year>). <article-title>Tao, virtue, benevolence, righteousness and propriety: on the core values of Shu school</article-title>. <source>Contemp. Soc. Sci.</source> <volume>3</volume>, <fpage>68</fpage>&#x2013;<lpage>85</lpage>. doi: <pub-id pub-id-type="doi">10.19873/j.cnki.2096-0212.2017.03.007</pub-id></citation></ref>
<ref id="ref53"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sutherland</surname> <given-names>C. A.</given-names></name> <name><surname>Liu</surname> <given-names>X.</given-names></name> <name><surname>Zhang</surname> <given-names>L.</given-names></name> <name><surname>Chu</surname> <given-names>Y.</given-names></name> <name><surname>Oldmeadow</surname> <given-names>J. A.</given-names></name> <name><surname>Young</surname> <given-names>A. W.</given-names></name></person-group> (<year>2018</year>). <article-title>Facial first impressions across culture: data-driven modeling of Chinese and British perceivers&#x2019; unconstrained facial impressions</article-title>. <source>Personal. Soc. Psychol. Bull.</source> <volume>44</volume>, <fpage>521</fpage>&#x2013;<lpage>537</lpage>. doi: <pub-id pub-id-type="doi">10.1177/0146167217744194</pub-id>, PMID: <pub-id pub-id-type="pmid">29226785</pub-id></citation></ref>
<ref id="ref54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sutherland</surname> <given-names>C. A.</given-names></name> <name><surname>Young</surname> <given-names>A. W.</given-names></name> <name><surname>Rhodes</surname> <given-names>G.</given-names></name></person-group> (<year>2017</year>). <article-title>Facial first impressions from another angle: how social judgements are influenced by changeable and invariant facial properties</article-title>. <source>Br. J. Psychol.</source> <volume>108</volume>, <fpage>397</fpage>&#x2013;<lpage>415</lpage>. doi: <pub-id pub-id-type="doi">10.1111/bjop.12206</pub-id>, PMID: <pub-id pub-id-type="pmid">27443971</pub-id></citation></ref>
<ref id="ref55"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Thompson</surname> <given-names>D. F.</given-names></name> <name><surname>Meltzer</surname> <given-names>L.</given-names></name></person-group> (<year>1964</year>). <article-title>Communication of emotional intent by facial expression</article-title>. <source>J. Abnorm. Soc. Psychol.</source> <volume>68</volume>, <fpage>129</fpage>&#x2013;<lpage>135</lpage>. doi: <pub-id pub-id-type="doi">10.1037/h0044598</pub-id>, PMID: <pub-id pub-id-type="pmid">14117963</pub-id></citation></ref>
<ref id="ref56"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Todorov</surname> <given-names>A.</given-names></name> <name><surname>Oh</surname> <given-names>D.</given-names></name></person-group> (<year>2021</year>). <article-title>The structure and perceptual basis of social judgments from faces</article-title>. <source>Adv. Exp. Soc. Psychol.</source> <volume>63</volume>, <fpage>189</fpage>&#x2013;<lpage>245</lpage>. doi: <pub-id pub-id-type="doi">10.1016/bs.aesp.2020.11.004</pub-id></citation></ref>
<ref id="ref57"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Todorov</surname> <given-names>A.</given-names></name> <name><surname>Olivola</surname> <given-names>C. Y.</given-names></name> <name><surname>Dotsch</surname> <given-names>R.</given-names></name> <name><surname>Mende-Siedlecki</surname> <given-names>P.</given-names></name></person-group> (<year>2015</year>). <article-title>Social attributions from faces: determinants, consequences, accuracy, and functional significance</article-title>. <source>Annu. Rev. Psychol.</source> <volume>66</volume>, <fpage>519</fpage>&#x2013;<lpage>545</lpage>. doi: <pub-id pub-id-type="doi">10.1146/annurev-psych-113011-143831</pub-id>, PMID: <pub-id pub-id-type="pmid">25196277</pub-id></citation></ref>
<ref id="ref58"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tsai</surname> <given-names>J. L.</given-names></name> <name><surname>Ang</surname> <given-names>J. Y. Z.</given-names></name> <name><surname>Blevins</surname> <given-names>E.</given-names></name> <name><surname>Goernandt</surname> <given-names>J.</given-names></name> <name><surname>Fung</surname> <given-names>H. H.</given-names></name> <name><surname>Jiang</surname> <given-names>D.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>Leaders&#x2019; smiles reflect cultural differences in ideal affect</article-title>. <source>Emotion</source> <volume>16</volume>, <fpage>183</fpage>&#x2013;<lpage>195</lpage>. doi: <pub-id pub-id-type="doi">10.1037/emo0000133</pub-id>, PMID: <pub-id pub-id-type="pmid">26751631</pub-id></citation></ref>
<ref id="ref59"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ueda</surname> <given-names>Y.</given-names></name> <name><surname>Yoshikawa</surname> <given-names>S.</given-names></name></person-group> (<year>2018</year>). <article-title>Beyond personality traits: which facial expressions imply dominance in two-person interaction scenes?</article-title> <source>Emotion</source> <volume>18</volume>, <fpage>872</fpage>&#x2013;<lpage>885</lpage>. doi: <pub-id pub-id-type="doi">10.1037/emo0000286</pub-id>, PMID: <pub-id pub-id-type="pmid">28872339</pub-id></citation></ref>
<ref id="ref60"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Valentine</surname> <given-names>K. A.</given-names></name> <name><surname>Li</surname> <given-names>N. P.</given-names></name> <name><surname>Penke</surname> <given-names>L.</given-names></name> <name><surname>Perrett</surname> <given-names>D. I.</given-names></name></person-group> (<year>2014</year>). <article-title>Judging a man by the width of his face: the role of facial ratios and dominance in mate choice at speed-dating events</article-title>. <source>Psychol. Sci.</source> <volume>25</volume>, <fpage>806</fpage>&#x2013;<lpage>811</lpage>. doi: <pub-id pub-id-type="doi">10.1177/0956797613511823</pub-id>, PMID: <pub-id pub-id-type="pmid">24458269</pub-id></citation></ref>
<ref id="ref61"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Walker</surname> <given-names>M.</given-names></name> <name><surname>Jiang</surname> <given-names>F.</given-names></name> <name><surname>Vetter</surname> <given-names>T.</given-names></name> <name><surname>Sczesny</surname> <given-names>S.</given-names></name></person-group> (<year>2011</year>). <article-title>Universals and cultural differences in forming personality trait judgments from faces</article-title>. <source>Soc. Psychol. Personal. Sci.</source> <volume>2</volume>, <fpage>609</fpage>&#x2013;<lpage>617</lpage>. doi: <pub-id pub-id-type="doi">10.1177/1948550611402519</pub-id></citation></ref>
<ref id="ref62"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>D. F.</given-names></name> <name><surname>Cui</surname> <given-names>H.</given-names></name></person-group> (<year>2003</year>). <article-title>Processes and preliminary results in the construction of the Chinese personality scale (QZPS)</article-title>. <source>Acta Psychol. Sin.</source> <volume>35</volume>, <fpage>127</fpage>&#x2013;<lpage>136</lpage>.</citation></ref>
<ref id="ref63"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>H.</given-names></name> <name><surname>Han</surname> <given-names>C.</given-names></name> <name><surname>Hahn</surname> <given-names>A. C.</given-names></name> <name><surname>Fasolt</surname> <given-names>V.</given-names></name> <name><surname>Morrison</surname> <given-names>D. K.</given-names></name> <name><surname>Holzleitner</surname> <given-names>I. J.</given-names></name> <etal/></person-group>. (<year>2019</year>). <article-title>A data-driven study of Chinese participants&#x2019; social judgments of Chinese faces</article-title>. <source>PLoS One</source> <volume>14</volume>:<fpage>e0210315</fpage>. doi: <pub-id pub-id-type="doi">10.1371/journal.pone.0210315</pub-id>, PMID: <pub-id pub-id-type="pmid">30608990</pub-id></citation></ref>
<ref id="ref64"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>Z.</given-names></name> <name><surname>He</surname> <given-names>X.</given-names></name> <name><surname>Liu</surname> <given-names>F.</given-names></name></person-group> (<year>2015</year>). <article-title>Examining the effect of smile intensity on age perceptions</article-title>. <source>Psychol. Rep.</source> <volume>117</volume>, <fpage>188</fpage>&#x2013;<lpage>205</lpage>. doi: <pub-id pub-id-type="doi">10.2466/07.PR0.117c10z7</pub-id>, PMID: <pub-id pub-id-type="pmid">26107108</pub-id></citation></ref>
<ref id="ref65"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>Z.</given-names></name> <name><surname>Mao</surname> <given-names>H.</given-names></name> <name><surname>Li</surname> <given-names>Y. J.</given-names></name> <name><surname>Liu</surname> <given-names>F.</given-names></name></person-group> (<year>2017</year>). <article-title>Smile big or not? Effects of smile intensity on perceptions of warmth and competence</article-title>. <source>J. Consum. Res.</source> <volume>43</volume>, <fpage>787</fpage>&#x2013;<lpage>805</lpage>. doi: <pub-id pub-id-type="doi">10.1093/jcr/ucw062</pub-id></citation></ref>
<ref id="ref66"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Westerman</surname> <given-names>D. L.</given-names></name> <name><surname>Lanska</surname> <given-names>M.</given-names></name> <name><surname>Olds</surname> <given-names>J. M.</given-names></name></person-group> (<year>2015</year>). <article-title>The effect of processing fluency on impressions of familiarity and liking</article-title>. <source>J. Exp. Psychol. Learn. Mem. Cogn.</source> <volume>41</volume>, <fpage>426</fpage>&#x2013;<lpage>438</lpage>. doi: <pub-id pub-id-type="doi">10.1037/a0038356</pub-id>, PMID: <pub-id pub-id-type="pmid">25528088</pub-id></citation></ref>
<ref id="ref67"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Willis</surname> <given-names>J.</given-names></name> <name><surname>Todorov</surname> <given-names>A.</given-names></name></person-group> (<year>2006</year>). <article-title>First impressions: making up your mind after a 100-ms exposure to a face</article-title>. <source>Psychol. Sci.</source> <volume>17</volume>, <fpage>592</fpage>&#x2013;<lpage>598</lpage>. doi: <pub-id pub-id-type="doi">10.1111/j.1467-9280.2006.01750.x</pub-id>, PMID: <pub-id pub-id-type="pmid">16866745</pub-id></citation></ref>
<ref id="ref68"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wilson</surname> <given-names>J. P.</given-names></name> <name><surname>Rule</surname> <given-names>N. O.</given-names></name></person-group> (<year>2015</year>). <article-title>Facial trustworthiness predicts extreme criminal-sentencing outcomes</article-title>. <source>Psychol. Sci.</source> <volume>26</volume>, <fpage>1325</fpage>&#x2013;<lpage>1331</lpage>. doi: <pub-id pub-id-type="doi">10.1177/0956797615590992</pub-id>, PMID: <pub-id pub-id-type="pmid">26162847</pub-id></citation></ref>
<ref id="ref69"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wong</surname> <given-names>S. H. W.</given-names></name> <name><surname>Zeng</surname> <given-names>Y.</given-names></name></person-group> (<year>2017</year>). <article-title>Do inferences of competence from faces predict political selection in authoritarian regimes? Evidence from China</article-title>. <source>Soc. Sci. Res.</source> <volume>66</volume>, <fpage>248</fpage>&#x2013;<lpage>263</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.ssresearch.2016.11.002</pub-id>, PMID: <pub-id pub-id-type="pmid">28705360</pub-id></citation></ref>
<ref id="ref70"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wu</surname> <given-names>Q.</given-names></name> <name><surname>Liu</surname> <given-names>Y.</given-names></name> <name><surname>Li</surname> <given-names>D.</given-names></name> <name><surname>Leng</surname> <given-names>H. Z.</given-names></name> <name><surname>Jiang</surname> <given-names>Z. Q.</given-names></name></person-group> (<year>2020</year>). <article-title>Dimensions of personality perception from Chinese face</article-title>. <source>Psychol. Explor.</source> <volume>40</volume>, <fpage>177</fpage>&#x2013;<lpage>182</lpage>.</citation></ref>
<ref id="ref71"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>J.</given-names></name> <name><surname>Tian</surname> <given-names>L. M.</given-names></name> <name><surname>Zhang</surname> <given-names>W. X.</given-names></name></person-group> (<year>2012</year>). <article-title>Social competence: concepts and theoretical models</article-title>. <source>Adv. Psychol. Sci.</source> <volume>20</volume>, <fpage>1991</fpage>&#x2013;<lpage>2000</lpage>. doi: <pub-id pub-id-type="doi">10.3724/SP.J.1042.2013.01991</pub-id></citation></ref>
<ref id="ref72"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>Q.</given-names></name> <name><surname>Wang</surname> <given-names>M. F.</given-names></name></person-group> (<year>2011</year>). <article-title>A review of the fundamental dimension of social judgment content</article-title>. <source>Psychol. Sci.</source> <volume>24</volume>, <fpage>127</fpage>&#x2013;<lpage>132</lpage>.</citation></ref>
</ref-list>
<fn-group>
<fn fn-type="financial-disclosure"><p><bold>Funding.</bold> The authors gratefully acknowledge the financial support of the 2019 Humanities and Social Sciences Research Project of the Ministry of Education (19YJA850014) and the Key Project of the Liaoning Provincial Department of Education (LZ2020001).</p></fn>
</fn-group>
</back>
</article>