<?xml version="1.0" encoding="utf-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="research-article" dtd-version="2.3">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Psychol.</journal-id>
<journal-title>Frontiers in Psychology</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Psychol.</abbrev-journal-title>
<issn pub-type="epub">1664-1078</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpsyg.2022.895929</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Psychology</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>The Impact of Emotional States on Construction Workers&#x2019; Recognition Ability of Safety Hazards Based on Social Cognitive Neuroscience</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Chong</surname>
<given-names>Dan</given-names>
</name>
<xref rid="aff1" ref-type="aff"><sup>1</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1439923/overview"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Yu</surname>
<given-names>Anni</given-names>
</name>
<xref rid="aff1" ref-type="aff"><sup>1</sup></xref>
<xref rid="c001" ref-type="corresp"><sup>&#x002A;</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1719968/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Su</surname>
<given-names>Hao</given-names>
</name>
<xref rid="aff1" ref-type="aff"><sup>1</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1838886/overview"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Zhou</surname>
<given-names>Yue</given-names>
</name>
<xref rid="aff2" ref-type="aff"><sup>2</sup></xref>
<uri xlink:href="https://loop.frontiersin.org/people/1839262/overview"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Management Science and Engineering, Shanghai University</institution>, <addr-line>Shanghai</addr-line>, <country>China</country>
</aff>
<aff id="aff2"><sup>2</sup><institution>Shanghai Urban Construction Road Engineering Co., Ltd, Shanghai Road &#x0026; Bridge (Group) Co., Ltd</institution>, <addr-line>Shanghai</addr-line>, <country>China</country>
</aff>
<author-notes>
<fn id="fn0001" fn-type="edited-by">
<p>Edited by: Jiayu Chen, City University of Hong Kong, Hong Kong SAR, China</p>
</fn>
<fn id="fn0002" fn-type="edited-by">
<p>Reviewed by: Chaojie Fan, City University of Hong Kong, Hong Kong SAR, China; Guangchong Chen, City University of Hong Kong, Hong Kong SAR, China</p>
</fn>
<corresp id="c001">&#x002A;Correspondence: Anni Yu, <email>15705732971@163.com</email></corresp>
<fn id="fn0003" fn-type="other">
<p>This article was submitted to Emotion Science, a section of the journal Frontiers in Psychology</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>16</day>
<month>06</month>
<year>2022</year>
</pub-date>
<pub-date pub-type="collection">
<year>2022</year>
</pub-date>
<volume>13</volume>
<elocation-id>895929</elocation-id>
<history>
<date date-type="received">
<day>15</day>
<month>03</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>18</day>
<month>05</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x00A9; 2022 Chong, Yu, Su and Zhou.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Chong, Yu, Su and Zhou</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<abstract>
<p>The construction industry is one of the most dangerous industries, with high accident and mortality rates accompanied by a series of safety management issues that need to be tackled urgently. The unsafe behavior of construction workers is a critical reason for the high incidence of safety accidents. Affective Events Theory suggests that individual emotional states interfere with individual decisions and behaviors, which means that emotional states can significantly influence construction workers&#x2019; unsafe behaviors. Because of the complexity of the construction site environment and managers&#x2019; lack of attention to construction workers&#x2019; emotions, serious latent emotional problems develop, leaving construction workers unable to recognize safety hazards effectively and thus leading to safety accidents. Consequently, this study designed a behavioral experiment with E-Prime software based on social cognitive neuroscience theories. The galvanic skin response signals of 40 construction workers were collected by a wearable device (HKR-11C+), and the data were classified into different emotional states with a support vector machine (SVM) algorithm. Variance analysis, correlation analysis and regression analysis were used to analyze the influence of emotional states on construction workers&#x2019; recognition ability of safety hazards. The findings indicate that the SVM algorithm could effectively classify galvanic skin response data. Construction workers&#x2019; reaction time to safety hazards was negatively correlated with emotional valence, while the accuracy of safety hazard recognition and the perception level of safety hazards each had an inverted &#x201C;U&#x201D;-shaped relationship with emotional valence. For construction workers with more than 20&#x2009;years of working experience, work experience could effectively reduce the influence of emotional fluctuations on the accuracy of safety hazard identification. This study contributes to the application of physiological measurement techniques in construction safety management and sheds light on improving the theoretical system of safety management.</p>
</abstract>
<kwd-group>
<kwd>safety management</kwd>
<kwd>construction workers</kwd>
<kwd>emotional states</kwd>
<kwd>safety hazards</kwd>
<kwd>galvanic skin response</kwd>
</kwd-group>
<contract-num rid="cn1">71901139</contract-num>
<contract-num rid="cn2">19DZ1204203</contract-num>
<contract-num rid="cn2">21692195100</contract-num>
<contract-sponsor id="cn1">National Natural Science Foundation of China<named-content content-type="fundref-id">10.13039/501100001809</named-content>
</contract-sponsor>
<contract-sponsor id="cn2">Science and Technology Commission of Shanghai Municipality<named-content content-type="fundref-id">10.13039/501100003399</named-content>
</contract-sponsor>
<counts>
<fig-count count="11"/>
<table-count count="13"/>
<equation-count count="3"/>
<ref-count count="62"/>
<page-count count="16"/>
<word-count count="9447"/>
</counts>
</article-meta>
</front>
<body>
<sec id="sec1" sec-type="intro">
<title>Introduction</title>
<p>With a large number of employees, the construction industry is a typical labor-intensive industry worldwide. Every year, over 60,000 work-related fatalities are reported from construction workplaces around the world (<xref ref-type="bibr" rid="ref28">Lingard, 2013</xref>). The construction industry is a pillar industry in China: from 2000 to 2020, the number of people employed in the industry increased from 19.94 million to 53.67 million, and the industry generated an annual benefit of 729.96 billion RMB in 2020, an increase of 3.5% over the previous year (<xref ref-type="bibr" rid="ref002">China National Bureau of Statistics, 2020</xref>). However, the occupational safety of construction workers is not guaranteed, and safety accidents occur frequently in the industry. The total number of safety accidents in the construction industry has remained high over the years (<xref ref-type="bibr" rid="ref001">China Construction Industry Association, 2020</xref>). According to the Ministry of Housing and Urban&#x2013;Rural Development of China, 773 production safety accidents occurred in housing and municipal engineering in China in 2019, with 904 deaths, an increase of 39 accidents and 64 fatalities over 2018, up 5.31 and 7.62%, respectively (<xref ref-type="bibr" rid="ref005">Ministry of Housing and Urban&#x2013;Rural Development of China, 2019</xref>). In other countries, construction casualty rates are also disproportionately high compared to the number of people employed (<xref ref-type="bibr" rid="ref2">Al-Bayati et al., 2019</xref>; <xref ref-type="bibr" rid="ref30">Liu et al., 2020</xref>). Overall, the construction industry continues to experience a disproportionate share of work-related injuries and illnesses, which contributes significantly to work-related fatalities (<xref ref-type="bibr" rid="ref37">Pandit et al., 2019</xref>). With the frequent occurrence of safety accidents in the industry, the focus of safety management research has gradually shifted from the physical environment to human factors. Currently, most construction workers in China are migrant workers, who tend to take risks and trust to luck. The coarse management mode in the construction industry fails to address the psychological needs of employees and leaves serious hidden mental health problems.</p>
<p>Accident Causation Theory suggests that human factors are the main cause of frequent safety accidents (<xref ref-type="bibr" rid="ref42">Salminen and Tallberg, 1996</xref>). Accurate identification and assessment of the potential consequences of construction site safety hazards by construction workers is an important prerequisite for safety management (<xref ref-type="bibr" rid="ref8">Farmer and Chambers, 1929</xref>). The better construction workers can identify and assess the visible or potential safety hazards on construction sites, the smaller the chances of their unsafe behaviors occurring (<xref ref-type="bibr" rid="ref38">Perlman et al., 2014</xref>; <xref ref-type="bibr" rid="ref36">Namian et al., 2016</xref>). Through a statistical analysis of the causes of 75,000 injuries and fatalities in enterprises, the American safety engineer Heinrich concluded that more than 88% of safety accidents were caused by unsafe human behavior (<xref ref-type="bibr" rid="ref17">Heinrich, 1941</xref>). Analyses of safety accident surveys in the construction industry also showed that unsafe behavior of construction workers was a common cause of safety accidents (<xref ref-type="bibr" rid="ref16">Haslam et al., 2005</xref>). The main factors affecting unsafe behavior of construction workers can be classified into three aspects: individual factors, organizational factors and environmental factors (<xref ref-type="bibr" rid="ref1">Abdelhamid and Everett, 2000</xref>; <xref ref-type="bibr" rid="ref54">Zhou et al., 2008</xref>). Individual influences include psychological factors, physiological factors and the physical quality of the worker, which lead to unsafe behavior under single or multiple factors (<xref ref-type="bibr" rid="ref3">Alizadeh et al., 2015</xref>; <xref ref-type="bibr" rid="ref36">Namian et al., 2016</xref>). An individual&#x2019;s safety behavior is affected by emotional state: Affective Events Theory (AET) suggests that employees&#x2019; behavior and performance at work are largely determined by the changes in their emotions at each moment rather than by their attitudes or personalities (<xref ref-type="bibr" rid="ref49">Weiss and Cropanzano, 1996</xref>). AET has been demonstrated effectively in areas such as mine worker safety behavior (<xref ref-type="bibr" rid="ref50">Yang et al., 2020</xref>) and driver safety (<xref ref-type="bibr" rid="ref35">Muller et al., 2014</xref>). Kajiwara et al. verified that emotions can influence the productivity and accuracy of workers in a logistics picking system (<xref ref-type="bibr" rid="ref008">Kajiwara et al., 2019</xref>). Manzoor and Treur developed an agent-based computational social agent model to explore how decisions can be affected by regulating the emotions involved, and how emotions are affected by emotion regulation and contagion (<xref ref-type="bibr" rid="ref34">Manzoor and Treur, 2015</xref>). However, people do not always think rationally when they act, and they often make irrational choices or decisions when they are &#x201C;emotionally driven.&#x201D; Therefore, employees&#x2019; emotional states and the long-term emotions accumulated from emotional fragments can interfere with individual decisions and behaviors (<xref ref-type="bibr" rid="ref20">Rachlin, 2003</xref>).</p>
<p>Psychological factors are considered the most important contributors to construction workers&#x2019; unsafe behavior, and psychological activities are influenced by individual emotions. Ekman divided emotions into six basic categories: sadness, happiness, anger, disgust, fear and surprise (<xref ref-type="bibr" rid="ref7">Ekman, 1992b</xref>). Zelenski classified all emotional states into positive and negative emotions (<xref ref-type="bibr" rid="ref51">Zelenski et al., 2012</xref>). Emotions can be measured in three dimensions: personal physiological changes, subjective feelings and external expressions (<xref ref-type="bibr" rid="ref24">Kim et al., 2013</xref>). When an emotion occurs, it increases heart rate, dopamine secretion or brain activity. These changes are reflected in physiological signals, such as galvanic skin response, blood pressure, respiration amplitude and brain waves, which can be collected by wearable devices in real time (<xref ref-type="bibr" rid="ref6">Dzedzickis et al., 2020</xref>). Galvanic skin response is a physiological signal highly relevant to individual emotions. Using four-channel biosensors to measure electromyogram, electrocardiogram, skin conductivity and respiration changes, together with an extended linear discriminant analysis (pLDA), Kim and Andr&#x00E9; developed a novel scheme of emotion-specific multilevel dichotomous classification (EMDC) with an accuracy of 95% (<xref ref-type="bibr" rid="ref23">Kim and Andr&#x00E9;, 2008</xref>). <xref ref-type="bibr" rid="ref53">Zhao et al. (2018)</xref> used a sensor-enriched wearable wristband to measure three physiological signals, namely blood volume pulse, electrodermal activity and skin temperature, and classified emotions into four types in terms of arousal and valence. Zhang chose four physiological signals, namely photoplethysmography, galvanic skin response, respiration amplitude and skin temperature, and a Recursive Feature Elimination-Correlation Bias Reduction-Support Vector Machine (SVM-RFE-CBR) algorithm was used for classification (<xref ref-type="bibr" rid="ref4">Chen et al., 2020</xref>). Izard et al. determined an individual&#x2019;s emotions by questionnaire (<xref ref-type="bibr" rid="ref5">Dougherty et al., 1974</xref>). Watson proposed the positive and negative emotion scale, in which ten adjectives are applied to express individual emotions at work; the results reflect the individual&#x2019;s accumulated emotions and emotional experience (<xref ref-type="bibr" rid="ref48">Watson et al., 1988</xref>). This scale describes emotions effectively owing to its simplicity and interpretability. External expressions refer to externally observable changes under a stimulus, such as changes in facial expression, tone of voice and behavior. Nevertheless, this method lacks objectivity because the external performance of individuals can be hidden or disguised (<xref ref-type="bibr" rid="ref003">Ekman, 1992a</xref>).</p>
<p>The affective generalization theory (<xref ref-type="bibr" rid="ref007">Johnson and Tversky, 1983</xref>) suggests that emotions irrelevant to the decision-making task affect people&#x2019;s judgments about the probability of events with the same emotional valence. Specifically, positive emotions reduce the subjective estimate of risky events, so people in a positive mood are prone to perform risky behaviors, whereas people in negative emotions overestimate risk and tend to avoid risky behaviors. In contrast, the Mood Maintenance Hypothesis (<xref ref-type="bibr" rid="ref18">Isen and Patrick, 1983</xref>) is the other classic theory of emotional valence. It refers to people&#x2019;s tendency to maintain positive mood states and implies that a positive mood is associated with less critical thinking and reduced information processing; people in a positive mood are therefore prone to reduce their estimates of risk events yet less likely to take risky behavior that could threaten the mood. Positive emotions promote mental activity and thus can enable individuals to maintain a higher level of concentration (<xref ref-type="bibr" rid="ref40">Pool et al., 2016</xref>). Fredrickson found a U-shaped relationship between positive emotions and unsafe behaviors (<xref ref-type="bibr" rid="ref11">Fredrickson and Branigan, 2005</xref>), while negative emotions can decrease an individual&#x2019;s attention, responsiveness and reasoning abilities (<xref ref-type="bibr" rid="ref32">MacLeod et al., 1986</xref>). The more intense and emotional workers are, the more likely they are to commit intentional violations, leading to safety accidents (<xref ref-type="bibr" rid="ref41">Radenhausen and Anker, 1988</xref>; <xref ref-type="bibr" rid="ref13">Golparvar, 2016</xref>).</p>
<p>Research on emotions suggests that emotions have diverse effects on individuals&#x2019; behavior, and theoretical controversies between the Mood Maintenance Hypothesis and the affective generalization theory remain. Given the specificities of construction tasks and the construction worker population, this paper examines the effects of emotions on construction workers&#x2019; recognition of safety hazards from a safety management perspective. A wearable device (HKR-11C+) was used to collect physiological signals for emotion classification, which is more objective than the subjective questionnaires traditionally used in previous studies. In addition, this study achieves a quantitative analysis of the relationship between emotions and individual behavior through the quantification of emotional valence. This research helps construction workers regulate their own safety behaviors from an individual psychological perspective, and provides construction companies with theoretical safety management strategies that focus on individual psychology.</p>
</sec>
<sec id="sec2" sec-type="materials|methods">
<title>Materials and Methods</title>
<p>Lang&#x2019;s dimensional theory of emotion divides emotions into two basic dimensions: emotional arousal and emotional valence (<xref ref-type="bibr" rid="ref25">Lang, 1995</xref>). Emotional arousal refers to the level of activation of an individual&#x2019;s emotion in response to a stimulus, from passive to active, while emotional valence describes the level of pleasant or unpleasant experience, from negative to positive. Emotions in this study were classified by emotional valence into positive, neutral and negative emotions. Construction workers&#x2019; recognition ability of safety hazards was measured in three aspects: the reaction time to safety hazards, the identification accuracy of safety hazards, and the perception level of safety hazards.</p>
<sec id="sec3">
<title>Participants</title>
<p>Thirty students from Shanghai University majoring in construction engineering management were selected for the pilot test to confirm the feasibility and validity of the experiment. Forty construction workers from six Shanghai construction engineering enterprises were recruited for the formal experiment. Among all the subjects, 3 were operative construction workers and the other 34 were workers in supervisory positions; 3 subjects were educated below undergraduate level, 27 were undergraduates and 10 were postgraduates. All subjects had received safety management training for construction work. The studies involving human participants were reviewed and approved by the Ethics Committee of Shanghai University. Participants for the pilot and formal experiments were selected based on the following criteria: (1) familiar with the construction industry or with long-term working experience on construction sites, and familiar with the operation codes on the construction site; (2) physically and mentally healthy without any psychological disorders; (3) right-handed; and (4) provided written informed consent. The basic information of the subjects is shown in <xref rid="tab1" ref-type="table">Table 1</xref>.</p>
<table-wrap position="float" id="tab1">
<label>Table 1</label>
<caption><p>Basic information of the subjects.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th/>
<th align="left" valign="top">Category</th>
<th align="center" valign="top">Number</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top" rowspan="2">Gender</td>
<td align="left" valign="top">Male</td>
<td align="center" valign="top">37</td>
</tr>
<tr>
<td align="left" valign="top">Female</td>
<td align="center" valign="top">0</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="3">Years of working experience</td>
<td align="left" valign="top">1&#x2013;9&#x2009;years</td>
<td align="center" valign="top">12</td>
</tr>
<tr>
<td align="left" valign="top">10&#x2013;19&#x2009;years</td>
<td align="center" valign="top">13</td>
</tr>
<tr>
<td align="left" valign="top">20&#x2009;years and above</td>
<td align="center" valign="top">12</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="2">Job type</td>
<td align="left" valign="top">Operative workers</td>
<td align="center" valign="top">10</td>
</tr>
<tr>
<td align="left" valign="top">Supervisory workers</td>
<td align="center" valign="top">27</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="3">Education background</td>
<td align="left" valign="top">Below undergraduate</td>
<td align="center" valign="top">6</td>
</tr>
<tr>
<td align="left" valign="top">Undergraduate</td>
<td align="center" valign="top">21</td>
</tr>
<tr>
<td align="left" valign="top">Postgraduate</td>
<td align="center" valign="top">10</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec id="sec4">
<title>Procedures</title>
<p>The experiment was carried out in a closed conference room at a construction site, free from interference. Forty rounds of experiments were included in this study. Each round contained three parts: positive emotions, neutral emotions, and negative emotions, giving 120 samples in total. After giving informed consent, all participants had electrodes attached for the physiological measurement. The experiment guidance, which explained the purpose and procedures of the experiment, was displayed by computer. Following the instructions, pictures of the targeted emotion were shown on the screen to stimulate an empathic effect in the participant; each picture lasted 6&#x2009;s, and 15 pictures were shown each time. Subsequently, participants completed the Positive and Negative Affect Schedule (PANAS) (<xref ref-type="bibr" rid="ref48">Watson et al., 1988</xref>), followed by the identification and assessment of safety hazards in construction pictures. The experimental procedure is shown in <xref rid="fig1" ref-type="fig">Figure 1</xref>.</p>
<fig position="float" id="fig1">
<label>Figure 1</label>
<caption><p>Procedures of the experiment.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g001.tif"/>
</fig>
</sec>
<sec id="sec5">
<title>Individual Emotional Arousal</title>
<p>Arousing individual emotions with picture stimuli is one of the most common forms of emotional stimulation (<xref ref-type="bibr" rid="ref12">Gerdes et al., 2014</xref>). The International Affective Picture System (IAPS) database (<xref ref-type="bibr" rid="ref26">Lang et al., 1988</xref>) was used to evoke different emotional states in the construction workers. Thirty positive, thirty neutral and thirty negative pictures were selected from the IAPS as emotional arousal stimulus material. Researchers have found that incidental emotions pervasively carry over from one situation to the next, affecting decisions that are unrelated to those emotions, a phenomenon known as the carryover of incidental emotion (<xref ref-type="bibr" rid="ref31">Loewenstein and Lerner, 2003</xref>; <xref ref-type="bibr" rid="ref27">Lerner and Tiedens, 2006</xref>; <xref ref-type="bibr" rid="ref22">Keltner and Lerner, 2010</xref>). In the IAPS database, valence values indicate the level of enjoyment and arousal values indicate the level of excitement. The positive pictures selected in this paper include life scenes, animal activity and babies. The neutral pictures include static objects, abstract artwork and natural environments. The negative pictures include catastrophic events, violent and brutal scenes, and disabled individuals. Images from the IAPS database cannot be displayed here because of a confidentiality agreement. The mean valence values of the negative, neutral and positive pictures were 1.78, 4.92 and 7.83, and the mean arousal values were 6.36, 3.37 and 5.14, respectively. The mean squared deviation of the pictures was less than 2.4, ensuring the validity of the pictures (<xref ref-type="bibr" rid="ref52">Zhang et al., 2018</xref>). The subjects were randomly shown one set of emotional pictures, and the level of emotional arousal was evaluated using the PANAS emotional self-rating scale; subjects whose emotions were not aroused were excluded from the experimental results. The emotional scale is shown in <xref rid="tab2" ref-type="table">Table 2</xref>.</p>
<table-wrap position="float" id="tab2">
<label>Table 2</label>
<caption><p>The emotional state after emotion arousal.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top">1</th>
<th align="center" valign="top">2</th>
<th align="center" valign="top">3</th>
<th align="center" valign="top">4</th>
<th align="center" valign="top">5</th>
<th align="center" valign="top">6</th>
<th align="center" valign="top">7</th>
<th align="center" valign="top">8</th>
<th align="center" valign="top">9</th>
</tr>
</thead>
<tbody>
<tr>
<td align="center" valign="top">Very Negative</td>
<td/>
<td align="center" valign="top">Negative</td>
<td/>
<td align="center" valign="top">Neutral</td>
<td/>
<td align="center" valign="top">Positive</td>
<td/>
<td align="center" valign="top">Very Positive</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>After the 15 emotion pictures were displayed, the participants gave a self-evaluation of their emotions by filling in an emotion scale adapted from the PANAS, with scores ranging from 1 to 9. Higher scores indicate stronger positive emotions, while lower scores indicate stronger negative emotions. The emotional scale is shown in <xref rid="tab2" ref-type="table">Table 2</xref>.</p>
<table-wrap position="float" id="tab3">
<label>Table 3</label>
<caption><p>Expected frequency and severity of safety hazard.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="middle">Severity/Frequency</th>
<th align="center" valign="middle">Very common</th>
<th align="center" valign="middle">Common</th>
<th align="center" valign="middle">Uncommon</th>
<th align="center" valign="middle">Very uncommon</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Negligible</td>
<td align="center" valign="top">0.19</td>
<td align="center" valign="top">0.04</td>
<td align="center" valign="top">0.00375</td>
<td align="center" valign="top">0.000375</td>
</tr>
<tr>
<td align="left" valign="top">Emergency aid</td>
<td align="center" valign="top">1.13</td>
<td align="center" valign="top">0.27</td>
<td align="center" valign="top">0.0226</td>
<td align="center" valign="top">0.00226</td>
</tr>
<tr>
<td align="left" valign="top">Seek medical advice</td>
<td align="center" valign="top">3.2</td>
<td align="center" valign="top">0.77</td>
<td align="center" valign="top">0.064</td>
<td align="center" valign="top">0.0064</td>
</tr>
<tr>
<td align="left" valign="top">Hospitalization</td>
<td align="center" valign="top">6.4</td>
<td align="center" valign="top">1.53</td>
<td align="center" valign="top">0.128</td>
<td align="center" valign="top">0.0128</td>
</tr>
<tr>
<td align="left" valign="top">Permanent disablement or fatality</td>
<td align="center" valign="top">340.48</td>
<td align="center" valign="top">81.55</td>
<td align="center" valign="top">6.81</td>
<td align="center" valign="top">0.681</td>
</tr>
<tr>
<td align="left" valign="top">No risk</td>
<td align="center" valign="top">0</td>
<td align="center" valign="top">0</td>
<td align="center" valign="top">0</td>
<td align="center" valign="top">0</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec id="sec6">
<title>Galvanic Skin Response Measurement</title>
<p>After the skin surface was cleaned, the sensor was attached to the participant&#x2019;s finger, with the electrodes fixed around the index and middle fingers. The galvanic skin response data were recorded and sent to a computer. The GSR equipment and the electrode attachment location are shown in <xref rid="fig2" ref-type="fig">Figure 2</xref>. The galvanic skin response 5&#x2013;6&#x2009;s after emotional arousal was used for identifying and classifying emotions. The experiment was carried out in a laboratory at a room temperature of 22&#x00B0;C; the subjects sat still in front of a computer for the galvanic skin response measurement, with the temperature and humidity remaining constant throughout the experiment. The signal processing procedures are elaborated in Section 3.</p>
<fig position="float" id="fig2">
<label>Figure 2</label>
<caption><p>GSR equipment and attachment location.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g002.tif"/>
</fig>
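<p>To make the windowing step above concrete, the following is a minimal Python sketch of slicing the 5&#x2013;6&#x2009;s post-stimulus segment from a sampled GSR recording. The sampling rate (50&#x2009;Hz here) and the function name are illustrative assumptions, not details reported for the HKR-11C+ device.</p>
<preformat>
```python
import numpy as np

def extract_window(gsr, fs, start_s=5.0, end_s=6.0):
    """Slice the 5-6 s post-stimulus segment from a GSR recording.

    gsr : 1-D array of skin-conductance samples
    fs  : sampling rate in Hz (device-dependent; assumed known)
    """
    i0, i1 = int(start_s * fs), int(end_s * fs)
    return np.asarray(gsr)[i0:i1]

# Illustration with a synthetic 10 s recording sampled at an assumed 50 Hz
fs = 50
signal = np.random.default_rng(0).normal(size=10 * fs)
window = extract_window(signal, fs)
print(window.shape)  # (50,)
```
</preformat>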
</sec>
<sec id="sec7">
<title>The Measurement of the Recognition Ability of Safety Hazards</title>
<p>The recognition of safety hazards was measured by identifying safety hazards in construction site pictures. Images of construction sites containing five types of safety hazards were collected from 12 construction sites in Shanghai, China. These images were obtained from safety incident reports, and the opinions of 10 experts were collected to evaluate the selected images on two dimensions: (1) whether the images visually represented the safety hazards of that type of construction work, and (2) whether the hazards shown were prevalent on construction sites. After deleting the lower-scoring images, 120 images were retained. The images were displayed randomly by category, with 16 images presented on screen as one set for the subjects to evaluate. The distribution of the 16 images by category was: 7 falls from height, 2 electric shock and fire, 2 object strikes, 1 collapse hazard, 1 mechanical injury and 3 no-hazard images. To avoid a learning effect, each picture was presented only once at random, with no repeated presentations across all sets for one participant. Examples of the chosen images are shown in <xref rid="fig3" ref-type="fig">Figure 3</xref>.</p>
<fig position="float" id="fig3">
<label>Figure 3</label>
<caption><p>Examples of safety hazards pictures.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g003.tif"/>
</fig>
<p>The cognitive level of safety hazards was measured by a behavioral experiment. The participants were requested to view the construction site images and determine whether the pictures contained safety hazards (<xref ref-type="bibr" rid="ref46">Tixier et al., 2014</xref>). The participants pressed &#x201C;1&#x201D; on the keyboard if they thought there were safety hazards, and &#x201C;0&#x201D; if not. The computer automatically recorded the time taken to identify safety hazards and the accuracy of the safety hazard assessments. The perception level of safety hazards of construction workers was measured with a safety hazard perception assessment form, completed whenever a worker judged that the given picture contained a safety hazard. Hallowell pioneered the use of this form by quantifying safety hazard perception as the product of the expected frequency and severity of injury (<xref ref-type="bibr" rid="ref15">Hallowell, 2010</xref>). The corresponding perception-level scores are shown in <xref rid="tab3" ref-type="table">Table 3</xref>.</p>
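The Hallowell-style score is simply the product of two ratings; the sketch below is illustrative, and the rating values are placeholders for whatever scale the paper's Table 3 assigns.

```python
def hazard_perception_score(frequency, severity):
    """Hallowell-style risk perception: the product of the expected
    injury-frequency rating and the injury-severity rating.
    The rating scales here are placeholders; the paper's actual
    scale values are listed in its Table 3."""
    return frequency * severity

# e.g. a hazard judged frequent (rating 5) and severe (rating 4)
score = hazard_perception_score(5, 4)  # product of the two ratings
```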
</sec>
<sec id="sec8">
<title>Statistical Analysis</title>
<p>Forty subjects participated in this study; three who failed to achieve emotional arousal were excluded, leaving 37 subjects. With 37 sets of experiments, each containing 3 categories of emotional stimuli and 16 safety hazard pictures per category, a total of 1,776 data points were collected. After removing invalid questionnaires, 1,650 valid data points remained, a validity rate of 92.9%.</p>
<p>The collected data were smoothed and filtered with a median filter and a third-order Butterworth low-pass filter. A cut-off frequency of 0.3&#x2009;Hz was used to filter out abnormal signals, and baseline interference was removed from the signals to reduce the influence of the measurement instrument itself and of the current and voltage. Both time-domain signal features and descriptive features were extracted from the filtered galvanic skin response signals and normalized; principal component analysis was then applied to reduce the dimensionality of the acquired features and obtain the signal features for classification.</p>
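The filtering pipeline described above can be sketched with SciPy; this is a minimal illustration under stated assumptions (the sampling rate, median-kernel width, and the use of linear detrending for baseline removal are not reported in the paper), not the authors' implementation.

```python
import numpy as np
from scipy.signal import medfilt, butter, filtfilt, detrend

def preprocess_gsr(raw, fs, cutoff=0.3, order=3, kernel=5):
    """Smooth and low-pass a raw GSR trace as the text describes:
    median filter, then 3rd-order Butterworth low-pass at 0.3 Hz,
    then baseline removal.  fs, kernel width and the detrending
    method are assumptions, not values from the paper."""
    x = medfilt(raw, kernel_size=kernel)              # suppress spike artifacts
    b, a = butter(order, cutoff, btype="low", fs=fs)  # 3rd-order Butterworth
    x = filtfilt(b, a, x)                             # zero-phase low-pass
    return detrend(x)                                 # remove baseline drift

fs = 100                                   # hypothetical sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
raw = 0.5 * np.sin(2 * np.pi * 0.1 * t) + 0.05 * rng.normal(size=t.size)
clean = preprocess_gsr(raw, fs)
```

Using `filtfilt` rather than a one-pass filter avoids phase lag, which matters when features are tied to a fixed 5&#x2013;6&#x2009;s post-stimulus window.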
<p>The reaction time to safety hazards, the accuracy of safety hazard identification, and the perception level of safety hazards of construction workers in different emotional states were analyzed. One-way ANOVA was used to investigate differences in the recognition ability of safety hazards among workers of different age groups under different emotional states. Pearson correlation and regression analyses were used to quantify the effects of emotions on the reaction time to safety hazards, the accuracy of safety hazard identification, and the perception level of safety hazards, respectively.</p>
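The two statistical tests named above are standard and can be sketched with SciPy; the group data below are synthetic stand-ins (means loosely echoing the paper's reaction-time tables), not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# hypothetical reaction times (s) for three working-age groups
g1 = rng.normal(7.7, 1.4, 30)   # 0-9 years
g2 = rng.normal(8.0, 1.7, 30)   # 10-19 years
g3 = rng.normal(8.5, 1.6, 30)   # 20 years and above

# one-way ANOVA: do group means differ?
f_stat, p_anova = stats.f_oneway(g1, g2, g3)

# Pearson correlation between an emotion score and reaction time
emotion = rng.normal(size=90)           # placeholder emotion ratings
rt = np.concatenate([g1, g2, g3])
r, p_corr = stats.pearsonr(emotion, rt)
```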
</sec>
</sec>
<sec id="sec9">
<title>Results and Discussion</title>
<sec id="sec10">
<title>Classification of Galvanic Skin Response Signals</title>
<p>The collected data were smoothed and filtered with a median filter and a third-order Butterworth low-pass filter, with the cut-off frequency set at 0.3&#x2009;Hz to filter out abnormal signals and retain valid ones. The original waveform, the waveform after median filtering, and the waveform after low-pass filtering are shown in <xref rid="fig4" ref-type="fig">Figure 4</xref>. The obtained signals were de-baselined to reduce the effects of the measurement instrument itself and of the current and voltage, and both time-domain signal features and descriptive features were extracted from the filtered skin electrical signals; the extracted features are shown in <xref rid="tab4" ref-type="table">Table 4</xref>. Principal component analysis was used to reduce the dimensionality; the results are shown in <xref rid="tab5" ref-type="table">Table 5</xref>. Because the coefficient of the feature GSR_diff_Std is low, this feature was deleted from the subsequent classification training. The final 10 features retained after normalization were used for the subsequent Support Vector Machine classification.</p>
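For reference, the dimensionality-reduction step can be reproduced with a plain SVD-based PCA on standardized features; the feature matrix below is random illustration with the paper's shape (120 samples, 11 GSR features), not the study's measurements.

```python
import numpy as np

def pca_scores(features, n_components):
    """Toy PCA via SVD on standardized features, mirroring the
    reduction the text describes before SVM training.
    features : (n_samples, n_features) array."""
    X = (features - features.mean(axis=0)) / features.std(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T   # project onto top components

rng = np.random.default_rng(3)
X = rng.normal(size=(120, 11))       # 120 samples x 11 GSR features
Z = pca_scores(X, n_components=5)    # keep 5 principal components
```

Dropping a feature whose loadings are uniformly low (as done with GSR_diff_Std) simply removes one column before this projection.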
<fig position="float" id="fig4">
<label>Figure 4</label>
<caption><p>Diagram of the filtering process results.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g004.tif"/>
</fig>
<table-wrap position="float" id="tab4">
<label>Table 4</label>
<caption><p>Selected variables.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top">No.</th>
<th align="left" valign="top">Selected Variables</th>
<th align="left" valign="top">Codename</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">1</td>
<td align="left" valign="top">Mean value of GSR</td>
<td align="left" valign="top">GSR_Mean</td>
</tr>
<tr>
<td align="left" valign="top">2</td>
<td align="left" valign="top">Standard deviation of GSR</td>
<td align="left" valign="top">GSR_Std</td>
</tr>
<tr>
<td align="left" valign="top">3</td>
<td align="left" valign="top">Minimum value of GSR</td>
<td align="left" valign="top">GSR_Min</td>
</tr>
<tr>
<td align="left" valign="top">4</td>
<td align="left" valign="top">Maximum value of GSR</td>
<td align="left" valign="top">GSR_Max</td>
</tr>
<tr>
<td align="left" valign="top">5</td>
<td align="left" valign="top">First difference mean value of GSR</td>
<td align="left" valign="top">GSR_diff_Mean</td>
</tr>
<tr>
<td align="left" valign="top">6</td>
<td align="left" valign="top">First difference standard deviation of GSR</td>
<td align="left" valign="top">GSR_diff_Std</td>
</tr>
<tr>
<td align="left" valign="top">7</td>
<td align="left" valign="top">First difference minimum value of GSR</td>
<td align="left" valign="top">GSR_diff_Min</td>
</tr>
<tr>
<td align="left" valign="top">8</td>
<td align="left" valign="top">First difference maximum value of the of GSR</td>
<td align="left" valign="top">GSR_diff_Max</td>
</tr>
<tr>
<td align="left" valign="top">9</td>
<td align="left" valign="top">Average magnitude of GSR</td>
<td align="left" valign="top">GSR_AveMag</td>
</tr>
<tr>
<td align="left" valign="top">10</td>
<td align="left" valign="top">Average rise time of GSR</td>
<td align="left" valign="top">GSR_AveRt</td>
</tr>
<tr>
<td align="left" valign="top">11</td>
<td align="left" valign="top">Average energy of GSR</td>
<td align="left" valign="top">GSR_AveE</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap position="float" id="tab5">
<label>Table 5</label>
<caption><p>Results of principal component analysis.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th/>
<th align="center" valign="top">1</th>
<th align="center" valign="top">2</th>
<th align="center" valign="top">3</th>
<th align="center" valign="top">4</th>
<th align="center" valign="top">5</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">GSR_diff_Min</td>
<td align="center" valign="top">0.829</td>
<td align="center" valign="top">0.452</td>
<td align="center" valign="top">&#x2212;0.254</td>
<td align="center" valign="top">0.196</td>
<td align="center" valign="top">0.021</td>
</tr>
<tr>
<td align="left" valign="top">GSR_diff_Mean</td>
<td align="center" valign="top">0.826</td>
<td align="center" valign="top">0.457</td>
<td align="center" valign="top">&#x2212;0.251</td>
<td align="center" valign="top">0.200</td>
<td align="center" valign="top">0.014</td>
</tr>
<tr>
<td align="left" valign="top">GSR_diff_Max</td>
<td align="center" valign="top">0.769</td>
<td align="center" valign="top">0.526</td>
<td align="center" valign="top">&#x2212;0.275</td>
<td align="center" valign="top">0.223</td>
<td align="center" valign="top">0.017</td>
</tr>
<tr>
<td align="left" valign="top">GSR_Min</td>
<td align="center" valign="top">0.767</td>
<td align="center" valign="top">&#x2212;0.552</td>
<td align="center" valign="top">0.293</td>
<td align="center" valign="top">&#x2212;0.124</td>
<td align="center" valign="top">0.069</td>
</tr>
<tr>
<td align="left" valign="top">GSR_Max</td>
<td align="center" valign="top">0.767</td>
<td align="center" valign="top">&#x2212;0.551</td>
<td align="center" valign="top">0.293</td>
<td align="center" valign="top">&#x2212;00.124</td>
<td align="center" valign="top">0.070</td>
</tr>
<tr>
<td align="left" valign="top">GSR_Mean</td>
<td align="center" valign="top">0.766</td>
<td align="center" valign="top">&#x2212;0.552</td>
<td align="center" valign="top">0.293</td>
<td align="center" valign="top">&#x2212;0.124</td>
<td align="center" valign="top">0.069</td>
</tr>
<tr>
<td align="left" valign="top">GSR_AveRt</td>
<td align="center" valign="top">0.084</td>
<td align="center" valign="top">0.711</td>
<td align="center" valign="top">0.569</td>
<td align="center" valign="top">&#x2212;0.343</td>
<td align="center" valign="top">&#x2212;0.074</td>
</tr>
<tr>
<td align="left" valign="top">GSR_AveE</td>
<td align="center" valign="top">0.060</td>
<td align="center" valign="top">0.701</td>
<td align="center" valign="top">0.675</td>
<td align="center" valign="top">&#x2212;0.126</td>
<td align="center" valign="top">&#x2212;0.162</td>
</tr>
<tr>
<td align="left" valign="top">GSR_AveMag</td>
<td align="center" valign="top">&#x2212;0.063</td>
<td align="center" valign="top">0.087</td>
<td align="center" valign="top">0.493</td>
<td align="center" valign="top">0.732</td>
<td align="center" valign="top">&#x2212;0.209</td>
</tr>
<tr>
<td align="left" valign="top">GSR_diff_Std</td>
<td align="center" valign="top">0.152</td>
<td align="center" valign="top">0.522</td>
<td align="center" valign="top">&#x2212;0.251</td>
<td align="center" valign="top">&#x2212;0.547</td>
<td align="center" valign="top">0.090</td>
</tr>
<tr>
<td align="left" valign="top">GSR_Std</td>
<td align="center" valign="top">&#x2212;0.243</td>
<td align="center" valign="top">0.261</td>
<td align="center" valign="top">0.263</td>
<td align="center" valign="top">0.197</td>
<td align="center" valign="top">0.875</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>A total of 120 sample points were used as the data set, and a Support Vector Machine (SVM) model was applied for classification training. In a ratio of 3:1, these 120 samples were divided into a training set (90 samples containing 30 positive, 30 neutral, and 30 negative samples) and a test set (30 samples). Both classification accuracy and model validation were improved by class labelling of the 120 sample points and supervised learning of the model, which was implemented in Matlab R2016b. In the prediction experiment, the 30 test samples were validated, and the results are shown in <xref rid="fig5" ref-type="fig">Figure 5</xref>; <xref rid="tab6" ref-type="table">Table 6</xref>. The vertical coordinates 1, 2, and 3 correspond to negative, neutral, and positive samples, respectively. When calculating sensitivity, specificity, and precision, each emotion is treated as the positive class while the remaining two emotional states are treated as negative classes. The classification results indicate that the picture-triggering method in this study achieves effective emotional arousal.</p>
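The one-vs-rest metrics reported in Table 6 follow directly from a 3&#x00D7;3 confusion matrix; the sketch below shows that computation with an illustrative matrix (not the paper's classification results).

```python
import numpy as np

def one_vs_rest_metrics(cm, k):
    """Sensitivity, specificity, precision and F1 for class k of a
    3x3 confusion matrix cm[true, predicted], treating class k as
    positive and the other two emotions as negative, as the text
    describes.  The matrix used below is illustrative only."""
    tp = cm[k, k]
    fn = cm[k].sum() - tp        # class-k samples predicted as other classes
    fp = cm[:, k].sum() - tp     # other classes predicted as class k
    tn = cm.sum() - tp - fn - fp
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    prec = tp / (tp + fp)
    f1 = 2 * prec * sens / (prec + sens)
    return sens, spec, prec, f1

cm = np.array([[10, 1, 0],       # rows: true negative / neutral / positive
               [1,  9, 0],       # columns: predicted classes
               [1,  2, 6]])
sens, spec, prec, f1 = one_vs_rest_metrics(cm, k=0)
```

The overall accuracy in the table's "Integral" row is simply the trace of the matrix divided by the total sample count.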
<fig position="float" id="fig5">
<label>Figure 5</label>
<caption><p>SVM classification prediction result.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g005.tif"/>
</fig>
<table-wrap position="float" id="tab6">
<label>Table 6</label>
<caption><p>Support vector machine classification simulation training results.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="top">Category</th>
<th align="center" valign="top">Sensitivity</th>
<th align="center" valign="top">Specificity</th>
<th align="center" valign="top">Precision</th>
<th align="center" valign="top">Accuracy</th>
<th align="center" valign="top">F1</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Positive</td>
<td align="left" valign="top">77.8%</td>
<td align="left" valign="top">90.5%</td>
<td align="left" valign="top">70%</td>
<td align="center" valign="top">&#x2013;</td>
<td align="center" valign="top">0.74</td>
</tr>
<tr>
<td align="left" valign="top">Neutral</td>
<td align="center" valign="top">90%</td>
<td align="center" valign="top">85%</td>
<td align="center" valign="top">75%</td>
<td align="center" valign="top">&#x2013;</td>
<td align="center" valign="top">0.82</td>
</tr>
<tr>
<td align="left" valign="top">Negative</td>
<td align="left" valign="top">90.9%</td>
<td align="left" valign="top">84.2%</td>
<td align="left" valign="top">76.9%</td>
<td align="center" valign="top">&#x2013;</td>
<td align="center" valign="top">0.83</td>
</tr>
<tr>
<td align="left" valign="top">Integral</td>
<td align="center" valign="top">&#x2013;</td>
<td align="center" valign="top">&#x2013;</td>
<td align="center" valign="top">&#x2013;</td>
<td align="center" valign="top">86.7%</td>
<td align="center" valign="top">0.80</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec id="sec11">
<title>Recognition Ability of Safety Hazards of Construction Workers in Different Emotional States</title>
<p>The statistical results for the recognition ability of safety hazards of construction workers in different emotional states are shown in <xref rid="tab7" ref-type="table">Table 7</xref>. Workers in the positive emotional state had the shortest reaction time (5.61&#x2009;s), while workers in the neutral and negative emotional states required longer reaction times (6.91&#x2009;s and 8.08&#x2009;s, respectively). Construction workers in the neutral emotional state had the highest identification accuracy of safety hazards (92.25%) and perception level of safety hazards (24.52), whereas workers in the negative emotional state had the lowest identification accuracy (75.41%) and the lowest perception level (0.75) of the three emotional states.</p>
<table-wrap position="float" id="tab7">
<label>Table 7</label>
<caption><p>Safety hazard cognition results in different emotional states.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="middle">Emotional State</th>
<th align="center" valign="middle">Reaction time (s)</th>
<th align="center" valign="middle">Accuracy (%)</th>
<th align="center" valign="middle">Safety hazard perception</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Negative</td>
<td align="center" valign="top">8.08</td>
<td align="center" valign="top">75.41</td>
<td align="center" valign="top">0.75</td>
</tr>
<tr>
<td align="left" valign="top">Neutral</td>
<td align="center" valign="top">6.91</td>
<td align="center" valign="top">92.25</td>
<td align="center" valign="top">24.52</td>
</tr>
<tr>
<td align="left" valign="top">Positive</td>
<td align="center" valign="top">5.61</td>
<td align="center" valign="top">80.60</td>
<td align="center" valign="top">3.10</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>The study found that the reaction time for identifying safety hazards was longer in negative emotions than in neutral and positive emotions. When construction workers were in a positive emotional state, the feedback time for identifying safety hazards in construction site pictures was 5.61&#x2009;s, less than the 6.91&#x2009;s in a neutral emotional state and the 8.08&#x2009;s in a negative emotional state. This conclusion is consistent with Fredrickson&#x2019;s finding that positive emotions serve to improve an individual&#x2019;s physical, intellectual, and perceptual capacities (<xref ref-type="bibr" rid="ref10">Fredrickson, 1998</xref>). Although the reaction time to safety hazards became shorter, the identification accuracy of safety hazards decreased as emotional valence increased from neutral to positive. The identification accuracy of safety hazards was 92.25% in the neutral emotional state, greater than in the negative emotional state (75.41%) and the positive emotional state (80.60%). These findings are similar to findings on the effect of emotion on driver performance, with drivers performing better in a neutral state (<xref ref-type="bibr" rid="ref19">Jeon et al., 2014</xref>).</p>
</sec>
<sec id="sec12">
<title>The Effect of Emotions on the Recognition Ability of Safety Hazards of Different Working Age Groups</title>
<p>Given that construction workers&#x2019; emotions may be influenced by age (<xref ref-type="bibr" rid="ref21">Kappes et al., 2017</xref>), working age was chosen as the independent variable, and a one-way ANOVA was used to explore the effect of emotions on construction workers&#x2019; recognition ability of safety hazards at different working ages.</p>
<sec id="sec13">
<title>Reaction Time</title>
<p>The results of the one-way ANOVA for the reaction time to safety hazards at different working ages are shown in <xref rid="tab8" ref-type="table">Table 8</xref>. The significant differences between the age groups indicate that age has a significant effect on the reaction time of construction workers to safety hazards. The correlation-type effect size eta squared (<xref ref-type="bibr" rid="ref9">Ferguson, 2009</xref>) and the power indicate that age has a small estimated effect size on the reaction time of construction workers.</p>
<table-wrap position="float" id="tab8">
<label>Table 8</label>
<caption><p>One-way ANOVA results for reaction time at different working ages.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="middle">Emotional states</th>
<th align="center" valign="middle">Working years</th>
<th align="center" valign="middle">Mean value</th>
<th align="center" valign="middle">Standard deviation</th>
<th align="center" valign="middle">Minimum</th>
<th align="center" valign="middle">Maximum</th>
<th align="center" valign="middle"><italic>F</italic></th>
<th align="center" valign="middle">Sig</th>
<th align="center" valign="middle"><italic>&#x03B7;</italic><sup>2</sup></th>
<th align="center" valign="middle">Power</th>
<th align="center" valign="middle">Multiple Comparisons</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top" rowspan="3">Negative</td>
<td align="center" valign="top">0&#x2013;9</td>
<td align="center" valign="top">7.74</td>
<td align="center" valign="top">1.384</td>
<td align="center" valign="top">4.87</td>
<td align="center" valign="top">10.50</td>
<td align="center" valign="top" rowspan="3">9.843</td>
<td align="center" valign="top" rowspan="3">0.000</td>
<td align="center" valign="top" rowspan="3">0.127</td>
<td align="center" valign="top" rowspan="3">0.096</td>
<td align="center" valign="top" rowspan="2">1&#x2009;&#x003C;&#x2009;3</td>
</tr>
<tr>
<td align="center" valign="top">10&#x2013;19</td>
<td align="center" valign="top">8.02</td>
<td align="center" valign="top">1.675</td>
<td align="center" valign="top">3.35</td>
<td align="center" valign="top">12.72</td>
</tr>
<tr>
<td align="center" valign="top">20 and above</td>
<td align="center" valign="top">8.50</td>
<td align="center" valign="top">1.564</td>
<td align="center" valign="top">5.06</td>
<td align="center" valign="top">15.50</td>
<td align="center" valign="top">2&#x2009;&#x003C;&#x2009;3</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="3">Neutral</td>
<td align="center" valign="top">0&#x2013;9</td>
<td align="center" valign="top">6.35</td>
<td align="center" valign="top">1.779</td>
<td align="center" valign="top">1.43</td>
<td align="center" valign="top">9.75</td>
<td align="center" valign="top" rowspan="3">21.203</td>
<td align="center" valign="top" rowspan="3">0.000</td>
<td align="center" valign="top" rowspan="3">0.136</td>
<td align="center" valign="top" rowspan="3">0.103</td>
<td align="center" valign="top">1&#x2009;&#x003C;&#x2009;2</td>
</tr>
<tr>
<td align="center" valign="top">10&#x2013;19</td>
<td align="center" valign="top">6.98</td>
<td align="center" valign="top">1.523</td>
<td align="center" valign="top">2.85</td>
<td align="center" valign="top">10.12</td>
<td align="center" valign="top">1&#x2009;&#x003C;&#x2009;3</td>
</tr>
<tr>
<td align="center" valign="top">20 and above</td>
<td align="center" valign="top">7.49</td>
<td align="center" valign="top">1.617</td>
<td align="center" valign="top">5.05</td>
<td align="center" valign="top">18.50</td>
<td align="center" valign="top">2&#x2009;&#x003C;&#x2009;3</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="3">Positive</td>
<td align="center" valign="top">0&#x2013;9</td>
<td align="center" valign="top">4.42</td>
<td align="center" valign="top">1.931</td>
<td align="center" valign="top">0.55</td>
<td align="center" valign="top">8.95</td>
<td align="center" valign="top" rowspan="3">70.670</td>
<td align="center" valign="top" rowspan="3">0.000</td>
<td align="center" valign="top" rowspan="3">0.096</td>
<td align="center" valign="top" rowspan="3">0.076</td>
<td align="center" valign="top">1&#x2009;&#x003C;&#x2009;2</td>
</tr>
<tr>
<td align="center" valign="top">10&#x2013;19</td>
<td align="center" valign="top">5.71</td>
<td align="center" valign="top">1.321</td>
<td align="center" valign="top">1.03</td>
<td align="center" valign="top">8.47</td>
<td align="center" valign="top">1&#x2009;&#x003C;&#x2009;3</td>
</tr>
<tr>
<td align="center" valign="top">20 and above</td>
<td align="center" valign="top">6.44</td>
<td align="center" valign="top">1.052</td>
<td align="center" valign="top">3.60</td>
<td align="center" valign="top">9.33</td>
<td align="center" valign="top">2&#x2009;&#x003C;&#x2009;3</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p>1 represents the 0&#x2013;9 working years group, 2 represents the 10&#x2013;19&#x2009;years working group and 3 represents the 20 and above years working group.</p>
</table-wrap-foot>
</table-wrap>
<p>The reaction time to safety hazards at different working ages is shown in <xref rid="fig6" ref-type="fig">Figure 6</xref>. In all three emotional states, construction workers with more than 20&#x2009;years of experience had the longest reaction times, exceeding those of the 10&#x2013;19&#x2009;years group by more than 0.5&#x2009;s and those of workers with less than 10&#x2009;years of experience by more than 0.8&#x2009;s. This suggests that as construction workers gain years of experience, the reaction time to recognize safety hazards increases, while the influence of emotions on that reaction time decreases. The experiment required computer operation, and construction workers with more than 20&#x2009;years of experience were generally older and less skilled at operating the devices, which may have contributed to their longer reaction times.</p>
<fig position="float" id="fig6">
<label>Figure 6</label>
<caption><p>Reaction time to safety hazard under different working ages.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g006.tif"/>
</fig>
</sec>
<sec id="sec14">
<title>Identification Accuracy</title>
<p>The results of the one-way ANOVA for the identification accuracy of safety hazards at different working ages are shown in <xref rid="tab9" ref-type="table">Table 9</xref>. The significant differences between the age groups indicate that age has a significant effect on the identification accuracy of safety hazards of construction workers. The correlation-type effect size eta squared (<xref ref-type="bibr" rid="ref9">Ferguson, 2009</xref>) and the power indicate that age has a small estimated effect size on the identification accuracy of construction workers.</p>
<table-wrap position="float" id="tab9">
<label>Table 9</label>
<caption><p>One-way ANOVA results for identification accuracy at different working ages.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="middle">Emotional states</th>
<th align="left" valign="middle">Working years</th>
<th align="center" valign="middle">Mean value</th>
<th align="center" valign="middle">Standard deviation</th>
<th align="center" valign="middle">Minimum</th>
<th align="center" valign="middle">Maximum</th>
<th align="center" valign="middle"><italic>F</italic></th>
<th align="center" valign="middle">Sig</th>
<th align="center" valign="middle"><italic>&#x03B7;</italic><sup>2</sup></th>
<th align="center" valign="middle">Power</th>
<th align="center" valign="middle">Multiple Comparisons</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top" rowspan="3">Negative</td>
<td align="center" valign="top">0&#x2013;9</td>
<td align="center" valign="top">0.74</td>
<td align="center" valign="top">0.459</td>
<td align="center" valign="top">0.510</td>
<td align="center" valign="top">0.902</td>
<td align="center" valign="top" rowspan="3">4.663</td>
<td align="center" valign="top" rowspan="3">0.010</td>
<td align="center" valign="top" rowspan="3">0.096</td>
<td align="center" valign="top" rowspan="3">0.076</td>
<td align="center" valign="top" rowspan="3">1&#x2009;&#x003C;&#x2009;3</td>
</tr>
<tr>
<td align="center" valign="top">10&#x2013;19</td>
<td align="center" valign="top">0.81</td>
<td align="center" valign="top">0.446</td>
<td align="center" valign="top">0.603</td>
<td align="center" valign="top">0.895</td>
</tr>
<tr>
<td align="center" valign="top">20 and above</td>
<td align="center" valign="top">0.85</td>
<td align="center" valign="top">0.370</td>
<td align="center" valign="top">0.537</td>
<td align="center" valign="top">0.916</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="3">Neutral</td>
<td align="center" valign="top">0&#x2013;9</td>
<td align="center" valign="top">0.90</td>
<td align="center" valign="top">0.242</td>
<td align="center" valign="top">0.645</td>
<td align="center" valign="top">0.923</td>
<td align="center" valign="top" rowspan="3">3.978</td>
<td align="center" valign="top" rowspan="3">0.042</td>
<td align="center" valign="top" rowspan="3">0.082</td>
<td align="center" valign="top" rowspan="3">0.069</td>
<td align="center" valign="top" rowspan="2">1&#x2009;&#x003C;&#x2009;3</td>
</tr>
<tr>
<td align="center" valign="top">10&#x2013;19</td>
<td align="center" valign="top">0.94</td>
<td align="center" valign="top">0.149</td>
<td align="center" valign="top">0.668</td>
<td align="center" valign="top">0.952</td>
</tr>
<tr>
<td align="center" valign="top">20 and above</td>
<td align="center" valign="top">0.94</td>
<td align="center" valign="top">0.236</td>
<td align="center" valign="top">0.636</td>
<td align="center" valign="top">0.914</td>
<td align="center" valign="top">2&#x2009;&#x003C;&#x2009;3</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="3">Positive</td>
<td align="center" valign="top">0&#x2013;9</td>
<td align="center" valign="top">0.71</td>
<td align="center" valign="top">0.439</td>
<td align="center" valign="top">0.554</td>
<td align="center" valign="top">0.886</td>
<td align="center" valign="top" rowspan="3">6.166</td>
<td align="center" valign="top" rowspan="3">0.002</td>
<td align="center" valign="top" rowspan="3">0.063</td>
<td/>
<td align="center" valign="top" rowspan="3">1&#x2009;&#x003C;&#x2009;3</td>
</tr>
<tr>
<td align="center" valign="top">10&#x2013;19</td>
<td align="center" valign="top">0.73</td>
<td align="center" valign="top">0.395</td>
<td align="center" valign="top">0.527</td>
<td align="center" valign="top">0.893</td>
<td align="center" valign="top">0.061</td>
</tr>
<tr>
<td align="center" valign="top">20 and above</td>
<td align="center" valign="top">0.86</td>
<td align="center" valign="top">0.351</td>
<td align="center" valign="top">0.611</td>
<td align="center" valign="top">0.921</td>
<td/>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p>1 represents the 0&#x2013;9 working years group, 2 represents the 10&#x2013;19&#x2009;years working group and 3 represents the 20 and above years working group.</p>
</table-wrap-foot>
</table-wrap>
<p>The identification accuracy of safety hazards at different working ages is shown in <xref rid="fig7" ref-type="fig">Figure 7</xref>. The graph illustrates that construction workers with more than 20&#x2009;years of working experience have higher accuracy in safety hazard identification under positive and negative emotional states than workers in the other two groups. Under neutral emotions, workers with more than 20&#x2009;years of experience and the 10&#x2013;19&#x2009;years group have similar accuracy in evaluating safety hazards, at 93.73 and 93.58%, respectively, both at a high level. These results indicate that the safety hazard identification accuracy of workers with more than 20&#x2009;years of experience is less affected by their emotional state, meaning that work experience can effectively reduce the impact of emotional fluctuations on the accuracy of safety hazard identification.</p>
<fig position="float" id="fig7">
<label>Figure 7</label>
<caption><p>Identification accuracy of safety hazard under different working ages.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g007.tif"/>
</fig>
</sec>
<sec id="sec15">
<title>Safety Hazards Perception</title>
<p>The results of the one-way ANOVA for the perception level of safety hazards at different working ages are shown in <xref rid="tab10" ref-type="table">Table 10</xref>. The significant differences between the age groups indicate that age has a significant effect on the perception level of safety hazards of construction workers. The correlation-type effect size eta squared (<xref ref-type="bibr" rid="ref9">Ferguson, 2009</xref>) and the power indicate that age has a small estimated effect size on the safety hazard perception of construction workers.</p>
<table-wrap position="float" id="tab10">
<label>Table 10</label>
<caption><p>One-way ANOVA results for perception level at different working ages.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th align="left" valign="middle">Emotional states</th>
<th align="center" valign="middle">Working years</th>
<th align="center" valign="middle">Mean value</th>
<th align="center" valign="middle">Standard deviation</th>
<th align="center" valign="middle">Minimum</th>
<th align="center" valign="middle">Maximum</th>
<th align="center" valign="middle"><italic>F</italic></th>
<th align="center" valign="middle">Sig</th>
<th align="center" valign="middle"><italic>&#x03B7;</italic><sup>2</sup></th>
<th align="center" valign="middle">Power</th>
<th align="center" valign="middle">Multiple Comparisons</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top" rowspan="3">Negative</td>
<td align="center" valign="top">0&#x2013;9</td>
<td align="center" valign="top">2.7177</td>
<td align="center" valign="top">12.7158</td>
<td align="center" valign="top">0.0037</td>
<td align="center" valign="top">81.5500</td>
<td align="center" valign="top" rowspan="3">3.004</td>
<td align="center" valign="top" rowspan="3">0.041</td>
<td align="center" valign="top">0.028</td>
<td align="center" valign="top" rowspan="3">0.052</td>
<td align="center" valign="top" rowspan="3">1&#x2009;&#x003C;&#x2009;2</td>
</tr>
<tr>
<td align="center" valign="top">10&#x2013;19</td>
<td align="center" valign="top">7.1499</td>
<td align="center" valign="top">20.2831</td>
<td align="center" valign="top">0.0640</td>
<td align="center" valign="top">81.5500</td>
</tr>
<tr>
<td align="center" valign="top">20 and above</td>
<td align="center" valign="top">4.1766</td>
<td align="center" valign="top">16.6419</td>
<td align="center" valign="top">0.0128</td>
<td align="center" valign="top">81.5500</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="3">Neutral</td>
<td align="center" valign="top">0&#x2013;9</td>
<td align="center" valign="top">13.6066</td>
<td align="center" valign="top">25.9091</td>
<td align="center" valign="top">0.1280</td>
<td align="center" valign="top">81.5500</td>
<td align="center" valign="top" rowspan="3">10.945</td>
<td align="center" valign="top" rowspan="3">0.000</td>
<td align="center" valign="top">0.023</td>
<td align="center" valign="top" rowspan="3">0.051</td>
<td align="center" valign="top">1&#x2009;&#x003C;&#x2009;2</td>
</tr>
<tr>
<td align="center" valign="top">10&#x2013;19</td>
<td align="center" valign="top">40.2416</td>
<td align="center" valign="top">38.2904</td>
<td align="center" valign="top">0.2700</td>
<td align="center" valign="top">81.5500</td>
<td align="center" valign="top">1&#x2009;&#x003C;&#x2009;3</td>
</tr>
<tr>
<td align="center" valign="top">20 and above</td>
<td align="center" valign="top">20.3367</td>
<td align="center" valign="top">30.7954</td>
<td align="center" valign="top">0.2700</td>
<td align="center" valign="top">81.5500</td>
<td align="center" valign="top">2&#x2009;&#x003E;&#x2009;3</td>
</tr>
<tr>
<td align="left" valign="top" rowspan="3">Positive</td>
<td align="center" valign="top">0&#x2013;9</td>
<td align="center" valign="top">0.57601</td>
<td align="center" valign="top">1.4384</td>
<td align="center" valign="top">0.0000</td>
<td align="center" valign="top">6.8100</td>
<td align="center" valign="top" rowspan="3">33.901</td>
<td align="center" valign="top" rowspan="3">0.000</td>
<td align="center" valign="top">0.027</td>
<td align="center" valign="top" rowspan="3">0.052</td>
<td align="center" valign="top" rowspan="2">1&#x2009;&#x003C;&#x2009;2</td>
</tr>
<tr>
<td align="center" valign="top">10&#x2013;19</td>
<td align="center" valign="top">6.7472</td>
<td align="center" valign="top">21.2234</td>
<td align="center" valign="top">0.0003</td>
<td align="center" valign="top">81.5500</td>
</tr>
<tr>
<td align="center" valign="top">20 and above</td>
<td align="center" valign="top">1.1089</td>
<td align="center" valign="top">2.16974</td>
<td align="center" valign="top">0.0003</td>
<td align="center" valign="top">6.8100</td>
<td align="center" valign="top">2&#x2009;&#x003E;&#x2009;3</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p>1 represents the 0&#x2013;9 working years group, 2 represents the 10&#x2013;19&#x2009;years working group and 3 represents the 20 and above years working group.</p>
</table-wrap-foot>
</table-wrap>
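<p>The ANOVA statistics in Table 10 can be reproduced in outline with standard tools. The following sketch (not the study&#x2019;s code; group means, spreads and sample sizes are hypothetical) computes the one-way <italic>F</italic>-test and the eta-squared effect size for three working-year groups:</p>

```python
# Illustrative sketch: one-way ANOVA across three working-year groups,
# with eta squared as the correlation-type effect size. Data are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical perception scores for the 0-9, 10-19, and 20+ year groups
groups = [rng.normal(loc=m, scale=15.0, size=80) for m in (3.0, 7.0, 4.0)]

f_stat, p_value = stats.f_oneway(*groups)

# Eta squared = between-group sum of squares / total sum of squares
all_scores = np.concatenate(groups)
grand_mean = all_scores.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((all_scores - grand_mean) ** 2).sum()
eta_squared = ss_between / ss_total

print(f"F = {f_stat:.3f}, p = {p_value:.3f}, eta^2 = {eta_squared:.3f}")
```

<p>Eta squared is the between-group sum of squares divided by the total sum of squares; values around 0.02&#x2013;0.03, as in Table 10, correspond to a small effect.</p>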
<p>The perception level of safety hazards at different working ages is shown in <xref rid="fig8" ref-type="fig">Figure 8</xref>. As illustrated in the figure, the highest level of safety hazards perception (40.24) was found among workers with 10&#x2013;19 working years in the neutral emotional state, and this group also showed a higher level of safety hazards recognition in all emotional states. In all working-age groups, the perception level of safety hazards first increased and then decreased as construction workers&#x2019; emotions changed from extremely negative to extremely positive.</p>
<fig position="float" id="fig8">
<label>Figure 8</label>
<caption><p>Perception level of safety hazard under different working ages.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g008.tif"/>
</fig>
</sec>
</sec>
<sec id="sec16">
<title>Quantitative Relationship Between Emotional Valence and Recognition Ability of Safety Hazards</title>
<p>According to the results of safety hazards reaction time, identification accuracy of safety hazards and perception level of safety hazards, correlation and regression analysis were used to explore the quantitative relationship between emotional valence and construction workers&#x2019; recognition ability of safety hazards.</p>
<sec id="sec17">
<title>Emotional Valence and Reaction Time</title>
<p>The Pearson correlation coefficient (<italic>r</italic>&#x2009;=&#x2009;&#x2212;0.556, <italic>p</italic>&#x2009;=&#x2009;0.000) indicates a moderate negative correlation between emotional valence and reaction time to safety hazards. The regression model passed the <italic>F</italic>-test (<italic>p</italic>&#x2009;=&#x2009;0.000), and emotional valence explains 30.9% of the variance in workers&#x2019; reaction time to safety hazards (<italic>R</italic><sup>2</sup>&#x2009;=&#x2009;0.309, SE&#x2009;=&#x2009;0.832, <italic>F</italic>&#x2009;=&#x2009;664.413).</p>
<p>The regression results are shown in <xref rid="tab11" ref-type="table">Table 11</xref>, and the quantitative relationship between safety hazards reaction time and emotional valence of construction workers is shown in <xref ref-type="disp-formula" rid="EQ2">Equation (1)</xref>.</p>
<disp-formula id="EQ2">
<label>(1)</label>
<mml:math id="M1">
<mml:mrow>
<mml:mi mathvariant="normal">RT</mml:mi>
<mml:mo>=</mml:mo>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>9.952</mml:mn>
<mml:mi mathvariant="normal">E</mml:mi>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>5</mml:mn>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>0.556</mml:mn>
<mml:mi mathvariant="normal">EV</mml:mi>
<mml:mo>+</mml:mo>
<mml:mi>&#x03B5;</mml:mi>
</mml:mrow>
</mml:math>
</disp-formula>
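<p>As a sketch of how the coefficients in Equation (1) can be estimated (not the study&#x2019;s code; the data and sample size below are synthetic), a least-squares fit on z-scored variables makes the standardized slope equal the Pearson <italic>r</italic>, as in Table 11 where <italic>B</italic> and Beta coincide:</p>

```python
# Illustrative sketch: simple linear regression of standardized reaction time
# on standardized emotional valence. With both variables z-scored, the fitted
# slope equals the Pearson correlation (the study reports -0.556).
import numpy as np

rng = np.random.default_rng(1)
n = 1488  # hypothetical number of observations
valence = rng.uniform(-1.5, 1.5, size=n)            # emotional valence scores
rt = -0.6 * valence + rng.normal(0.0, 0.8, size=n)  # synthetic reaction times

def zscore(x):
    return (x - x.mean()) / x.std()

zv, zrt = zscore(valence), zscore(rt)
beta1, beta0 = np.polyfit(zv, zrt, 1)  # slope, intercept
r = np.corrcoef(zv, zrt)[0, 1]
print(f"beta1 = {beta1:.3f}, r = {r:.3f}, R^2 = {r**2:.3f}")
```

<p>The intercept of a regression on z-scored variables is essentially zero, which is why the constant in Table 11 (&#x2212;9.952E-5) is negligible and non-significant.</p>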
<table-wrap position="float" id="tab11">
<label>Table 11</label>
<caption><p>Coefficients between reaction time and emotional valence.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th rowspan="2"/>
<th align="center" valign="middle" colspan="2">Unstandardized coefficients</th>
<th align="center" valign="middle">Standardized coefficients</th>
<th align="center" valign="middle" rowspan="2"><italic>t</italic></th>
<th align="center" valign="middle" rowspan="2">Sig.</th>
</tr>
<tr>
<th align="center" valign="middle"><italic>B</italic></th>
<th align="center" valign="middle">Standard error</th>
<th align="center" valign="middle">Beta</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="top">Constant(<italic>&#x03B2;</italic><sub>0</sub>)</td>
<td align="center" valign="top">&#x2212;9.952E-5</td>
<td align="center" valign="top">0.022</td>
<td/>
<td align="center" valign="top">&#x2212;0.005</td>
<td align="center" valign="top">0.996</td>
</tr>
<tr>
<td align="left" valign="top">Emotional valence(<italic>&#x03B2;</italic><sub>1</sub>)</td>
<td align="center" valign="top">&#x2212;0.556</td>
<td align="center" valign="top">0.022</td>
<td align="center" valign="top">&#x2212;0.556</td>
<td align="center" valign="top">&#x2212;25.776</td>
<td align="center" valign="top">0.000</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Where</p>
<list list-type="simple">
<list-item>
<p>RT is safety hazards reaction time,</p>
</list-item>
<list-item>
<p>EV is emotional valence,</p>
</list-item>
<list-item>
<p><italic>&#x03B5;</italic> is error term, which indicates unexplained variability of the data.</p>
</list-item>
</list>
<p>The coefficient of emotional valence on reaction time to safety hazards (<italic>&#x03B2;<sub>1</sub></italic>&#x2009;=&#x2009;&#x2212;0.556) is less than zero, indicating that the reaction time to safety hazards decreases as emotional valence increases, as shown in <xref rid="fig9" ref-type="fig">Figure 9</xref>. In both positive and negative emotional states, the reaction time to safety hazards increased by 0.556&#x2009;units for each unit decrease in the construction worker&#x2019;s emotional valence. The reaction time to safety hazards thus decreases continuously as construction workers&#x2019; emotions change from extremely negative to extremely positive.</p>
<fig position="float" id="fig9">
<label>Figure 9</label>
<caption><p>The effect of emotional valence on the reaction time to safety hazard.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g009.tif"/>
</fig>
</sec>
<sec id="sec18">
<title>Emotional Valence and Identification Accuracy</title>
<p>Descriptive statistics showed that the identification accuracy of safety hazards was higher for construction workers in neutral emotions than in either positive or negative emotions. The Pearson correlation coefficient shows a moderate negative correlation between positive emotion and identification accuracy of safety hazards (<italic>r</italic>&#x2009;=&#x2009;&#x2212;0.526, <italic>p</italic>&#x2009;=&#x2009;0.000), while negative emotion is also negatively correlated with identification accuracy of safety hazards (<italic>r</italic>&#x2009;=&#x2009;&#x2212;0.356, <italic>p</italic>&#x2009;=&#x2009;0.000).</p>
<p>The correlation analysis shows that the relationship between emotional valence and the identification accuracy of safety hazards is an inverted U-shape, so a curvilinear regression model with a quadratic term was established. The regression model passed the <italic>F</italic>-test (<italic>p</italic>&#x2009;=&#x2009;0.000), and emotional valence explains 21.3% of the variance in workers&#x2019; identification accuracy of safety hazards (<italic>R</italic><sup>2</sup>&#x2009;=&#x2009;0.213, SE&#x2009;=&#x2009;0.834, <italic>F</italic>&#x2009;=&#x2009;201.79). The regression results are shown in <xref rid="tab12" ref-type="table">Table 12</xref>, and the quantitative relationship between identification accuracy of safety hazards and emotional valence of construction workers is shown in <xref ref-type="disp-formula" rid="EQ3">Equation (2)</xref>.</p>
<disp-formula id="EQ3">
<label>(2)</label>
<mml:math id="M2">
<mml:mrow>
<mml:mi mathvariant="normal">IA</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0.51</mml:mn>
<mml:mo>+</mml:mo>
<mml:mn>0.021</mml:mn>
<mml:mi mathvariant="normal">EV</mml:mi>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>0.443</mml:mn>
<mml:msup>
<mml:mrow>
<mml:mi mathvariant="normal">EV</mml:mi>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:mi>&#x03B5;</mml:mi>
</mml:mrow>
</mml:math>
</disp-formula>
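<p>A minimal sketch of the curvilinear fit behind Equation (2), on synthetic data (not the study&#x2019;s code): fitting a second-degree polynomial recovers a negative quadratic coefficient and a vertex near neutral valence, i.e., the inverted U:</p>

```python
# Illustrative sketch: quadratic (curvilinear) regression of identification
# accuracy on emotional valence via numpy.polyfit. Data are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n = 1488  # hypothetical sample size
valence = rng.uniform(-1.5, 1.5, size=n)
accuracy = 0.5 - 0.4 * valence**2 + rng.normal(0.0, 0.1, size=n)

# polyfit returns coefficients from highest degree down: quadratic, linear, constant
a2, a1, a0 = np.polyfit(valence, accuracy, 2)
peak = -a1 / (2.0 * a2)  # vertex of the parabola: accuracy is highest here
print(f"IA = {a0:.3f} + {a1:.3f}*EV + {a2:.3f}*EV^2, peak at EV = {peak:.3f}")
```
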
<table-wrap position="float" id="tab12">
<label>Table 12</label>
<caption><p>Coefficients between identification accuracy and emotional valence.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th rowspan="2"/>
<th align="center" valign="middle" colspan="2">Unstandardized coefficients</th>
<th align="center" valign="middle">Standardized coefficients</th>
<th align="center" valign="middle" rowspan="2"><italic>t</italic></th>
<th align="center" valign="middle" rowspan="2">Sig.</th>
</tr>
<tr>
<th align="center" valign="middle"><italic>B</italic></th>
<th align="center" valign="middle">Standard error</th>
<th align="center" valign="middle">Beta</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="bottom">Constant(<inline-formula>
<mml:math id="M3">
<mml:mrow>
<mml:msub>
<mml:mi>&#x03B1;</mml:mi>
<mml:mn>0</mml:mn>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>)</td>
<td align="center" valign="bottom">0.51</td>
<td align="center" valign="bottom">0.031</td>
<td/>
<td align="center" valign="bottom">16.380</td>
<td align="center" valign="bottom">0.000</td>
</tr>
<tr>
<td align="left" valign="top">Emotional state(<inline-formula>
<mml:math id="M4">
<mml:mrow>
<mml:msub>
<mml:mi>&#x03B1;</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>)</td>
<td align="center" valign="top">0.021</td>
<td align="center" valign="top">0.022</td>
<td align="center" valign="top">0.023</td>
<td align="center" valign="top">0.981</td>
<td align="center" valign="top">0.327</td>
</tr>
<tr>
<td align="left" valign="top">Emotional state(<inline-formula>
<mml:math id="M5">
<mml:mrow>
<mml:msub>
<mml:mi>&#x03B1;</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>)</td>
<td align="center" valign="top">&#x2212;0.443</td>
<td align="center" valign="top">0.022</td>
<td align="center" valign="top">&#x2212;0.459</td>
<td align="center" valign="top">&#x2212;19.769</td>
<td align="center" valign="top">0.000</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Where</p>
<list list-type="simple">
<list-item>
<p>IA is identification accuracy of safety hazards,</p>
</list-item>
<list-item>
<p>EV is construction worker&#x2019;s emotional valence,</p>
</list-item>
<list-item>
<p><inline-formula>
<mml:math id="M6">
<mml:mi>&#x03B5;</mml:mi>
</mml:math>
</inline-formula> is error term, which indicates unexplained variability of the data.</p>
</list-item>
</list>
<p>The coefficient of the quadratic term (<inline-formula>
<mml:math id="M7">
<mml:mrow>
<mml:msub>
<mml:mi>&#x03B1;</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>&#x2009;=&#x2009;&#x2212;0.443) between emotional valence and the accuracy of safety hazards identification is negative, indicating that the accuracy of safety hazards identification first increases and then decreases as emotional valence increases, peaking when workers are in a neutral emotional state, as shown in <xref rid="fig10" ref-type="fig">Figure 10</xref>. Specifically, when construction workers are in a positive emotional state, the identification accuracy of safety hazards decreases by 1.350&#x2009;units for each unit increase in emotional valence; when construction workers are in a negative emotional state, the identification accuracy of safety hazards decreases by 1.308&#x2009;units for each unit decrease in emotional valence.</p>
<fig position="float" id="fig10">
<label>Figure 10</label>
<caption><p>The effect of emotional valence on the identification accuracy.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g010.tif"/>
</fig>
</sec>
</sec>
<sec id="sec19">
<title>Emotional Valence and Perception Level of Safety Hazards</title>
<p>The correlations between positive emotion, negative emotion and the perception level of safety hazards were explored separately using the Pearson correlation coefficient. The results indicate a low negative correlation between positive emotion and perception level of safety hazards (<italic>r</italic>&#x2009;=&#x2009;&#x2212;0.256, <italic>p</italic>&#x2009;=&#x2009;0.000), while negative emotion is moderately negatively correlated with the perception level of safety hazards (<italic>r</italic>&#x2009;=&#x2009;&#x2212;0.520, <italic>p</italic>&#x2009;=&#x2009;0.000).</p>
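<p>The split-by-sign correlations can be sketched as follows (not the study&#x2019;s code; the data are synthetic, and correlating negative-emotion intensity as &#x2212;EV is an assumption about how the negative-side coefficient was computed):</p>

```python
# Illustrative sketch: Pearson correlations computed separately on the
# positive- and negative-valence halves of an inverted-U relationship.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
valence = rng.uniform(-1.5, 1.5, size=1000)
perception = 0.7 - 0.6 * valence**2 + rng.normal(0.0, 0.5, size=1000)

pos = valence > 0
neg = ~pos
r_pos, p_pos = stats.pearsonr(valence[pos], perception[pos])
# Assumed convention: negative-emotion intensity = -EV on the negative side,
# so stronger negative emotion pairs with lower perception (r < 0).
r_neg, p_neg = stats.pearsonr(-valence[neg], perception[neg])
print(f"positive side: r = {r_pos:.3f}; negative side: r = {r_neg:.3f}")
```
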
<p>The correlation analysis shows that the relationship between emotional valence and the perception level of safety hazards is an inverted U-shape, so curvilinear regression with a quadratic term was used. The regression model passed the <italic>F</italic>-test (<italic>p</italic>&#x2009;=&#x2009;0.000), and emotional valence explains 35.7% of the variance in workers&#x2019; perception level of safety hazards (<italic>R</italic><sup>2</sup>&#x2009;=&#x2009;0.357, SE&#x2009;=&#x2009;0.802, <italic>F</italic>&#x2009;=&#x2009;365.999). The regression results are shown in <xref rid="tab13" ref-type="table">Table 13</xref>, and the quantitative relationship between perception level of safety hazards and emotional valence of construction workers is shown in <xref ref-type="disp-formula" rid="EQ1">Equation (3)</xref>.</p>
<disp-formula id="EQ1">
<label>(3)</label>
<mml:math id="M8">
<mml:mrow>
<mml:mi mathvariant="normal">PL</mml:mi>
<mml:mo>=</mml:mo>
<mml:mn>0.683</mml:mn>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>0.079</mml:mn>
<mml:mi mathvariant="normal">EV</mml:mi>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>0.617</mml:mn>
<mml:msup>
<mml:mrow>
<mml:mi mathvariant="normal">EV</mml:mi>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
<mml:mo>+</mml:mo>
<mml:mi>&#x03B5;</mml:mi>
</mml:mrow>
</mml:math>
</disp-formula>
<table-wrap position="float" id="tab13">
<label>Table 13</label>
<caption><p>Coefficients between perception level of safety hazard and emotional valence.</p></caption>
<table frame="hsides" rules="groups">
<thead>
<tr>
<th rowspan="2"/>
<th align="center" valign="middle" colspan="2">Unstandardized coefficients</th>
<th align="center" valign="middle">Standardized coefficients</th>
<th align="center" valign="middle" rowspan="2"><italic>t</italic></th>
<th align="center" valign="middle" rowspan="2">Sig.</th>
</tr>
<tr>
<th align="center" valign="middle"><italic>B</italic></th>
<th align="center" valign="middle">Standard error</th>
<th align="center" valign="middle">Beta</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left" valign="bottom">Constant(<inline-formula>
<mml:math id="M9">
<mml:mrow>
<mml:msub>
<mml:mi>&#x03B3;</mml:mi>
<mml:mn>0</mml:mn>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>)</td>
<td align="center" valign="bottom">0.683</td>
<td align="center" valign="bottom">0.034</td>
<td/>
<td align="center" valign="bottom">20.354</td>
<td align="center" valign="bottom">0.000</td>
</tr>
<tr>
<td align="left" valign="top">Emotional state(<inline-formula>
<mml:math id="M10">
<mml:mrow>
<mml:msub>
<mml:mi>&#x03B3;</mml:mi>
<mml:mn>1</mml:mn>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>)</td>
<td align="center" valign="top">&#x2212;0.079</td>
<td align="center" valign="top">0.021</td>
<td align="center" valign="top">&#x2212;0.084</td>
<td align="center" valign="top">&#x2212;3.747</td>
<td align="center" valign="top">0.000</td>
</tr>
<tr>
<td align="left" valign="top">Emotional state(<inline-formula>
<mml:math id="M11">
<mml:mrow>
<mml:msub>
<mml:mi>&#x03B3;</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>)</td>
<td align="center" valign="top">&#x2212;0.617</td>
<td align="center" valign="top">0.023</td>
<td align="center" valign="top">&#x2212;0.604</td>
<td align="center" valign="top">&#x2212;27.055</td>
<td align="center" valign="top">0.000</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Where</p>
<list list-type="simple">
<list-item>
<p><italic>PL</italic> is perception level of safety hazards,</p>
</list-item>
<list-item>
<p>EV is construction worker&#x2019;s emotional valence,</p>
</list-item>
<list-item>
<p><inline-formula>
<mml:math id="M12">
<mml:mi>&#x03B5;</mml:mi>
</mml:math>
</inline-formula> is error term, which indicates unexplained variability of the data.</p>
</list-item>
</list>
<p>The coefficient of the quadratic term(<inline-formula>
<mml:math id="M13">
<mml:mrow>
<mml:msub>
<mml:mi>&#x03B3;</mml:mi>
<mml:mn>2</mml:mn>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>&#x2009;=&#x2009;&#x2212;0.617) between emotional valence and the perception level of safety hazards is negative, indicating that the perception level of safety hazards first increases and then decreases as emotional valence increases, peaking when workers are in a neutral emotional state, as shown in <xref rid="fig11" ref-type="fig">Figure 11</xref>. Specifically, when construction workers are in positive emotion, the perception level of safety hazards decreases by 1.93&#x2009;units for each unit increase in their emotional valence; when construction workers are in negative emotion, the perception level of safety hazards decreases by 1.772&#x2009;units for each unit decrease in their emotional valence. Therefore, the perception level of safety hazards of construction workers in negative and positive emotions is lower than in neutral emotions; as construction workers&#x2019; emotions change from extremely negative to extremely positive, the perception level of safety hazards first rises and then falls.</p>
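<p>The marginal effects quoted above follow from differentiating Equation (3); a sketch, assuming emotional valence is scored on approximately [&#x2212;1.5, 1.5]:</p>

```latex
\frac{\partial\,\mathrm{PL}}{\partial\,\mathrm{EV}}
  = -0.079 - 2 \times 0.617\,\mathrm{EV}
  = -0.079 - 1.234\,\mathrm{EV}
% at EV = +1.5 (extreme positive): -0.079 - 1.851 \approx -1.93
% at EV = -1.5 (extreme negative): -0.079 + 1.851 \approx +1.77
```

<p>That is, near the positive extreme the perception level falls by about 1.93&#x2009;units per unit increase in valence, and near the negative extreme by about 1.77&#x2009;units per unit decrease.</p>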
<fig position="float" id="fig11">
<label>Figure 11</label>
<caption><p>The effect of emotional valence on the perception level of safety hazard.</p></caption>
<graphic xlink:href="fpsyg-13-895929-g011.tif"/>
</fig>
</sec>
<sec id="sec20">
<title>The Safest Emotion</title>
<p>There was a negative correlation between reaction time to safety hazards and emotional valence, while the accuracy of safety hazards identification and the perception level of safety hazards each had an inverted &#x201C;U&#x201D;-shaped relationship with emotional valence. Under positive emotional valence, the findings are consistent with affective generalization theory (<xref ref-type="bibr" rid="ref007">Johnson and Tversky, 1983</xref>): positive emotion drives construction workers to make optimistic judgements about the construction environment, and therefore lowers their perception of safety hazards. Under negative emotional valence, the findings are more in line with the mood maintenance hypothesis (<xref ref-type="bibr" rid="ref18">Isen and Patrick, 1983</xref>): construction workers tend to take riskier and more aggressive decisions in order to escape their current negative emotional state, thus underestimating the risks of the environment and lowering the level of safety hazards they perceive. In addition, the findings of this study support the affect-as-information hypothesis proposed by <xref ref-type="bibr" rid="ref44">Schwarz and Clore (1981)</xref>. This theory suggests that emotions simplify people&#x2019;s risk decision-making process, in that people judge things based on their feelings rather than their features, and that emotions can lead to overestimation of events of the same valence. This finding relates to information acquisition and cognitive processes, and subsequent research could explore it further from this perspective. Workers should avoid overexcited emotional states: for each unit increase in emotional valence, the reaction time to safety hazards decreases by 0.556&#x2009;units, while the identification accuracy of safety hazards decreases by 1.35&#x2009;units and the perception level of safety hazards decreases by 1.93&#x2009;units as emotional valence shifts one unit away from the neutral state. Workers with high emotional valence are in a more relaxed and pleasurable state, with faster reactions but a reduced ability to judge and perceive safety hazards due to inattentiveness. Construction workers also need to avoid negative emotions such as excessive sadness and grief: for each unit decrease in emotional valence, the reaction time to safety hazards increases by 0.556&#x2009;units, while the identification accuracy of safety hazards decreases by 1.308&#x2009;units and the perception level of safety hazards decreases by 1.772&#x2009;units. When workers are immersed in a state of loss and frustration, the attention allocated to safety hazards identification decreases, prolonging their judgment time, with a concomitant decrease in the accuracy of safety hazards identification and the level of safety hazards perception. Therefore, neutral emotions are the safest emotions.</p>
</sec>
</sec>
<sec id="sec21" sec-type="conclusions">
<title>Conclusion</title>
<p>The behavioral experiment showed that the support vector machine (SVM) algorithm was effective in classifying galvanic skin response signals to identify emotional states. Construction workers&#x2019; reaction time to safety hazards under negative emotion is longer than under neutral and positive emotions, and their identification accuracy and perception level of safety hazards are lower, so the overall recognition ability of safety hazards under negative emotion is poorer. The reaction time to safety hazards identification is shorter for construction workers in positive emotions, but the accuracy of safety hazards identification and the level of safety hazards perception are lower; both are higher for construction workers in neutral emotions than in negative or positive emotions. For construction workers with more than 20&#x2009;years of experience, work experience can effectively reduce the impact of emotional fluctuations on the accuracy of safety hazards evaluation. Emotion predicted the recognition ability of safety hazards of construction workers: reaction time to safety hazards had a moderate negative correlation with emotional valence, while the accuracy of safety hazards identification and the perception level of safety hazards each had an inverted &#x201C;U&#x201D;-shaped relationship with emotional valence. Compared with positive and negative emotions, construction workers in neutral emotions have the highest accuracy of safety hazards identification and perception of safety hazards, making neutral emotion the safest emotional state.</p>
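<p>The SVM classification of galvanic skin response signals summarized above can be sketched as follows (not the study&#x2019;s pipeline; the GSR features, class structure and data are hypothetical):</p>

```python
# Illustrative sketch: an SVM classifier trained on simple galvanic-skin-
# response (GSR) features to label emotional states. Data are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)
# Hypothetical per-trial GSR features: [mean level, peak amplitude, peak count]
X = rng.normal(size=(300, 3)) + np.repeat(np.arange(3), 100)[:, None] * 1.5
y = np.repeat(["negative", "neutral", "positive"], 100)  # emotion labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Standardize features, then fit an RBF-kernel SVM
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

<p>Scaling before the SVM matters because the RBF kernel is distance-based; in practice, features extracted from raw GSR traces would replace the synthetic matrix here.</p>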
<p>The complexity and dynamics of the construction site require workers to identify on-site safety hazards in a timely and accurate manner, and keeping their emotional state stable helps improve construction workers&#x2019; ability to identify safety hazards and keep themselves safe. Currently, China&#x2019;s construction workers are generally poorly educated, lack continuous psychological training and have weak emotional control, while safety training in construction companies tends to focus on operational specifications, unsafe behaviors, the requirements for wearing safety gear and the main prohibitions of safe production, and rarely includes emotional management in safety education and training. Construction companies should pay more attention to the emotional health of construction workers and keep their emotional state stable through psychological training, improving workers&#x2019; awareness of their emotions and their emergency handling ability, thereby reducing the probability of safety accidents and improving the safety management of construction sites.</p>
<p>There are some limitations to the present study which may be relevant for future research. First, the study used galvanic skin response to monitor emotions, while scientific and technological advances have led to increasingly sophisticated techniques for monitoring physiological signals. Some studies collect EEG signals through EEG devices (<xref ref-type="bibr" rid="ref010">Takehara et al., 2020</xref>; <xref ref-type="bibr" rid="ref004">Long et al., 2021</xref>) to explore brain activity in different emotional states, and eye-tracking devices have been used to collect eye-movement signals and analyze how they reflect construction workers&#x2019; emotions (<xref ref-type="bibr" rid="ref009">Soleymani et al., 2012</xref>). Future research could investigate the impact of emotions on an individual&#x2019;s physiological signals, as well as cognitive abilities, in a multidimensional approach through the application of novel devices. Second, the participants in this study were mainly construction workers, and this paper explored the effect of work experience on construction workers&#x2019; emotions. Future research could further focus on variability within groups of construction workers, such as different personality traits (<xref ref-type="bibr" rid="ref45">Sugi et al., 2020</xref>; <xref ref-type="bibr" rid="ref33">Maier et al., 2021</xref>), different populations (<xref ref-type="bibr" rid="ref14">Guti&#x00E9;rrez-Cobo et al., 2017</xref>) or gender (<xref ref-type="bibr" rid="ref29">Lischke et al., 2020</xref>), to explore differences in the influence of emotions between groups. Finally, the study categorized emotions by emotional valence into the three most basic types: negative, neutral and positive. In fact, there are many more varieties of emotions and research paradigms; research on different negative emotions has gained extensive attention (<xref ref-type="bibr" rid="ref39">Pittig et al., 2014</xref>; <xref ref-type="bibr" rid="ref47">Topolinski and Strack, 2015</xref>), and subsequent research could be conducted from these perspectives in more detail.</p>
</sec>
<sec id="sec22" sec-type="data-availability">
<title>Data Availability Statement</title>
<p>The raw data supporting the conclusions of this article will be made available by the authors, without undue reservation.</p>
</sec>
<sec id="sec23">
<title>Ethics Statement</title>
<p>The studies involving human participants were reviewed and approved by the Ethics Committee of Shanghai University.</p>
</sec>
<sec id="sec24">
<title>Author Contributions</title>
<p>DC, AY, and HS contributed to conceptualization, writing&#x2014; review and editing, formal analysis, methodology, and original draft. HS and DC contributed to investigation. DC and YZ contributed to supervision. All authors contributed to the article and approved the submitted version.</p>
</sec>
<sec id="sec25" sec-type="funding-information">
<title>Funding</title>
<p>This study was funded by the National Natural Science Foundation of China (grant no. 71901139) and Science and Technology Commission of Shanghai Municipality (grant nos. 19DZ1204203 and 21692195100).</p>
</sec>
<sec id="conf1" sec-type="COI-statement">
<title>Conflict of Interest</title>
<p>YZ is employed by Shanghai Road &#x0026; Bridge (Group) Co., Ltd.</p>
<p>The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec id="sec27" sec-type="disclaimer">
<title>Publisher&#x2019;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
</body>
<back>
<ack>
<p>The authors would like to specially thank Shanghai Road &#x0026; Bridge (Group) Co., Ltd., Shanghai Construction Group (SCG), and China State Construction for providing generous help on data collection.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="ref1"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Abdelhamid</surname> <given-names>T. S.</given-names></name> <name><surname>Everett</surname> <given-names>J. G.</given-names></name></person-group> (<year>2000</year>). <article-title>Identifying root causes of construction accidents</article-title>. <source>J. Constr. Eng. Manag.</source> <volume>126</volume>, <fpage>52</fpage>&#x2013;<lpage>60</lpage>. doi: <pub-id pub-id-type="doi">10.1061/(ASCE)0733-9364(2000)126:1(52)</pub-id></citation></ref>
<ref id="ref2"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Al-Bayati</surname> <given-names>A. J.</given-names></name> <name><surname>Albert</surname> <given-names>A.</given-names></name> <name><surname>Ford</surname> <given-names>G.</given-names></name></person-group> (<year>2019</year>). <article-title>Construction safety culture and climate: satisfying necessity for an industry framework</article-title>. <source>Pract. Period. Struct. Des. Constr.</source> <volume>24</volume>:<fpage>04019028</fpage>. doi: <pub-id pub-id-type="doi">10.1061/(ASCE)SC.1943-5576.0000452</pub-id></citation></ref>
<ref id="ref3"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Alizadeh</surname> <given-names>S. S.</given-names></name> <name><surname>Mortazavi</surname> <given-names>S. B.</given-names></name> <name><surname>Sepehri</surname> <given-names>M. M.</given-names></name></person-group> (<year>2015</year>). <article-title>Assessment of accident severity in the construction industry using the Bayesian theorem</article-title>. <source>Int. J. Occup. Saf. Ergon.</source> <volume>21</volume>, <fpage>551</fpage>&#x2013;<lpage>557</lpage>. doi: <pub-id pub-id-type="doi">10.1080/10803548.2015.1095546</pub-id>, PMID: <pub-id pub-id-type="pmid">26694008</pub-id></citation></ref>
<ref id="ref4"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>S.</given-names></name> <name><surname>Zhang</surname> <given-names>L.</given-names></name> <name><surname>Jiang</surname> <given-names>F.</given-names></name> <name><surname>Chen</surname> <given-names>W.</given-names></name> <name><surname>Chen</surname> <given-names>H.</given-names></name></person-group> (<year>2020</year>). <article-title>Emotion recognition based on multiple physiological signals</article-title>. <source>Chin. J. Med. Instrum.</source> <volume>44</volume>, <fpage>283</fpage>&#x2013;<lpage>287</lpage>. doi: <pub-id pub-id-type="doi">10.3969/j.issn.1671-7104.2020.04.001</pub-id>, PMID: <pub-id pub-id-type="pmid">32762198</pub-id></citation></ref>
<ref id="ref001"><citation citation-type="web"><person-group person-group-type="author"><collab id="col101">China Construction Industry Association</collab></person-group> (<year>2020</year>). <article-title>Statistical analysis of the development of the construction industry in 2020 [EB/OL]</article-title>. Available at: <ext-link xlink:href="https://mp.weixin.qq.com/s/EAO_iMuFO_rJ7ii6MQO1kg" ext-link-type="uri">https://mp.weixin.qq.com/s/EAO_iMuFO_rJ7ii6MQO1kg</ext-link></citation></ref>
<ref id="ref002"><citation citation-type="web"><person-group person-group-type="author"><collab id="col102">China National Bureau of Statistics</collab></person-group> (<year>2020</year>). <article-title>Statistical Bulletin on National Economic and Social Development of the People&#x2019;s Republic of China 2020 [EB/OL]</article-title>. Available at: <ext-link xlink:href="http://www.stats.gov.cn/ztjc/zthd/lhfw/2021/lh_hgjj/202103/t20210301_1814216.html" ext-link-type="uri">http://www.stats.gov.cn/ztjc/zthd/lhfw/2021/lh_hgjj/202103/t20210301_1814216.html</ext-link></citation></ref>
<ref id="ref5"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dougherty</surname> <given-names>F. E.</given-names></name> <name><surname>Bartlett</surname> <given-names>E. S.</given-names></name> <name><surname>Izard</surname> <given-names>C. E.</given-names></name></person-group> (<year>1974</year>). <article-title>Responses of schizophrenics to expressions of the fundamental emotions</article-title>. <source>J. Clin. Psychol.</source> <volume>30</volume>, <fpage>243</fpage>&#x2013;<lpage>246</lpage>. doi: <pub-id pub-id-type="doi">10.1002/1097-4679(197407)30:3&#x003C;243::AID-JCLP2270300304&#x003E;3.0.CO;2-0</pub-id>, PMID: <pub-id pub-id-type="pmid">4605059</pub-id></citation></ref>
<ref id="ref6"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dzedzickis</surname> <given-names>A.</given-names></name> <name><surname>Kaklauskas</surname> <given-names>A.</given-names></name> <name><surname>Bucinskas</surname> <given-names>V.</given-names></name></person-group> (<year>2020</year>). <article-title>Human emotion recognition: review of sensors and methods</article-title>. <source>Sensors</source> <volume>20</volume>:<fpage>592</fpage>. doi: <pub-id pub-id-type="doi">10.3390/s20030592</pub-id></citation></ref>
<ref id="ref003"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ekman</surname> <given-names>P.</given-names></name></person-group> (<year>1992a</year>). <article-title>Facial expressions of emotion: an old controversy and new findings</article-title>. <source>Philos. Trans. R. Soc. Lond. B Biol. Sci.</source> <volume>335</volume>, <fpage>63</fpage>&#x2013;<lpage>69</lpage>. doi: <pub-id pub-id-type="doi">10.1098/rstb.1992.0008</pub-id></citation></ref>
<ref id="ref7"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ekman</surname> <given-names>P.</given-names></name></person-group> (<year>1992b</year>). <article-title>An argument for basic emotions</article-title>. <source>Cognit. Emot.</source> <volume>6</volume>, <fpage>169</fpage>&#x2013;<lpage>200</lpage>. doi: <pub-id pub-id-type="doi">10.1080/02699939208411068</pub-id></citation></ref>
<ref id="ref8"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Farmer</surname> <given-names>E.</given-names></name> <name><surname>Chambers</surname> <given-names>E. G.</given-names></name></person-group> (<year>1929</year>). <source>A Study of Personal Qualities in Accident Proneness and Proficiency.</source> <publisher-loc>London</publisher-loc>: <publisher-name>H.M. Stationery Off.</publisher-name></citation></ref>
<ref id="ref9"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ferguson</surname> <given-names>C. J.</given-names></name></person-group> (<year>2009</year>). <article-title>An effect size primer: a guide for clinicians and researchers</article-title>. <source>Prof. Psychol. Res. Pract.</source> <volume>40</volume>, <fpage>532</fpage>&#x2013;<lpage>538</lpage>. doi: <pub-id pub-id-type="doi">10.1037/a0015808</pub-id></citation></ref>
<ref id="ref10"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fredrickson</surname> <given-names>B. L.</given-names></name></person-group> (<year>1998</year>). <article-title>What good are positive emotions?</article-title> <source>Rev. Gen. Psychol.</source> <volume>2</volume>, <fpage>300</fpage>&#x2013;<lpage>319</lpage>. doi: <pub-id pub-id-type="doi">10.1037/1089-2680.2.3.300</pub-id>, PMID: <pub-id pub-id-type="pmid">21850154</pub-id></citation></ref>
<ref id="ref11"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fredrickson</surname> <given-names>B. L.</given-names></name> <name><surname>Branigan</surname> <given-names>C.</given-names></name></person-group> (<year>2005</year>). <article-title>Positive emotions broaden the scope of attention and thought-action repertoires</article-title>. <source>Cognit. Emot.</source> <volume>19</volume>, <fpage>313</fpage>&#x2013;<lpage>332</lpage>. doi: <pub-id pub-id-type="doi">10.1080/02699930441000238</pub-id>, PMID: <pub-id pub-id-type="pmid">21852891</pub-id></citation></ref>
<ref id="ref12"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gerdes</surname> <given-names>A.</given-names></name> <name><surname>Wieser</surname> <given-names>M. J.</given-names></name> <name><surname>Alpers</surname> <given-names>G. W.</given-names></name></person-group> (<year>2014</year>). <article-title>Emotional pictures and sounds: a review of multimodal interactions of emotion cues in multiple domains</article-title>. <source>Front. Psychol.</source> <volume>5</volume>:<fpage>1351</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2014.01351</pub-id></citation></ref>
<ref id="ref13"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Golparvar</surname> <given-names>M.</given-names></name></person-group> (<year>2016</year>). <article-title>Unconventional functions of deviant behaviors in the relationship between job stress and emotional exhaustion: three study findings</article-title>. <source>Curr. Psychol.</source> <volume>35</volume>, <fpage>269</fpage>&#x2013;<lpage>284</lpage>. doi: <pub-id pub-id-type="doi">10.1007/s12144-014-9292-8</pub-id></citation></ref>
<ref id="ref14"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Guti&#x00E9;rrez-Cobo</surname> <given-names>M. J.</given-names></name> <name><surname>Cabello</surname> <given-names>R.</given-names></name> <name><surname>Fern&#x00E1;ndez-Berrocal</surname> <given-names>P.</given-names></name></person-group> (<year>2017</year>). <article-title>The three models of emotional intelligence and performance in a hot and cool go/no-go task in undergraduate students</article-title>. <source>Front. Behav. Neurosci.</source> <volume>11</volume>:<fpage>33</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnbeh.2017.00033</pub-id>, PMID: <pub-id pub-id-type="pmid">28275343</pub-id></citation></ref>
<ref id="ref15"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hallowell</surname> <given-names>M.</given-names></name></person-group> (<year>2010</year>). <article-title>Safety risk perception in construction companies in the Pacific northwest of the USA</article-title>. <source>Constr. Manag. Econ.</source> <volume>28</volume>, <fpage>403</fpage>&#x2013;<lpage>413</lpage>. doi: <pub-id pub-id-type="doi">10.1080/01446191003587752</pub-id></citation></ref>
<ref id="ref16"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Haslam</surname> <given-names>R. A.</given-names></name> <name><surname>Hide</surname> <given-names>S. A.</given-names></name> <name><surname>Gibb</surname> <given-names>A. G. F.</given-names></name> <name><surname>Gyi</surname> <given-names>D. E.</given-names></name> <name><surname>Pavitt</surname> <given-names>T.</given-names></name> <name><surname>Atkinson</surname> <given-names>S.</given-names></name> <etal/></person-group>. (<year>2005</year>). <article-title>Contributing factors in construction accidents</article-title>. <source>Appl. Ergon.</source> <volume>36</volume>, <fpage>401</fpage>&#x2013;<lpage>415</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.apergo.2004.12.002</pub-id></citation></ref>
<ref id="ref17"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Heinrich</surname> <given-names>H. W.</given-names></name></person-group> (<year>1941</year>). <source>Industrial Accident Prevention: A Scientific Approach. 2nd Edn.</source></citation></ref>
<ref id="ref18"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Isen</surname> <given-names>A. M.</given-names></name> <name><surname>Patrick</surname> <given-names>R.</given-names></name></person-group> (<year>1983</year>). <article-title>The effect of positive feelings on risk taking: when the chips are down</article-title>. <source>Organ. Behav. Hum. Perform.</source> <volume>31</volume>, <fpage>194</fpage>&#x2013;<lpage>202</lpage>. doi: <pub-id pub-id-type="doi">10.1016/0030-5073(83)90120-4</pub-id></citation></ref>
<ref id="ref19"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Jeon</surname> <given-names>M.</given-names></name> <name><surname>Walker</surname> <given-names>B. N.</given-names></name> <name><surname>Yim</surname> <given-names>J.-B.</given-names></name></person-group> (<year>2014</year>). <article-title>Effects of specific emotions on subjective judgment, driving performance, and perceived workload</article-title>. <source>Transport. Res. F: Traffic Psychol. Behav.</source> <volume>24</volume>, <fpage>197</fpage>&#x2013;<lpage>209</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.trf.2014.04.003</pub-id></citation></ref>
<ref id="ref007"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Johnson</surname> <given-names>E. J.</given-names></name> <name><surname>Tversky</surname> <given-names>A.</given-names></name></person-group> (<year>1983</year>). <article-title>Affect, generalization, and the perception of risk</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>45</volume>, <fpage>20</fpage>&#x2013;<lpage>31</lpage>.</citation></ref>
<ref id="ref008"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kajiwara</surname> <given-names>Y.</given-names></name> <name><surname>Toshihiko</surname> <given-names>S.</given-names></name> <name><surname>Haruhiko</surname> <given-names>K.</given-names></name></person-group> (<year>2019</year>). <article-title>Predicting emotion and engagement of workers in order picking based on behavior and pulse waves acquired by wearable devices</article-title>. <source>Sensors</source> <volume>19</volume>:<fpage>165</fpage>. doi: <pub-id pub-id-type="doi">10.3390/s19010165</pub-id></citation></ref>
<ref id="ref21"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kappes</surname> <given-names>C.</given-names></name> <name><surname>Streubel</surname> <given-names>B.</given-names></name> <name><surname>Droste</surname> <given-names>K. L.</given-names></name> <name><surname>Folta-Schoofs</surname> <given-names>K.</given-names></name></person-group> (<year>2017</year>). <article-title>Linking the positivity effect in attention with affective outcomes: age group differences and the role of arousal</article-title>. <source>Front. Psychol.</source> <volume>8</volume>:<fpage>1877</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2017.01877</pub-id>, PMID: <pub-id pub-id-type="pmid">29163266</pub-id></citation></ref>
<ref id="ref22"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Keltner</surname> <given-names>D.</given-names></name> <name><surname>Lerner</surname> <given-names>J. S.</given-names></name></person-group> (<year>2010</year>). &#x201C;<article-title>Emotion,</article-title>&#x201D; in <source>Handbook of Social Psychology.</source> eds. <person-group person-group-type="editor"><name><surname>Fiske</surname> <given-names>S. T.</given-names></name> <name><surname>Gilbert</surname> <given-names>D. T.</given-names></name> <name><surname>Lindzey</surname> <given-names>G.</given-names></name></person-group> (<publisher-loc>Hoboken, NJ</publisher-loc>: <publisher-name>John Wiley &#x0026; Sons</publisher-name>), <fpage>317</fpage>&#x2013;<lpage>352</lpage>.</citation></ref>
<ref id="ref23"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kim</surname> <given-names>J.</given-names></name> <name><surname>Andr&#x00E9;</surname> <given-names>E.</given-names></name></person-group> (<year>2008</year>). <article-title>Emotion recognition based on physiological changes in music listening</article-title>. <source>IEEE Trans. Pattern Anal. Mach. Intell.</source> <volume>30</volume>, <fpage>2067</fpage>&#x2013;<lpage>2083</lpage>. doi: <pub-id pub-id-type="doi">10.1109/TPAMI.2008.26</pub-id>, PMID: <pub-id pub-id-type="pmid">18988943</pub-id></citation></ref>
<ref id="ref24"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kim</surname> <given-names>M.-K.</given-names></name> <name><surname>Kim</surname> <given-names>M.</given-names></name> <name><surname>Oh</surname> <given-names>E.</given-names></name> <name><surname>Kim</surname> <given-names>S.-P.</given-names></name></person-group> (<year>2013</year>). <article-title>A review on the computational methods for emotional state estimation from the human EEG</article-title>. <source>Comput. Math. Methods Med.</source> <volume>2013</volume>:<fpage>573734</fpage>, <fpage>1</fpage>&#x2013;<lpage>13</lpage>. doi: <pub-id pub-id-type="doi">10.1155/2013/573734</pub-id></citation></ref>
<ref id="ref25"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lang</surname> <given-names>P. J.</given-names></name></person-group> (<year>1995</year>). <article-title>The emotion probe: studies of motivation and attention</article-title>. <source>Am. Psychol.</source> <volume>50</volume>, <fpage>372</fpage>&#x2013;<lpage>385</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0003-066X.50.5.372</pub-id>, PMID: <pub-id pub-id-type="pmid">7762889</pub-id></citation></ref>
<ref id="ref26"><citation citation-type="book"><person-group person-group-type="author"><name><surname>Lang</surname> <given-names>P. J.</given-names></name> <name><surname>Bradley</surname> <given-names>M. M.</given-names></name> <name><surname>Cuthbert</surname> <given-names>B.</given-names></name></person-group> (<year>1988</year>). <source>The International Affective Picture System.</source> <publisher-loc>Gainesville, USA</publisher-loc>: <publisher-name>Center for Research in Psychophysiology, University of Florida</publisher-name>.</citation></ref>
<ref id="ref27"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lerner</surname> <given-names>J. S.</given-names></name> <name><surname>Tiedens</surname> <given-names>L. Z.</given-names></name></person-group> (<year>2006</year>). <article-title>Portrait of the angry decision maker: how appraisal tendencies shape anger&#x2019;s influence on cognition</article-title>. <source>J. Behav. Decis. Mak.</source> <volume>19</volume>, <fpage>115</fpage>&#x2013;<lpage>137</lpage>. doi: <pub-id pub-id-type="doi">10.1002/bdm.515</pub-id></citation></ref>
<ref id="ref28"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lingard</surname> <given-names>H.</given-names></name></person-group> (<year>2013</year>). <article-title>Occupational health and safety in the construction industry</article-title>. <source>Constr. Manag. Econ.</source> <volume>31</volume>, <fpage>505</fpage>&#x2013;<lpage>514</lpage>. doi: <pub-id pub-id-type="doi">10.1080/01446193.2013.816435</pub-id></citation></ref>
<ref id="ref29"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lischke</surname> <given-names>A.</given-names></name> <name><surname>Pahnke</surname> <given-names>R.</given-names></name> <name><surname>Mau-Moeller</surname> <given-names>A.</given-names></name> <name><surname>Jacksteit</surname> <given-names>R.</given-names></name> <name><surname>Weippert</surname> <given-names>M.</given-names></name></person-group> (<year>2020</year>). <article-title>Sex-specific relationships between interoceptive accuracy and Emotion regulation</article-title>. <source>Front. Behav. Neurosci.</source> <volume>14</volume>:<fpage>67</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnbeh.2020.00067</pub-id>, PMID: <pub-id pub-id-type="pmid">32655380</pub-id></citation></ref>
<ref id="ref30"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Liu</surname> <given-names>D.</given-names></name> <name><surname>Jin</surname> <given-names>Z. Y.</given-names></name> <name><surname>Gambatese</surname> <given-names>J. A.</given-names></name></person-group> (<year>2020</year>). <article-title>Scenarios for integrating IPS-IMU system with bim technology in construction safety control</article-title>. <source>Pract. Period. Struct. Des. Constr.</source> <volume>25</volume>:<fpage>05019007</fpage>. doi: <pub-id pub-id-type="doi">10.1061/(ASCE)SC.1943-5576.0000465</pub-id></citation></ref>
<ref id="ref31"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Loewenstein</surname> <given-names>G.</given-names></name> <name><surname>Lerner</surname> <given-names>J. S.</given-names></name></person-group> (<year>2003</year>). <article-title>The role of affect in decision making</article-title>. <source>Prog. Brain Res.</source> <volume>202</volume>, <fpage>37</fpage>&#x2013;<lpage>53</lpage>. doi: <pub-id pub-id-type="doi">10.1016/B978-0-444-62604-2.00003-4</pub-id></citation></ref>
<ref id="ref004"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Long</surname> <given-names>F. F. S. G.</given-names></name> <name><surname>Zhao</surname> <given-names>X.</given-names></name> <name><surname>Wei</surname> <given-names>S. C.</given-names></name> <name><surname>Ng</surname> <given-names>X. L.</given-names></name> <name><surname>Ni</surname> <given-names>A. P.</given-names></name> <name><surname>Chi</surname> <given-names>P.</given-names></name> <etal/></person-group>. (<year>2021</year>). <article-title>Positive and negative emotion classification based on multi-channel</article-title>. <source>Front. Behav. Neurosci.</source> <volume>15</volume>:<fpage>720451</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnbeh.2021.720451</pub-id></citation></ref>
<ref id="ref32"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>MacLeod</surname> <given-names>C.</given-names></name> <name><surname>Mathews</surname> <given-names>A.</given-names></name> <name><surname>Tata</surname> <given-names>P.</given-names></name></person-group> (<year>1986</year>). <article-title>Attentional bias in emotional disorders</article-title>. <source>J. Abnorm. Psychol.</source> <volume>95</volume>, <fpage>15</fpage>&#x2013;<lpage>20</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0021-843X.95.1.15</pub-id></citation></ref>
<ref id="ref33"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Maier</surname> <given-names>M. J.</given-names></name> <name><surname>Schiel</surname> <given-names>J. E.</given-names></name> <name><surname>Rosenbaum</surname> <given-names>D.</given-names></name> <name><surname>Hautzinger</surname> <given-names>M.</given-names></name> <name><surname>Fallgatter</surname> <given-names>A. J.</given-names></name> <name><surname>Ehlis</surname> <given-names>A.-C.</given-names></name></person-group> (<year>2021</year>). <article-title>To regulate or not to regulate: emotion regulation in participants with low and high impulsivity</article-title>. <source>Front. Behav. Neurosci.</source> <volume>15</volume>:<fpage>645052</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnbeh.2021.645052</pub-id>, PMID: <pub-id pub-id-type="pmid">34393732</pub-id></citation></ref>
<ref id="ref34"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Manzoor</surname> <given-names>A.</given-names></name> <name><surname>Treur</surname> <given-names>J.</given-names></name></person-group> (<year>2015</year>). <article-title>An agent-based model for integrated emotion regulation and contagion in socially affected decision making</article-title>. <source>Biol. Inspired Cognit. Archit.</source> <volume>12</volume>, <fpage>105</fpage>&#x2013;<lpage>120</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.bica.2015.04.005</pub-id></citation></ref>
<ref id="ref005"><citation citation-type="journal"><person-group person-group-type="author"><collab id="col103">Ministry of Housing and Urban&#x2013;Rural Development of China</collab></person-group> (<year>2019</year>). <article-title>Circular of the general office of the ministry of housing and urban-rural development on the production and safety accidents of housing and municipal engineering in 2019 [EB/OL]</article-title>. Available at: <ext-link xlink:href="http://www.mohurd.gov.cn/wjfb/202006/t20200624_246031.html" ext-link-type="uri">http://www.mohurd.gov.cn/wjfb/202006/t20200624_246031.html</ext-link></citation></ref>
<ref id="ref35"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Muller</surname> <given-names>A. R.</given-names></name> <name><surname>Pfarrer</surname> <given-names>M. D.</given-names></name> <name><surname>Little</surname> <given-names>L. M.</given-names></name></person-group> (<year>2014</year>). <article-title>A theory of collective empathy in corporate philanthropy decisions</article-title>. <source>Acad. Manag. Rev.</source> <volume>39</volume>, <fpage>1</fpage>&#x2013;<lpage>21</lpage>. doi: <pub-id pub-id-type="doi">10.5465/amr.2012.0031</pub-id></citation></ref>
<ref id="ref36"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Namian</surname> <given-names>M.</given-names></name> <name><surname>Zuluaga</surname> <given-names>C. M.</given-names></name> <name><surname>Albert</surname> <given-names>A.</given-names></name></person-group> (<year>2016</year>). &#x201C;<article-title>Critical factors that impact construction workers&#x2019; hazard recognition performance</article-title>.&#x201D; <source>Construction Research Congress 2016</source>, <fpage>2762</fpage>&#x2013;<lpage>2772</lpage>.</citation></ref>
<ref id="ref37"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pandit</surname> <given-names>B.</given-names></name> <name><surname>Albert</surname> <given-names>A.</given-names></name> <name><surname>Patil</surname> <given-names>Y.</given-names></name> <name><surname>Al-Bayati</surname> <given-names>A. J.</given-names></name></person-group> (<year>2019</year>). <article-title>Impact of safety climate on hazard recognition and safety risk perception</article-title>. <source>Saf. Sci.</source> <volume>113</volume>, <fpage>44</fpage>&#x2013;<lpage>53</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.ssci.2018.11.020</pub-id></citation></ref>
<ref id="ref38"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Perlman</surname> <given-names>A.</given-names></name> <name><surname>Sacks</surname> <given-names>R.</given-names></name> <name><surname>Barak</surname> <given-names>R.</given-names></name></person-group> (<year>2014</year>). <article-title>Hazard recognition and risk perception in construction</article-title>. <source>Saf. Sci.</source> <volume>64</volume>, <fpage>22</fpage>&#x2013;<lpage>31</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.ssci.2013.11.019</pub-id></citation></ref>
<ref id="ref39"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pittig</surname> <given-names>A.</given-names></name> <name><surname>Pawlikowski</surname> <given-names>M.</given-names></name> <name><surname>Craske</surname> <given-names>M. G.</given-names></name> <name><surname>Alpers</surname> <given-names>G. W.</given-names></name></person-group> (<year>2014</year>). <article-title>Avoidant decision making in social anxiety: the interaction of angry faces and emotional responses</article-title>. <source>Front. Psychol.</source> <volume>5</volume>:<fpage>1050</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2014.01050</pub-id></citation></ref>
<ref id="ref40"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pool</surname> <given-names>E.</given-names></name> <name><surname>Brosch</surname> <given-names>T.</given-names></name> <name><surname>Delplanque</surname> <given-names>S.</given-names></name> <name><surname>Sander</surname> <given-names>D.</given-names></name></person-group> (<year>2016</year>). <article-title>Attentional bias for positive emotional stimuli: a meta-analytic investigation</article-title>. <source>Psychol. Bull.</source> <volume>142</volume>, <fpage>79</fpage>&#x2013;<lpage>106</lpage>. doi: <pub-id pub-id-type="doi">10.1037/bul0000026</pub-id>, PMID: <pub-id pub-id-type="pmid">26390266</pub-id></citation></ref>
<ref id="ref20"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Rachlin</surname> <given-names>H.</given-names></name></person-group> (<year>2003</year>). <article-title>Bounded rationality: The adaptive toolbox</article-title>. <source>J. Exp. Anal. Behav.</source> <volume>79</volume>, <fpage>409</fpage>&#x2013;<lpage>412</lpage>. doi: <pub-id pub-id-type="doi">10.1901/jeab.2003.79-409</pub-id></citation></ref>
<ref id="ref41"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Radenhausen</surname> <given-names>R. A.</given-names></name> <name><surname>Anker</surname> <given-names>J. M.</given-names></name></person-group> (<year>1988</year>). <article-title>Effects of depressed mood induction on reasoning performance</article-title>. <source>Percept. Mot. Skills</source> <volume>66</volume>, <fpage>855</fpage>&#x2013;<lpage>860</lpage>. doi: <pub-id pub-id-type="doi">10.2466/pms.1988.66.3.855</pub-id>, PMID: <pub-id pub-id-type="pmid">3405709</pub-id></citation></ref>
<ref id="ref42"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Salminen</surname> <given-names>S.</given-names></name> <name><surname>Tallberg</surname> <given-names>T.</given-names></name></person-group> (<year>1996</year>). <article-title>Human errors in fatal and serious occupational accidents in Finland</article-title>. <source>Ergonomics</source> <volume>39</volume>, <fpage>980</fpage>&#x2013;<lpage>988</lpage>. doi: <pub-id pub-id-type="doi">10.1080/00140139608964518</pub-id>, PMID: <pub-id pub-id-type="pmid">8690011</pub-id></citation></ref>
<ref id="ref44"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Schwarz</surname> <given-names>N.</given-names></name> <name><surname>Clore</surname> <given-names>G. L.</given-names></name></person-group> (<year>1981</year>). <article-title>Mood, misattribution, and judgments of well-being: informative and directive functions of affective states</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>45</volume>, <fpage>513</fpage>&#x2013;<lpage>523</lpage>.</citation></ref>
<ref id="ref009"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Soleymani</surname> <given-names>M.</given-names></name> <name><surname>Pantic</surname> <given-names>M.</given-names></name> <name><surname>Pun</surname> <given-names>T.</given-names></name></person-group> (<year>2012</year>). <article-title>Multimodal emotion recognition in response to videos</article-title>. <source>IEEE Trans. Affect. Comput.</source> <volume>3</volume>, <fpage>211</fpage>&#x2013;<lpage>223</lpage>. doi: <pub-id pub-id-type="doi">10.1109/T-AFFC.2011.37</pub-id></citation></ref>
<ref id="ref45"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sugi</surname> <given-names>M.</given-names></name> <name><surname>Sakuraba</surname> <given-names>S.</given-names></name> <name><surname>Saito</surname> <given-names>H.</given-names></name> <name><surname>Miyazaki</surname> <given-names>M.</given-names></name> <name><surname>Yoshida</surname> <given-names>S.</given-names></name> <name><surname>Kamada</surname> <given-names>T.</given-names></name> <etal/></person-group>. (<year>2020</year>). <article-title>Personality traits modulate the impact of emotional stimuli during a working memory task: a near-infrared spectroscopy study</article-title>. <source>Front. Behav. Neurosci.</source> <volume>14</volume>:<fpage>514414</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnbeh.2020.514414</pub-id>, PMID: <pub-id pub-id-type="pmid">33093826</pub-id></citation></ref>
<ref id="ref010"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Takehara</surname> <given-names>H.</given-names></name> <name><surname>Shigekazu</surname> <given-names>I.</given-names></name> <name><surname>Tatsuya</surname> <given-names>I.</given-names></name></person-group> (<year>2020</year>). <article-title>Comparison between facilitating and suppressing facial emotional expressions using frontal EEG asymmetry</article-title>. <source>Front. Behav. Neurosci.</source> <volume>14</volume>:<fpage>554147</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fnbeh.2020.554147</pub-id></citation></ref>
<ref id="ref46"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tixier</surname> <given-names>A. J.-P.</given-names></name> <name><surname>Hallowell</surname> <given-names>M. R.</given-names></name> <name><surname>Albert</surname> <given-names>A.</given-names></name> <name><surname>van Boven</surname> <given-names>L.</given-names></name> <name><surname>Kleiner</surname> <given-names>B. M.</given-names></name></person-group> (<year>2014</year>). <article-title>Psychological antecedents of risk-taking behavior in construction</article-title>. <source>J. Constr. Eng. Manag.</source> <volume>140</volume>:<fpage>04014052</fpage>. doi: <pub-id pub-id-type="doi">10.1061/(ASCE)CO.1943-7862.0000894</pub-id></citation></ref>
<ref id="ref47"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Topolinski</surname> <given-names>S.</given-names></name> <name><surname>Strack</surname> <given-names>F.</given-names></name></person-group> (<year>2015</year>). <article-title>Corrugator activity confirms immediate negative affect in surprise</article-title>. <source>Front. Psychol.</source> <volume>6</volume>:<fpage>134</fpage>. doi: <pub-id pub-id-type="doi">10.3389/fpsyg.2015.00134</pub-id></citation></ref>
<ref id="ref48"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Watson</surname> <given-names>D.</given-names></name> <name><surname>Clark</surname> <given-names>L. A.</given-names></name> <name><surname>Tellegen</surname> <given-names>A.</given-names></name></person-group> (<year>1988</year>). <article-title>Development and validation of brief measures of positive and negative affect: the PANAS scales</article-title>. <source>J. Pers. Soc. Psychol.</source> <volume>54</volume>, <fpage>1063</fpage>&#x2013;<lpage>1070</lpage>. doi: <pub-id pub-id-type="doi">10.1037/0022-3514.54.6.1063</pub-id>, PMID: <pub-id pub-id-type="pmid">3397865</pub-id></citation></ref>
<ref id="ref49"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Weiss</surname> <given-names>H. M.</given-names></name> <name><surname>Cropanzano</surname> <given-names>R.</given-names></name></person-group> (<year>1996</year>). <article-title>Affective events theory: a theoretical discussion of the structure, causes and consequences of affective experiences at work</article-title>. <source>Res. Organ. Behav.</source> <volume>18</volume>, <fpage>1</fpage>&#x2013;<lpage>74</lpage>.</citation></ref>
<ref id="ref50"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yang</surname> <given-names>X.</given-names></name> <name><surname>Tian</surname> <given-names>Y.</given-names></name> <name><surname>Feng</surname> <given-names>K.</given-names></name> <name><surname>Yang</surname> <given-names>J.</given-names></name> <name><surname>Zhang</surname> <given-names>S.-h.</given-names></name> <name><surname>Wang</surname> <given-names>S.</given-names></name></person-group> (<year>2020</year>). <article-title>Signal game analysis on the effectiveness of coal mine safety supervision based on the affective events theory</article-title>. <source>Complexity</source> <volume>2020</volume>, <fpage>1</fpage>&#x2013;<lpage>9</lpage>. doi: <pub-id pub-id-type="doi">10.1155/2020/5710419</pub-id></citation></ref>
<ref id="ref51"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zelenski</surname> <given-names>J. M.</given-names></name> <name><surname>Santoro</surname> <given-names>M. S.</given-names></name> <name><surname>Whelan</surname> <given-names>D. C.</given-names></name></person-group> (<year>2012</year>). <article-title>Would introverts be better off if they acted more like extraverts? Exploring emotional and cognitive consequences of counterdispositional behavior</article-title>. <source>Emotion</source> <volume>12</volume>, <fpage>290</fpage>&#x2013;<lpage>303</lpage>. doi: <pub-id pub-id-type="doi">10.1037/a0025169</pub-id>, PMID: <pub-id pub-id-type="pmid">21859197</pub-id></citation></ref>
<ref id="ref52"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>X.</given-names></name> <name><surname>Chao</surname> <given-names>X.</given-names></name> <name><surname>Xue</surname> <given-names>W.</given-names></name> <name><surname>Jing</surname> <given-names>H.</given-names></name> <name><surname>He</surname> <given-names>Y.</given-names></name> <name><surname>Gao</surname> <given-names>M.</given-names></name></person-group> (<year>2018</year>). <article-title>Emotion recognition based on multichannel physiological signals with comprehensive nonlinear processing</article-title>. <source>Sensors</source> <volume>18</volume>:<fpage>3886</fpage>. doi: <pub-id pub-id-type="doi">10.3390/s18113886</pub-id>, PMID: <pub-id pub-id-type="pmid">30423894</pub-id></citation></ref>
<ref id="ref53"><citation citation-type="other"><person-group person-group-type="author"><name><surname>Zhao</surname> <given-names>B.</given-names></name> <name><surname>Wang</surname> <given-names>Z.</given-names></name> <name><surname>Yu</surname> <given-names>Z.</given-names></name> <name><surname>Guo</surname> <given-names>B.</given-names></name></person-group> (<year>2018</year>). &#x201C;<article-title>EmotionSense: Emotion Recognition Based on Wearable Wristband</article-title>.&#x201D; in <source>2018 IEEE SmartWorld, Ubiquitous Intelligence &#x0026; Computing, Advanced &#x0026; Trusted Computing, Scalable Computing &#x0026; Communications, Cloud &#x0026; Big Data Computing, Internet of People and Smart City Innovation.</source> October 8&#x2013;12, 2018.</citation></ref>
<ref id="ref54"><citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhou</surname> <given-names>Q.</given-names></name> <name><surname>Fang</surname> <given-names>D.</given-names></name> <name><surname>Wang</surname> <given-names>X.</given-names></name></person-group> (<year>2008</year>). <article-title>A method to identify strategies for the improvement of human safety behavior by considering safety climate and personal experience</article-title>. <source>Saf. Sci.</source> <volume>46</volume>, <fpage>1406</fpage>&#x2013;<lpage>1419</lpage>. doi: <pub-id pub-id-type="doi">10.1016/j.ssci.2007.10.005</pub-id></citation></ref>
</ref-list>
</back>
</article>