<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Neurosci.</journal-id>
<journal-title>Frontiers in Neuroscience</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Neurosci.</abbrev-journal-title>
<issn pub-type="epub">1662-453X</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fnins.2025.1651762</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Neuroscience</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>A novel fast detection algorithm for depression based on 3-channel EEG signals</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author" corresp="yes">
<name><surname>Guo</surname> <given-names>XiWu</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/3108858/overview"/>
<role content-type="https://credit.niso.org/contributor-roles/conceptualization/"/>
<role content-type="https://credit.niso.org/contributor-roles/data-curation/"/>
<role content-type="https://credit.niso.org/contributor-roles/funding-acquisition/"/>
<role content-type="https://credit.niso.org/contributor-roles/investigation/"/>
<role content-type="https://credit.niso.org/contributor-roles/methodology/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-original-draft/"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Guo</surname> <given-names>ZiHan</given-names></name>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/3206727/overview"/>
<role content-type="https://credit.niso.org/contributor-roles/data-curation/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-original-draft/"/>
<role content-type="https://credit.niso.org/contributor-roles/formal-analysis/"/>
<role content-type="https://credit.niso.org/contributor-roles/software/"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Xie</surname> <given-names>TaoLi</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<role content-type="https://credit.niso.org/contributor-roles/supervision/"/>
<role content-type="https://credit.niso.org/contributor-roles/validation/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-review-editing/"/>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of the People&#x00027;s Hospital of Taihe County, Fuyang</institution>, <addr-line>Anhui</addr-line>, <country>China</country></aff>
<aff id="aff2"><sup>2</sup><institution>Medical College, Tianshi College</institution>, <addr-line>Tianjin</addr-line>, <country>China</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Xin Shi, Chongqing University, China</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Zechen Li, Chengdu University of Information Technology, China</p>
<p>Hang Yu, University of Electronic Science and Technology of China, China</p></fn>
<corresp id="c001">&#x0002A;Correspondence: XiWu Guo <email>19165586971&#x00040;163.com</email></corresp>
</author-notes>
<pub-date pub-type="epub">
<day>29</day>
<month>09</month>
<year>2025</year>
</pub-date>
<pub-date pub-type="collection">
<year>2025</year>
</pub-date>
<volume>19</volume>
<elocation-id>1651762</elocation-id>
<history>
<date date-type="received">
<day>22</day>
<month>06</month>
<year>2025</year>
</date>
<date date-type="accepted">
<day>04</day>
<month>09</month>
<year>2025</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2025 Guo, Guo and Xie.</copyright-statement>
<copyright-year>2025</copyright-year>
<copyright-holder>Guo, Guo and Xie</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license>
</permissions>
<abstract>
<p>Medically unexplained symptoms (MUS) are an emerging field in current research. Among middle-aged and elderly patients, MUS are mainly caused by depression, but the early symptoms do not meet international somatization criteria, which delays treatment. Therefore, developing a rapid auxiliary diagnosis method is of great significance. This paper proposes a novel model for identifying depression based on 3-channel electroencephalogram (EEG) signals from the prefrontal lobe. For the collected resting-state EEG signals, variational mode decomposition (VMD) is first used for signal decomposition, and the power spectrum is employed to select intrinsic mode function (IMF) components. After energy features are extracted via sample entropy, LightGBM is adopted for classification, achieving a classification accuracy of 97.42%. Comparative experiments show that the proposed model balances high accuracy with timeliness. This supports the development of a depression detection system based on portable real-time EEG devices, and provides a solution for real-time depression detection and pre-triage of MUS patients.</p></abstract>
<kwd-group>
<kwd>medically unexplained symptoms</kwd>
<kwd>EEG signals</kwd>
<kwd>depression</kwd>
<kwd>LightGBM</kwd>
<kwd>VMD</kwd>
</kwd-group>
<counts>
<fig-count count="10"/>
<table-count count="4"/>
<equation-count count="19"/>
<ref-count count="50"/>
<page-count count="15"/>
<word-count count="8824"/>
</counts>
<custom-meta-wrap>
<custom-meta>
<meta-name>section-at-acceptance</meta-name>
<meta-value>Translational Neuroscience</meta-value>
</custom-meta>
</custom-meta-wrap>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="s1">
<title>1 Introduction</title>
<p>General undifferentiated symptoms such as pain, fatigue, and gastrointestinal and cardiovascular complaints are known as medically unexplained symptoms (MUS), and they are very common among elderly people and patients in healthcare settings (<xref ref-type="bibr" rid="B32">Leiknes et al., 2007</xref>; <xref ref-type="bibr" rid="B30">Kurita et al., 2012</xref>; <xref ref-type="bibr" rid="B45">Steinbrecher et al., 2011</xref>; <xref ref-type="bibr" rid="B11">Claassen-van Dessel et al., 2018</xref>). These symptoms are generally harmless, but many recent studies have shown that mental illnesses such as depression often present with medically unexplained symptoms, which may compromise treatment outcomes (<xref ref-type="bibr" rid="B28">Hung et al., 2019</xref>; <xref ref-type="bibr" rid="B27">Huijbregts et al., 2010</xref>; <xref ref-type="bibr" rid="B43">Simon et al., 1999</xref>; <xref ref-type="bibr" rid="B36">Mergl et al., 2007</xref>; <xref ref-type="bibr" rid="B5">Barsky et al., 2005</xref>; <xref ref-type="bibr" rid="B25">Harris et al., 2009</xref>).</p>
<p>Somatic symptom disorder is one of the most common mental disorders, with an incidence of &#x0007E;6% in the general population, particularly among retired or widowed elderly people (<xref ref-type="bibr" rid="B47">Wittchen et al., 2011</xref>). According to the diagnostic definition of somatization disorder in the International Classification of Diseases, 10th Edition (ICD-10) (<xref ref-type="bibr" rid="B16">DiSantostefano, 2009</xref>), a diagnosis requires at least six medically unexplained somatic symptoms in two different organ systems, persisting for more than 2 years. However, the prevalence of somatization disorder so defined is low, accounting for only 0.4% of the general population (<xref ref-type="bibr" rid="B12">Creed and Barsky, 2004</xref>; <xref ref-type="bibr" rid="B15">de Waal et al., 2004</xref>; <xref ref-type="bibr" rid="B22">Fink et al., 2004</xref>). Because of this low incidence, many patients with medically unexplained symptoms are overlooked by hospitals. Meanwhile, studies have shown that many patients with medically unexplained symptoms have some degree of physical impairment but do not meet the strict criteria for somatic symptom disorder, and thus fail to receive appropriate treatment (<xref ref-type="bibr" rid="B34">Mayou et al., 2005</xref>; <xref ref-type="bibr" rid="B24">Gureje and Reed, 2016</xref>). Additionally, attributing fatigue, edema, and unexplained pain to a single condition is challenging for many physicians, mainly because they cannot make quantitative judgments from patients&#x00027; descriptions (<xref ref-type="bibr" rid="B31">Leiknes et al., 2006</xref>; <xref ref-type="bibr" rid="B35">McFarlane et al., 2008</xref>; <xref ref-type="bibr" rid="B42">Sharpe et al., 2006</xref>), leading to significant differences in diagnostic opinions among physicians.</p>
<p>Most patients with medically unexplained symptoms have mental illnesses such as depression. Patients with mild depression, who are in a long-term state of low mood, have relatively mild symptoms that show a certain degree of somatization. Because these patients have no other obvious symptoms, they are classified as having medically unexplained symptoms. However, since the etiology cannot be confirmed, such patients may fail to receive proper treatment and may progress to severe depression and even suicidal ideation. We statistically analyzed the medical records of middle-aged and elderly patients admitted to our hospital over the past 5 years; as shown in <xref ref-type="fig" rid="F1">Figure 1</xref>, a total of 198 middle-aged and elderly patients with medically unexplained symptoms (MUS) were admitted, initially presenting with MUS symptoms such as fatigue. Long-term follow-up of the final confirmed diagnoses showed that 22.7% of these patients had somatization disorder, 21.2% had grade 3 hypertension, 19.2% had some degree of sleep disorder, 14.1% had hyperlipidemia, and 13.1% had an anxiety state or anxiety disorder. From this analysis, 35.8% of the patients initially presented with symptoms such as fatigue of unknown cause, which are likely early manifestations of somatization caused by anxiety and depression. However, according to quantitative criteria for depression, these patients did not fall into the depression category at the early stage, which can easily delay treatment. Currently, the most conventional detection method for depression is the psychological questionnaire, but it is subjective, and because patients may conceal information or resist such questionnaires, accurate diagnostic results are often difficult to obtain (<xref ref-type="bibr" rid="B46">Wang et al., 2023</xref>). Therefore, in recent years, electroencephalogram (EEG) signals have become a research hotspot as an auxiliary diagnostic tool for mental illnesses.</p>
<fig position="float" id="F1">
<label>Figure 1</label>
<caption><p>Distribution map of diagnostic results in 198 MUS patients.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-19-1651762-g0001.tif">
<alt-text>Pie chart showing the prevalence of various disorders. Somatization Disorder is 22.7% (red), Hypertension, Grade 3 (blue) is 21.2%, Sleep Disorder/Insomnia (green) is 19.2%, Hyperlipidemia (purple) is 14.1%, Anxiety State/Anxiety Disorder (orange) is 13.1%, and Others (teal) is 9.7%.</alt-text>
</graphic>
</fig>
<p>In recent years, significant advancements in neuroimaging techniques, such as positron emission tomography (PET), magnetic resonance imaging (MRI), and electroencephalography (EEG), have enabled noninvasive studies of brain functions and related disorders (<xref ref-type="bibr" rid="B14">De la Salle et al., 2016</xref>). However, PET and MRI equipment is prohibitively expensive and requires specialized personnel for operation and interpretation (<xref ref-type="bibr" rid="B18">Ehman et al., 2017</xref>), and PET additionally involves radioactive tracers, which increases safety risks and costs (<xref ref-type="bibr" rid="B48">Zhang et al., 2024</xref>). Owing to these limitations, EEG has become a common technique for depression detection thanks to its non-invasiveness and ease of operation (<xref ref-type="bibr" rid="B29">Klooster et al., 2023</xref>). The EEG activities in the &#x003B4;, &#x003B8;, &#x003B1;, and &#x003B2; frequency bands of patients with depression are usually higher than those of normal controls, and the &#x003B1; and &#x003B2; bands contain more depression-related EEG information than the low-frequency &#x003B4; and &#x003B8; bands (<xref ref-type="bibr" rid="B26">Hasanzadeh et al., 2020</xref>). Many scholars have also studied the effects of drugs, environment, religious beliefs, etc., on the brain waves of depressed patients (<xref ref-type="bibr" rid="B6">Berger et al., 2021</xref>; <xref ref-type="bibr" rid="B1">Akbari et al., 2021</xref>; <xref ref-type="bibr" rid="B50">Zuchowicz et al., 2019</xref>; <xref ref-type="bibr" rid="B9">Cao et al., 2019</xref>; <xref ref-type="bibr" rid="B23">Grieve et al., 2019</xref>; <xref ref-type="bibr" rid="B40">Panier et al., 2020</xref>; <xref ref-type="bibr" rid="B21">Feldmann et al., 2018</xref>; <xref ref-type="bibr" rid="B4">Bachmann et al., 2018</xref>). 
<xref ref-type="bibr" rid="B37">Mohammadi and Moradi (2021)</xref> found through detecting four EEG bands that the Alpha band is closely related to the severity of depression. <xref ref-type="bibr" rid="B39">Nusslock et al. (2018)</xref> demonstrated the influence of prefrontal EEG asymmetry on the EEG diagnosis of depression. <xref ref-type="bibr" rid="B49">Zhu et al. (2019)</xref> achieved multimodal depression diagnosis by combining EEG and eye movement. These studies focus on the selection of recording locations and EEG frequency bands, and many researchers have also made corresponding contributions in feature extraction.</p>
<p>The identification of depression from EEG signals mainly involves two aspects: feature extraction and classification models. Feature extraction methods are broadly divided into time-domain and frequency-domain approaches. Time-domain methods include techniques such as multiscale principal component analysis, intrinsic time-scale decomposition, linear discriminant analysis, and adjacent component analysis, which are used to analyze EEG time series and extract time-frequency features (<xref ref-type="bibr" rid="B33">Malviya and Mal, 2023</xref>). In the frequency domain, Zhang et al. proposed a model combining Wavelet Packet Decomposition (WPD) and Variational Mode Decomposition (VMD) to extract frequency-domain features from EEG signals (<xref ref-type="bibr" rid="B48">Zhang et al., 2024</xref>). <xref ref-type="bibr" rid="B2">Alhalaseh and Alasasfeh (2020)</xref> applied empirical mode decomposition (EMD) and VMD filters to clean EEG signals and classified emotions from them using entropy and Higuchi&#x00027;s fractal dimension as features. Many scholars have also contributed on the classification side. <xref ref-type="bibr" rid="B19">El-Dahshan et al. (2024)</xref> utilized recurrence plots to obtain deep features from PPV signals and demonstrated that recurrence plots can effectively identify periodicity in signals. <xref ref-type="bibr" rid="B44">Siuly et al. (2024)</xref> employed the wavelet scattering transform (WST) to extract time-frequency features of EEG signals, demonstrating the superiority of time-frequency features in EEG analysis. <xref ref-type="bibr" rid="B7">Cai et al. (2018)</xref> distinguished depression patients from normal controls by fusing different EEG data sources; the KNN classifier achieved its highest classification accuracy of 86.98% after fusing multi-source data. <xref ref-type="bibr" rid="B20">Fan et al. (2020)</xref> used high-density 128-channel EEG and a convolution-based long short-term memory network to diagnose depression, and the proposed model reached an accuracy of 83.47%. <xref ref-type="bibr" rid="B3">Aydemir et al. (2021)</xref> extracted features from the EEG signals of patients with depression using wavelet and melamine patterns, and used KNN and SVM classifiers to obtain high automatic recognition accuracy. <xref ref-type="bibr" rid="B8">Cai et al. (2020)</xref> proposed a new autism EEG signal conversion method that combines local binary patterns with the short-time Fourier transform to generate spectral features and trains a lightweight neural network; the resulting model can aid in the diagnosis of autism. Most traditional methods use 128-channel electrode caps to collect signals from as many channels as possible, which leads to huge computational complexity and is not conducive to real-time monitoring of patients with depression. In recent years, many researchers have instead focused on the prefrontal region, selecting the FP1, FP2, and FPZ signals as sources. Although some success has been achieved, limitations remain: with fewer channels, prefrontal data are more susceptible to fluctuations, and EEG signals inherently contain a large number of redundant features. Therefore, removing as many redundant EEG features as possible while maintaining high real-time performance remains a highly challenging problem.</p>
<p>Therefore, the main innovations of this paper are as follows:</p>
<list list-type="simple">
<list-item><p>1) A sample entropy feature is proposed to describe the difference between the EEG signals of depression patients and those of normal individuals. As an energy feature, entropy can effectively characterize complexity changes in EEG signals.</p></list-item>
<list-item><p>2) A redundant signal elimination strategy combining Variational Mode Decomposition (VMD) and power spectrum analysis is proposed. By decomposing EEG signals via VMD and selecting Intrinsic Mode Functions (IMFs) through power spectrum analysis, this approach eliminates redundant features in EEG signals. Combined with the LightGBM classification model, it achieves high accuracy and provides a feasible scheme for real-time monitoring of depression patients.</p></list-item>
</list>
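The screening strategy in point 2 can be illustrated with a minimal numpy sketch (not the authors' code): it assumes the IMFs have already been produced by a VMD routine, and the retention criterion used here, keeping components whose power spectrum peaks inside the 8–30 Hz α/β range, is an illustrative assumption.

```python
import numpy as np

def select_imfs(imfs, fs, band=(8.0, 30.0)):
    """Keep the IMFs whose power spectrum peaks inside `band` (Hz).

    `imfs` has shape (n_imfs, n_samples), e.g. the output of a VMD
    routine; the default band covers the alpha/beta range, which the
    literature links most strongly to depression-related information.
    """
    freqs = np.fft.rfftfreq(imfs.shape[1], d=1.0 / fs)
    keep = []
    for i, imf in enumerate(imfs):
        power = np.abs(np.fft.rfft(imf)) ** 2  # periodogram of this IMF
        peak = freqs[np.argmax(power)]         # dominant frequency in Hz
        if band[0] <= peak <= band[1]:
            keep.append(i)
    return keep

# Illustration with synthetic narrow-band "IMFs" at 2, 10, and 45 Hz:
fs = 250.0
t = np.arange(0, 4, 1 / fs)
imfs = np.vstack([np.sin(2 * np.pi * f * t) for f in (2.0, 10.0, 45.0)])
selected = select_imfs(imfs, fs)  # only the 10 Hz component survives
```

The retained components would then feed the downstream stages of the pipeline, i.e., sample entropy feature extraction and LightGBM classification.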
</sec>
<sec sec-type="materials and methods" id="s2">
<title>2 Materials and methods</title>
<sec>
<title>2.1 Data description</title>
<p>This experiment utilized a public dataset, the MODMA dataset (<xref ref-type="bibr" rid="B37">Mohammadi and Moradi, 2021</xref>), which was established by the Second Hospital of Lanzhou University. The dataset consists of 55 participants: 26 outpatients diagnosed with major depressive disorder (MDD) (15 males and 11 females; aged 16&#x02013;56 years) and 29 healthy controls (19 males and 10 females; aged 18&#x02013;55 years). All MDD patients underwent a structured Mini-International Neuropsychiatric Interview (MINI) and met the diagnostic criteria for major depression of the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition (DSM-IV). The dataset adopts a three-lead prefrontal EEG experimental protocol: following the international 10&#x02013;20 system electrode placement standard, three positioning points are selected on the forehead for electrode placement, with their positions shown in <xref ref-type="fig" rid="F2">Figure 2</xref>. All subjects completed the Mini-Mental State Examination (MMSE) with the assistance of professional psychologists as a preliminary screening for depressive tendencies. Participants at high risk of depression additionally completed the Patient Health Questionnaire-9 (PHQ-9) to assess depression severity, and all basic information was collected. Candidates were judged to meet the experimental requirements based on the self-rating scale data and the inclusion criteria. Eligible subjects completed head cleaning under staff guidance and then wore the detection equipment in a standard experimental environment. It should be noted that no subject had taken any psychotropic drugs within the 2 weeks before the experiment or had other mental illnesses or organic brain injuries (such as epilepsy), and female subjects with depression confirmed that they were not pregnant. Also excluded were lactating women, those taking contraceptives, individuals with a history of alcohol or psychotropic drug abuse/dependence within the past year, and individuals who had suffered abuse.</p>
<fig position="float" id="F2">
<label>Figure 2</label>
<caption><p>Location of frontal lobe EEG signal acquisition (<xref ref-type="bibr" rid="B37">Mohammadi and Moradi, 2021</xref>).</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-19-1651762-g0002.tif">
<alt-text>Diagram showing an EEG setup. On the left, a circular head model marks the nasion, inion, and electrode positions Fp1, Fp2, and Fpz. On the right, a schematic head illustration connects these electrodes to an EEG signal collector and a ground.</alt-text>
</graphic>
</fig>
</sec>
<sec>
<title>2.2 Sample entropy</title>
<p>Sample Entropy (SE) is a method for measuring the complexity of time series, first proposed in 2000 (<xref ref-type="bibr" rid="B41">Richman, 2011</xref>). According to its principle and formulas, a higher entropy value indicates greater complexity of a time series; conversely, a lower entropy value implies higher autocorrelation. In recent years, entropy has emerged as a novel method for evaluating the complexity and irregularity of the EEG signals of individuals with depressive disorder. Increased EEG signal entropy has been observed in patients with depression, indicating enhanced complexity and reduced predictability of brain activity (<xref ref-type="bibr" rid="B10">Chen et al., 2020</xref>; <xref ref-type="bibr" rid="B13">&#x0010C;uki&#x00107; et al., 2020</xref>). The integration of this information-theoretic approach is regarded as a promising method for the assessment and monitoring of clinical depression (<xref ref-type="bibr" rid="B38">Murphy et al., 2020</xref>).</p>
<list list-type="simple">
<list-item><p>1) A set of <italic>k</italic>-dimensional vectors <bold>X</bold>(<italic>q</italic>) &#x0003D; {<italic>x</italic>(<italic>q</italic>), <italic>x</italic>(<italic>q</italic> &#x0002B; 1), ..., <italic>x</italic>(<italic>q</italic> &#x0002B; <italic>k</italic> &#x02212; 1)} is constructed in order from a time series with <italic>Q</italic> data points, where <italic>q</italic> &#x0003D; 1, 2, ..., <italic>Q</italic> &#x02212; <italic>k</italic> &#x0002B; 1.</p></list-item>
<list-item><p>2) Calculate the distance <italic>d</italic><sub><italic>ij</italic></sub> &#x0003D; max[|<italic>x</italic>(<italic>i</italic> &#x0002B; <italic>g</italic>) &#x02212; <italic>x</italic>(<italic>j</italic> &#x0002B; <italic>g</italic>)|] between the <italic>k</italic>-dimensional vector <bold>X</bold>(<italic>i</italic>) and the other vectors <bold>X</bold>(<italic>j</italic>), where <italic>j</italic> &#x0003D; 1, 2, ..., <italic>Q</italic> &#x02212; <italic>k</italic> &#x0002B; 1, <italic>g</italic> &#x0003D; 0, 1, ..., <italic>k</italic> &#x02212; 1, <italic>i</italic> &#x02260; <italic>j</italic>.</p></list-item>
<list-item><p>3) For each vector <bold>X</bold>(<italic>i</italic>), define the number of distances <italic>d</italic><sub><italic>ij</italic></sub> &#x02264; <italic>r</italic> (<italic>r</italic> &#x0003E; 0) as <italic>B</italic><sub><italic>i</italic></sub>. The probability of matching <italic>k</italic> points is <inline-formula><mml:math id="M1"><mml:msubsup><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msubsup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>r</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:math></inline-formula>, whose mean is <italic>B</italic><sup><italic>k</italic></sup>(<italic>r</italic>), and the formula is:
<disp-formula id="E1"><label>(1)</label><mml:math id="M2"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:msubsup><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msubsup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>r</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>/</mml:mo><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>Q</mml:mi><mml:mo>-</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<disp-formula id="E2"><label>(2)</label><mml:math id="M3"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:msup><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>r</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>Q</mml:mi><mml:mo>-</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:mfrac><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>Q</mml:mi><mml:mo>-</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:munderover></mml:mstyle><mml:msubsup><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msubsup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>r</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula></p></list-item>
<list-item><p>4) Increase the dimension <italic>k</italic> by 1, and repeat steps 1&#x02013;3 to obtain <italic>B</italic><sup><italic>k</italic>&#x0002B;1</sup>(<italic>r</italic>). The estimated value of sample entropy is:
<disp-formula id="E3"><label>(3)</label><mml:math id="M4"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mtext class="textrm" mathvariant="normal">SampEn</mml:mtext><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>r</mml:mi><mml:mo>,</mml:mo><mml:mi>Q</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mo>-</mml:mo><mml:mo class="qopname">ln</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:msup><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>r</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>/</mml:mo><mml:msup><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>r</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula></p></list-item>
</list>
<p>In the formulas, <italic>r</italic> is the similarity tolerance.</p>
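The four steps above can be condensed into a short numpy implementation (an illustrative sketch, not the authors' code; following common practice, the tolerance r defaults to 0.2 times the standard deviation of the series):

```python
import numpy as np

def sample_entropy(x, k=2, r=None):
    """Estimate SampEn(k, r, Q) of a 1-D series following steps 1-4."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)  # common heuristic for the similarity tolerance

    def mean_match(m):
        # Step 1: build all m-dimensional template vectors X(q).
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])
        # Step 2: Chebyshev distance between every pair of vectors.
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        # Step 3: count matches d_ij <= r, excluding self-matches (i != j).
        matches = (d <= r).sum(axis=1) - 1
        return matches.mean() / (n - 1)

    # Step 4: repeat with dimension k + 1 and take the negative log-ratio.
    return -np.log(mean_match(k + 1) / mean_match(k))
```

On a smooth periodic series the estimate is small, while white noise yields a much larger value, consistent with the interpretation that a higher entropy value indicates greater complexity.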
</sec>
<sec>
<title>2.3 Variational mode decomposition</title>
<p>Variational Mode Decomposition (<xref ref-type="bibr" rid="B17">Dragomiretskiy and Zosso, 2013</xref>) is a novel adaptive signal decomposition technique. It is a non-recursive method that decomposes a multi-component signal into an ensemble of band-limited intrinsic mode functions (IMFs), also known as modes or components, with specific sparsity properties.</p>
<p>The key advantages of VMD include its ability to adaptively decompose non-stationary and non-linear signals, its robustness to noise and sampling, and its capability to handle different types of signals, including those with closely spaced frequency components. VMD has found successful applications in various fields, such as biomedical signal processing, fault diagnosis, and financial time series analysis. The main steps of VMD are as follows:</p>
<list list-type="simple">
<list-item><p>1) The original signal <italic>x</italic>(<italic>t</italic>) is expressed as the sum of <italic>K</italic> mode functions:
<disp-formula id="E4"><label>(4)</label><mml:math id="M5"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>x</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>K</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msub><mml:mrow><mml:mi>u</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula></p></list-item>
<list-item><p>2) For each mode function <italic>u</italic><sub><italic>k</italic></sub>(<italic>t</italic>), the analytic signal and its single-sided spectrum are obtained via the Hilbert transform:
<disp-formula id="E5"><label>(5)</label><mml:math id="M6"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mrow><mml:mo stretchy="false">[</mml:mo><mml:mrow><mml:mi>&#x003B4;</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>&#x003C0;</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac></mml:mrow><mml:mo stretchy="false">]</mml:mo></mml:mrow><mml:mo>&#x02217;</mml:mo><mml:msub><mml:mrow><mml:mi>u</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula></p>
<p>where &#x003B4;(<italic>t</italic>) is the Dirac delta function and <italic>K</italic> is the number of modes to be decomposed.</p></list-item>
<list-item><p>3) For each mode function <italic>u</italic><sub><italic>k</italic></sub>(<italic>t</italic>), the spectrum of each mode is shifted to its baseband by mixing it with the exponential term <inline-formula><mml:math id="M7"><mml:msup><mml:mrow><mml:mi>e</mml:mi></mml:mrow><mml:mrow><mml:mo>-</mml:mo><mml:mi>j</mml:mi><mml:msub><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mi>t</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula> tuned to the corresponding center frequency &#x003C9;<sub><italic>k</italic></sub>:
<disp-formula id="E6"><label>(6)</label><mml:math id="M8"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mrow><mml:mo stretchy="false">{</mml:mo><mml:mrow><mml:mrow><mml:mo stretchy="false">[</mml:mo><mml:mrow><mml:mi>&#x003B4;</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>&#x003C0;</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac></mml:mrow><mml:mo stretchy="false">]</mml:mo></mml:mrow><mml:msub><mml:mrow><mml:mi>u</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo stretchy="false">}</mml:mo></mml:mrow><mml:msup><mml:mrow><mml:mi>e</mml:mi></mml:mrow><mml:mrow><mml:mo>-</mml:mo><mml:mi>j</mml:mi><mml:mi>&#x003C9;</mml:mi><mml:mi>k</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:msup></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula></p></list-item>
<list-item><p>4) The bandwidth of each mode signal is estimated using the Gaussian smoothing method, which solves the variational problem under constraints. The objective function is:
<disp-formula id="E7"><label>(7)</label><mml:math id="M9"><mml:mrow><mml:mtable><mml:mtr><mml:mtd><mml:mrow><mml:mi>m</mml:mi><mml:mi>i</mml:mi><mml:msub><mml:mi>n</mml:mi><mml:mrow><mml:mrow><mml:mo stretchy="false">{</mml:mo><mml:mrow><mml:msub><mml:mi>u</mml:mi><mml:mi>k</mml:mi></mml:msub></mml:mrow><mml:mo stretchy="false">}</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mrow><mml:mo stretchy="false">{</mml:mo><mml:mrow><mml:msub><mml:mi>&#x003C9;</mml:mi><mml:mi>k</mml:mi></mml:msub></mml:mrow><mml:mo stretchy="false">}</mml:mo></mml:mrow></mml:mrow></mml:msub><mml:mo stretchy="false">&#x0007B;</mml:mo><mml:mstyle displaystyle='true'><mml:mo>&#x02211;</mml:mo><mml:mi>k</mml:mi></mml:mstyle><mml:msup><mml:mrow><mml:mrow><mml:mo>&#x02016;</mml:mo><mml:mrow><mml:msub><mml:mo>&#x02202;</mml:mo><mml:mi>t</mml:mi></mml:msub><mml:mo stretchy="false">&#x0007B;</mml:mo><mml:mo stretchy='false'>[</mml:mo><mml:mi>&#x003B4;</mml:mi><mml:mo stretchy='false'>(</mml:mo><mml:mtext>t</mml:mtext><mml:mo stretchy='false'>)</mml:mo><mml:mtext>+</mml:mtext><mml:mfrac><mml:mi>j</mml:mi><mml:mrow><mml:mi>&#x003C0;</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac><mml:mo stretchy='false'>]</mml:mo><mml:msub><mml:mi>u</mml:mi><mml:mi>k</mml:mi></mml:msub><mml:mo stretchy='false'>(</mml:mo><mml:mi>t</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo stretchy="false">&#x0007D;</mml:mo><mml:mo>&#x000B7;</mml:mo><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:mo>&#x02212;</mml:mo><mml:mi>j</mml:mi><mml:msub><mml:mi>&#x003C9;</mml:mi><mml:mi>k</mml:mi></mml:msub><mml:mi>t</mml:mi></mml:mrow></mml:msup></mml:mrow><mml:mo>&#x02016;</mml:mo></mml:mrow></mml:mrow><mml:mn>2</mml:mn></mml:msup></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mi>s</mml:mi><mml:mo>.</mml:mo><mml:mi>t</mml:mi><mml:mo>.</mml:mo><mml:mstyle
displaystyle='true'><mml:munder><mml:mo>&#x02211;</mml:mo><mml:mi>k</mml:mi></mml:munder><mml:mrow><mml:msub><mml:mi>u</mml:mi><mml:mi>k</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:mi>f</mml:mi></mml:mrow></mml:mstyle></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:mrow></mml:math></disp-formula></p>
<p>Where {<italic>u</italic><sub><italic>k</italic></sub>} &#x0003D; {<italic>u</italic><sub>1</sub>, &#x02026;, <italic>u</italic><sub><italic>K</italic></sub>}, {&#x003C9;<sub><italic>k</italic></sub>} &#x0003D; {&#x003C9;<sub>1</sub>, &#x02026;, &#x003C9;<sub><italic>K</italic></sub>}, &#x02202;<sub><italic>t</italic></sub> denotes the partial derivative with respect to <italic>t</italic>, and <italic>f</italic> is the original signal before decomposition.</p></list-item>
<list-item><p>5) The above constrained variational problem is solved as follows: a quadratic penalty factor &#x003B1; and a Lagrange multiplier &#x003BB;(<italic>t</italic>) are introduced into <xref ref-type="disp-formula" rid="E7">Equation 7</xref> to convert the constrained variational problem into an unconstrained one. The penalty factor &#x003B1; guarantees the reconstruction accuracy of the signal, while &#x003BB;(<italic>t</italic>) enforces the constraint strictly. The resulting augmented Lagrangian expression is:
<disp-formula id="E8"><label>(8)</label><mml:math id="M10"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>L</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mrow><mml:mo stretchy="false">{</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>u</mml:mi></mml:mrow><mml:mrow><mml:mtext class="textrm" mathvariant="normal">k</mml:mtext></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">}</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mrow><mml:mo stretchy="false">{</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">}</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mi>&#x003BB;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mi>&#x003B1;</mml:mi><mml:mstyle displaystyle="true"><mml:munder class="msub"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:munder></mml:mstyle><mml:mstyle displaystyle="true"><mml:mo>&#x02225;</mml:mo><mml:msub><mml:mrow><mml:mi>&#x02202;</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">{</mml:mo><mml:mrow><mml:mrow><mml:mo stretchy="false">[</mml:mo><mml:mrow><mml:mi>&#x003B4;</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>&#x003C0;</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:mfrac></mml:mrow><mml:mo stretchy="false">]</mml:mo></mml:mrow><mml:msub><mml:mrow><mml:mi>u</mml:mi></mml:mrow><mml:mrow><mml:mtext class="textrm" mathvariant="normal">k</mml:mtext></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo 
stretchy="false">}</mml:mo></mml:mrow><mml:mo>&#x000B7;</mml:mo><mml:msup><mml:mrow><mml:mi>e</mml:mi></mml:mrow><mml:mrow><mml:mo>-</mml:mo><mml:mi>j</mml:mi><mml:msub><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:mo>&#x02225;</mml:mo></mml:mstyle></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mtext>&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;&#x000A0;</mml:mtext><mml:mo>&#x0002B;</mml:mo><mml:mo>&#x0003C;</mml:mo><mml:mi>&#x003BB;</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mtext class="textrm" mathvariant="normal">t</mml:mtext></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mtext class="textrm" mathvariant="normal">f</mml:mtext><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mtext class="textrm" mathvariant="normal">t</mml:mtext></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>-</mml:mo><mml:mstyle displaystyle="true"><mml:munder class="msub"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:munder></mml:mstyle><mml:msub><mml:mrow><mml:mi>u</mml:mi></mml:mrow><mml:mrow><mml:mtext class="textrm" mathvariant="normal">k</mml:mtext></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mtext class="textrm" mathvariant="normal">t</mml:mtext></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>&#x0003E;</mml:mo></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula></p></list-item>
</list>
<p>The alternating direction method of multipliers (ADMM) is applied to <xref ref-type="disp-formula" rid="E8">Equation 8</xref>, alternately and iteratively updating <inline-formula><mml:math id="M12"><mml:msup><mml:mrow><mml:msub><mml:mrow><mml:mi>u</mml:mi></mml:mrow><mml:mrow><mml:mstyle class="text"><mml:mtext class="textrm" mathvariant="normal">k</mml:mtext></mml:mstyle></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msup></mml:math></inline-formula>, <inline-formula><mml:math id="M13"><mml:msup><mml:mrow><mml:msub><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mrow><mml:mstyle class="text"><mml:mtext class="textrm" mathvariant="normal">k</mml:mtext></mml:mstyle></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msup></mml:math></inline-formula>, and <inline-formula><mml:math id="M14"><mml:msup><mml:mrow><mml:msub><mml:mrow><mml:mi>&#x003BB;</mml:mi></mml:mrow><mml:mrow><mml:mstyle class="text"><mml:mtext class="textrm" mathvariant="normal">k</mml:mtext></mml:mstyle></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msup></mml:math></inline-formula> until the optimal solution of <xref ref-type="disp-formula" rid="E8">Equation 8</xref> is reached. Among them, <inline-formula><mml:math id="M15"><mml:msup><mml:mrow><mml:msub><mml:mrow><mml:mi>u</mml:mi></mml:mrow><mml:mrow><mml:mstyle class="text"><mml:mtext class="textrm" mathvariant="normal">k</mml:mtext></mml:mstyle></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msup></mml:math></inline-formula> can be transformed into the frequency domain through the Fourier transform, which gives:</p>
<disp-formula id="E10"><label>(9)</label><mml:math id="M16"><mml:mtable columnalign='left'><mml:mtr><mml:mtd><mml:msubsup><mml:mover accent='true'><mml:mi>u</mml:mi><mml:mo>&#x0005E;</mml:mo></mml:mover><mml:mi>k</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x003C9;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:munder><mml:mrow><mml:mi>arg</mml:mi><mml:mtext>&#x000A0;</mml:mtext><mml:mi>min</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mover accent='true'><mml:mi>u</mml:mi><mml:mo>&#x0005E;</mml:mo></mml:mover><mml:mi>k</mml:mi></mml:msub><mml:mo>,</mml:mo><mml:msub><mml:mi>u</mml:mi><mml:mi>k</mml:mi></mml:msub><mml:mo>&#x02208;</mml:mo><mml:mi>X</mml:mi></mml:mrow></mml:munder><mml:mo>&#x0007B;</mml:mo><mml:mi>&#x003B1;</mml:mi><mml:mrow><mml:mo>&#x02016;</mml:mo><mml:mrow><mml:mi>j</mml:mi><mml:mi>&#x003C9;</mml:mi><mml:mo stretchy='false'>[</mml:mo><mml:mo stretchy='false'>[</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mtext>sgn</mml:mtext><mml:mo stretchy='false'>(</mml:mo><mml:mi>&#x003C9;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mi>&#x003C9;</mml:mi><mml:mi>k</mml:mi></mml:msub><mml:mo stretchy='false'>)</mml:mo><mml:mo stretchy='false'>]</mml:mo><mml:mo>&#x000B7;</mml:mo><mml:msub><mml:mover accent='true'><mml:mi>u</mml:mi><mml:mo>&#x0005E;</mml:mo></mml:mover><mml:mi>k</mml:mi></mml:msub><mml:mo stretchy='false'>(</mml:mo><mml:mi>&#x003C9;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mi>&#x003C9;</mml:mi><mml:mi>k</mml:mi></mml:msub><mml:mo stretchy='false'>)</mml:mo><mml:mo 
stretchy='false'>]</mml:mo></mml:mrow><mml:mo>&#x02016;</mml:mo></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mtext>&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;&#x02009;</mml:mtext><mml:mo>+</mml:mo><mml:msubsup><mml:mrow><mml:mo>&#x02016;</mml:mo><mml:mrow><mml:mover><mml:mrow><mml:mtext>&#x000A0;</mml:mtext><mml:mi>f</mml:mi></mml:mrow><mml:mo>&#x02227;</mml:mo></mml:mover><mml:mo stretchy='false'>(</mml:mo><mml:mi>&#x003C9;</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>&#x02212;</mml:mo><mml:mstyle displaystyle='true'><mml:munder><mml:mo>&#x02211;</mml:mo><mml:mi>i</mml:mi></mml:munder><mml:mrow><mml:msub><mml:mover accent='true'><mml:mi>u</mml:mi><mml:mo>&#x0005E;</mml:mo></mml:mover><mml:mi>i</mml:mi></mml:msub><mml:mo stretchy='false'>(</mml:mo><mml:mi>&#x003C9;</mml:mi><mml:mo stretchy='false'>)</mml:mo><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mover><mml:mi>&#x003BB;</mml:mi><mml:mo>&#x02227;</mml:mo></mml:mover><mml:mo stretchy='false'>(</mml:mo><mml:mi>&#x003C9;</mml:mi><mml:mo stretchy='false'>)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:mfrac></mml:mrow></mml:mstyle></mml:mrow><mml:mo>&#x02016;</mml:mo></mml:mrow><mml:mn>2</mml:mn><mml:mn>2</mml:mn></mml:msubsup><mml:mo>&#x0007D;</mml:mo></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>Where <italic>X</italic> is the constraint on &#x000FB;<sub><italic>k</italic></sub> and <italic>u</italic><sub><italic>k</italic></sub>, that is, <inline-formula><mml:math id="M18"><mml:munder class="msub"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:munder><mml:msub><mml:mrow><mml:mi>u</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mi>f</mml:mi></mml:math></inline-formula>; the quadratic penalty factor &#x003B1; mainly reduces the interference of Gaussian noise; <inline-formula><mml:math id="M30"><mml:mrow><mml:mover><mml:mi>&#x003BB;</mml:mi><mml:mo>&#x02227;</mml:mo></mml:mover><mml:mo stretchy='false'>(</mml:mo><mml:mi>&#x003C9;</mml:mi><mml:mo stretchy='false'>)</mml:mo></mml:mrow></mml:math></inline-formula> is the tolerance of the overall noise signal, which mainly ensures that the decomposed signal is not distorted; <inline-formula><mml:math id="M31"><mml:mrow><mml:mover><mml:mrow><mml:mtext>&#x000A0;</mml:mtext><mml:mi>f</mml:mi></mml:mrow><mml:mo>&#x02227;</mml:mo></mml:mover><mml:mo stretchy='false'>(</mml:mo><mml:mi>&#x003C9;</mml:mi><mml:mo stretchy='false'>)</mml:mo></mml:mrow></mml:math></inline-formula> is the Fourier transform of <italic>f</italic>(<italic>t</italic>), and <inline-formula><mml:math id="M32"><mml:mrow><mml:msub><mml:mover accent='true'><mml:mi>u</mml:mi><mml:mo>&#x0005E;</mml:mo></mml:mover><mml:mi>i</mml:mi></mml:msub><mml:mo stretchy='false'>(</mml:mo><mml:mi>&#x003C9;</mml:mi><mml:mo stretchy='false'>)</mml:mo></mml:mrow></mml:math></inline-formula> is the Fourier transform of <italic>u</italic><sub><italic>i</italic></sub>(<italic>t</italic>).</p>
<p><xref ref-type="disp-formula" rid="E10">Equation 9</xref> can be transformed into the frequency domain through Fourier transform, and then the solution of <inline-formula><mml:math id="M19"><mml:msup><mml:mrow><mml:msub><mml:mrow><mml:mover accent='true'><mml:mi>u</mml:mi><mml:mo>&#x0005E;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:math></inline-formula> can be obtained as:</p>
<disp-formula id="E12"><label>(10)</label><mml:math id="M20"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:msubsup><mml:mrow><mml:mover accent='true'><mml:mi>u</mml:mi><mml:mo>&#x0005E;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msubsup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mover accent="true"><mml:mrow><mml:mi>f</mml:mi></mml:mrow><mml:mo>^</mml:mo></mml:mover><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>-</mml:mo><mml:mstyle displaystyle="true"><mml:munder class="msub"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:munder></mml:mstyle><mml:msub><mml:mrow><mml:mover accent='true'><mml:mi>u</mml:mi><mml:mo>&#x0005E;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:mover accent="true"><mml:mrow><mml:mi>&#x003BB;</mml:mi></mml:mrow><mml:mo>^</mml:mo></mml:mover><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:mfrac></mml:mrow><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x0002B;</mml:mo><mml:mn>2</mml:mn><mml:mi>&#x003B1;</mml:mi><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>&#x003C9;</mml:mi><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo 
stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>In our proposed method, VMD is used to decompose the EEG signal into its intrinsic mode functions (IMFs), from which the informative components are then selected using the power spectrum, avoiding interference from redundant information.</p>
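<p>The frequency-domain updates described above can be sketched numerically as follows. This is a minimal illustration with assumed function names and NumPy-array inputs on a shared frequency grid, not the implementation used in this work; the center-frequency update follows the standard VMD formulation.</p>

```python
import numpy as np

def update_mode_spectrum(f_hat, sum_other_modes, lambda_hat, omega, omega_k, alpha):
    """One VMD mode update in the frequency domain (Equation 10).

    The residual spectrum -- the signal minus all other modes, plus half
    of the Lagrange multiplier -- is weighted by a Wiener-like filter
    centered at the current center frequency omega_k.
    """
    residual = f_hat - sum_other_modes + lambda_hat / 2.0
    return residual / (1.0 + 2.0 * alpha * (omega - omega_k) ** 2)

def update_center_frequency(u_hat, omega):
    """Standard VMD center-frequency update: the power-weighted mean of
    the frequency axis of the current mode spectrum."""
    power = np.abs(u_hat) ** 2
    return np.sum(omega * power) / np.sum(power)
```

<p>Iterating these two updates over all modes, together with the multiplier update, reproduces the ADMM loop described above.</p>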
</sec>
<sec>
<title>2.4 LightGBM</title>
<p>Light Gradient Boosting Machine (LightGBM) is a highly efficient implementation of the gradient boosting decision tree (GBDT) algorithm, proposed by Ke et al. to address the limitations of traditional GBDT in handling large-scale datasets, such as high computational complexity and slow training speed [1]. Distinguished by two core optimization strategies&#x02013;Gradient-based One-Side Sampling (GOSS) and Exclusive Feature Bundling (EFB)&#x02013;LightGBM achieves significant improvements in training efficiency while maintaining or enhancing prediction accuracy, making it widely applied in fields like machine learning, data mining, and biomedical signal analysis (e.g., EEG-based depression detection).</p>
<sec>
<title>2.4.1 Gradient-based one-side sampling (GOSS)</title>
<p>GOSS focuses on sampling instances with large gradients (critical for model update) while retaining a small proportion of instances with small gradients to preserve the overall data distribution. Specifically, during each iteration:</p>
<list list-type="simple">
<list-item><p>a. Sort training instances by the absolute value of their gradients in descending order.</p></list-item>
<list-item><p>b. Select the top a &#x000D7; 100% instances (large-gradient samples) as core samples.</p></list-item>
<list-item><p>c. Randomly sample b &#x000D7; 100% instances from the remaining (1-a) &#x000D7; 100% instances (small-gradient samples) and multiply their gradients by a weight factor <inline-formula><mml:math id="M21"><mml:mfrac><mml:mrow><mml:mn>1</mml:mn><mml:mo>-</mml:mo><mml:mi>a</mml:mi></mml:mrow><mml:mrow><mml:mi>b</mml:mi></mml:mrow></mml:mfrac></mml:math></inline-formula> to compensate for the sampling bias.</p></list-item>
</list>
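<p>The three steps above can be sketched as follows. This is an illustrative re-implementation with assumed names and a NumPy gradient array, not LightGBM's internal code:</p>

```python
import numpy as np

def goss_sample(gradients, a=0.2, b=0.1, rng=None):
    """Gradient-based One-Side Sampling: keep the top a*100% instances by
    |gradient|, randomly keep b*100% of the rest, and up-weight the
    small-gradient survivors by (1 - a) / b to compensate the bias."""
    rng = np.random.default_rng(rng)
    n = len(gradients)
    order = np.argsort(-np.abs(gradients))   # step a: sort by |g|, descending
    top_k = int(a * n)
    large = order[:top_k]                    # step b: large-gradient core set
    rest = order[top_k:]
    small = rng.choice(rest, size=int(b * n), replace=False)  # step c
    weights = np.ones(n)
    weights[small] = (1.0 - a) / b           # compensate the sampling bias
    keep = np.concatenate([large, small])
    return keep, weights[keep]
```

<p>With a = 0.2 and b = 0.1, only 30% of the instances are used per iteration, while the reweighting keeps the estimated information gain approximately unbiased.</p>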
</sec>
<sec>
<title>2.4.2 Objective function</title>
<p>The objective function of LightGBM follows the gradient boosting framework, combining a loss function and a regularization term to prevent overfitting. For the t-th iteration, the objective function is defined as:</p>
<disp-formula id="E13"><label>(11)</label><mml:math id="M22"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:msup><mml:mrow><mml:mi mathvariant="script">O</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:mi>L</mml:mi><mml:mrow><mml:mo stretchy="true">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>y</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msubsup><mml:mrow><mml:mover accent='true'><mml:mi>y</mml:mi><mml:mo>&#x0005E;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi><mml:mo>-</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msubsup><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>f</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mo stretchy="true">)</mml:mo></mml:mrow><mml:mo>&#x0002B;</mml:mo><mml:mo>&#x003A9;</mml:mo><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>f</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>where n is the number of training instances, y<sub><italic>i</italic></sub> is the true label of the i-th instance, <inline-formula><mml:math id="M23"><mml:msubsup><mml:mrow><mml:mover accent='true'><mml:mi>y</mml:mi><mml:mo>&#x0005E;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>t</mml:mi><mml:mo>-</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow></mml:msubsup></mml:math></inline-formula> is the predicted value of the i-th instance after <italic>t</italic> &#x02212; 1 iterations, &#x003A9;(<italic>f</italic><sub><italic>t</italic></sub>) is the regularization term for the t-th tree, defined as:</p>
<disp-formula id="E14"><label>(12)</label><mml:math id="M24"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mo>&#x003A9;</mml:mo><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>f</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mi>&#x003B3;</mml:mi><mml:mi>T</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:mfrac><mml:mi>&#x003BB;</mml:mi><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>j</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>T</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msubsup><mml:mrow><mml:mi>w</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>Here, T is the number of leaves in the t-th tree, <italic>w</italic><sub><italic>j</italic></sub> is the score of the j-th leaf, and &#x003B3;, &#x003BB; are regularization parameters.</p>
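<p>As a worked example of Equation 12 (illustrative code with an assumed function name, not part of the LightGBM library), the regularization term for a tree can be computed from its leaf scores:</p>

```python
import numpy as np

def tree_regularization(leaf_scores, gamma, lam):
    """Omega(f_t) = gamma * T + 0.5 * lam * sum_j w_j^2 (Equation 12),
    where T is the number of leaves and w_j is the score of leaf j."""
    w = np.asarray(leaf_scores, dtype=float)
    return gamma * w.size + 0.5 * lam * np.sum(w ** 2)
```

<p>For a tree with two leaves scoring 1.0 and 2.0, &#x003B3; = 0.5 and &#x003BB; = 1.0 give &#x003A9; = 0.5 &#x000D7; 2 + 0.5 &#x000D7; (1 + 4) = 3.5.</p>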
</sec>
</sec>
<sec>
<title>2.5 Proposed recognition model</title>
<p>The algorithm flowchart selected in this paper is shown in <xref ref-type="fig" rid="F3">Figure 3</xref>. First, three-channel EEG signals are collected from the human prefrontal lobe. The collected signals undergo feature extraction via Variational Mode Decomposition (VMD), and appropriate feature components are selected by combining the power spectrum. Sample entropy is then calculated for the obtained feature components. The resulting entropy features are fed into the LightGBM network for classification to ultimately determine whether the subject is a healthy control or a depression patient. The block diagram of the algorithm is shown in <xref ref-type="fig" rid="F4">Figure 4a</xref>.</p>
<fig position="float" id="F3">
<label>Figure 3</label>
<caption><p>Model framework structure diagram.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-19-1651762-g0003.tif">
<alt-text>Flowchart depicting EEG signal acquisition, processing, and classification. The left section shows EEG acquisition locations and raw signals. The right section details VMD decomposition with single-channel and power spectrum features. The bottom section illustrates the LightGBM classification model with sample entropy feature extraction, histograms, and classification results in a confusion matrix form.</alt-text>
</graphic>
</fig>
<fig position="float" id="F4">
<label>Figure 4</label>
<caption><p>Model framework diagram. <bold>(a)</bold> framework diagram of the vmd-lightgbm model. <bold>(b)</bold> 3-layer CNN-LSTM model framework diagram.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-19-1651762-g0004.tif">
<alt-text>Diagram of a two-part machine learning model for processing EEG signals. Part (a) shows VMD decomposition of raw EEG signals into IMF components, followed by entropy calculation, histogram construction, and classification using LightGBM. Part (b) illustrates a neural network sequence with convolutional layers followed by dropout, flattening, and an LSTM layer leading to output.</alt-text>
</graphic>
</fig>
<p>As can be seen from <xref ref-type="fig" rid="F4">Figure 4a</xref>, after inputting the original EEG signals, they are first decomposed using Variational Mode Decomposition (VMD) to obtain a total of 4 Intrinsic Mode Function (IMF) feature components. These four components undergo sample entropy feature extraction, and the resulting features are finally input into LightGBM for classification.</p>
</sec>
</sec>
<sec id="s3">
<title>3 Experiments</title>
<sec>
<title>3.1 VMD signal decomposition and IMF component selection</title>
<p>Considering the strong correlation between the prefrontal lobe and emotional processes, as well as mental illnesses, electroencephalogram (EEG) signals were collected via three electrodes. A common EEG acquisition device has three electrodes (Fp1, Fpz, and Fp2) on the prefrontal lobe. Data were recorded in a room free of loud noise and strong magnetic interference. Participants kept their eyes closed until their EEG signals were observed to be relatively stable, after which 90 s of data were acquired at a sampling frequency of 250 Hz. In the processing of the MODMA dataset, the original hexadecimal data were first converted into decimal data. Then, the signals were filtered with a 1 Hz high-pass and 45 Hz low-pass finite impulse response (FIR) filter. Finally, an adaptive noise canceller was used to eliminate eye-blink artifacts, thereby obtaining denoised EEG signal data. The resulting EEG signals are shown in <xref ref-type="fig" rid="F5">Figure 5</xref>, where HC represents healthy controls and MDD represents major depressive disorder patients.</p>
<fig position="float" id="F5">
<label>Figure 5</label>
<caption><p>EEG data of HC and MDD individuals <bold>(a)</bold> HC data. <bold>(b)</bold> MDD data.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-19-1651762-g0005.tif">
<alt-text>Two line graphs labeled &#x0201C;a&#x0201D; and &#x0201C;b&#x0201D; show amplitude versus time in milliseconds. Both graphs display oscillating waveforms with peaks and troughs, occurring at different amplitude scales. Graph &#x0201C;a&#x0201D; has a higher amplitude range than graph &#x0201C;b&#x0201D;.</alt-text>
</graphic>
</fig>
<p>First, the data of each group were decomposed by VMD. The Variational Mode Decomposition (VMD) method is adopted mainly because the EEG activities in the &#x003B4;, &#x003B8;, &#x003B1;, and &#x003B2; frequency bands of patients with depression are generally higher than those of the normal control group. Moreover, the &#x003B1; and &#x003B2; frequency bands contain more depression-related EEG information than the low-frequency &#x003B4; and &#x003B8; bands. Therefore, decomposing the EEG signal into different frequency bands via VMD can effectively filter out interference from other frequency bands and ensure the validity of the signal. The VMD technique was used to decompose the non-stationary EEG signals into multiple frequency-band-limited IMFs, making each decomposed component easier to distinguish for classification. Taking the HC data as an example, parameter enumeration and optimization of the VMD algorithm were performed to obtain five groups of components. The decomposition results are shown in <xref ref-type="fig" rid="F6">Figure 6</xref>. The feature components after VMD decomposition reflect the variation trends of EEG signals at different frequencies more clearly, and the variation characteristics of the EEG can be retained by extracting the effective fluctuation information of each modal component.</p>
<fig position="float" id="F6">
<label>Figure 6</label>
<caption><p>EEG signal VMD decomposition results.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-19-1651762-g0006.tif">
<alt-text>This figure shows that the EEG signal can be decomposed into 5 IMF components through VMD, and it displays the results of these five components.</alt-text>
</graphic>
</fig>
<p>In this work, frequency analysis was conducted to determine the primary IMFs. <xref ref-type="fig" rid="F7">Figure 7</xref> shows the power spectra of the primary IMFs for a sample. Since the frequency distribution of IMF5 differs significantly from that of the remaining components, only IMF1-IMF4 were selected as the primary IMF components for feature extraction. Additionally, it can be clearly observed from the power spectrograms that although the signal is decomposed into multiple groups of IMFs, highly discriminative information is retained in only a few of them. IMF2 and IMF3 exhibit higher energy density, while IMF1 and IMF4 have relatively lower energy content.</p>
<fig position="float" id="F7">
<label>Figure 7</label>
<caption><p>Power spectrum analysis of EEG three channel signals using IMF. <bold>(a)</bold> Channel1. <bold>(b)</bold> Channel2. <bold>(c)</bold> Channel3.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-19-1651762-g0007.tif">
<alt-text>Three frequency spectrum graphs, labeled a, b, and c, displaying amplitudes against frequencies for four intrinsic mode functions (IMFs). Each graph shows a peak around 20 Hz, with IMF 2 having the highest amplitude. Color-coded legends identify the IMFs: IMF 1 in blue, IMF 2 in orange, IMF 3 in red, and IMF 4 in purple. Graph a ranges up to 6000 amplitude, while b and c are up to 7000 amplitude.</alt-text>
</graphic>
</fig>
<p>After obtaining the IMFs, we extracted sample entropy features from the primary IMFs. This feature effectively characterizes changes in EEG complexity and is robust to Gaussian noise interference. Since each of the three channels is decomposed into four IMFs and one sample entropy feature is extracted from each IMF, we finally obtained 12 feature vectors.</p>
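<p>Sample entropy can be sketched as follows. This is a common formulation using the Chebyshev distance with tolerance r = 0.2 times the signal standard deviation and embedding dimension m = 2; these parameter values and the function name are assumptions for illustration, not necessarily the exact settings of this study.</p>

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """SampEn(m, r): negative log of the conditional probability that two
    sequences similar over m points remain similar at m + 1 points.
    Self-matches are excluded; distance is the Chebyshev (max) norm."""
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)

    def match_count(dim):
        # All overlapping templates of length `dim`.
        t = np.array([x[i:i + dim] for i in range(len(x) - dim + 1)])
        count = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    b = match_count(m)       # similar template pairs of length m
    a = match_count(m + 1)   # pairs still similar at length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf
```

<p>Applying such a function to each of the four IMFs of each of the three channels yields the 12-dimensional feature vector described above.</p>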
</sec>
<sec>
<title>3.2 Influence of learning rate on model accuracy</title>
<p>To ensure the reliability of the evaluation results, this paper uses Accuracy, Precision, Recall, F1-Score, the Confusion Matrix, and the average time consumption (T) as evaluation indicators for the proposed depression recognition method. Here, TP represents the number of positive samples correctly predicted as positive by the model, FN represents the number of positive samples incorrectly predicted as negative, FP represents the number of negative samples incorrectly predicted as positive, and TN represents the number of negative samples correctly predicted as negative.</p>
<disp-formula id="E15"><label>(13)</label><mml:math id="M25"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>A</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:mi>u</mml:mi><mml:mi>r</mml:mi><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>y</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mi>T</mml:mi><mml:mi>N</mml:mi></mml:mrow><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mi>T</mml:mi><mml:mi>N</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mi>F</mml:mi><mml:mi>P</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mi>F</mml:mi><mml:mi>N</mml:mi></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<disp-formula id="E16"><label>(14)</label><mml:math id="M26"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>P</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mi>i</mml:mi><mml:mi>o</mml:mi><mml:mi>n</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mi>F</mml:mi><mml:mi>P</mml:mi></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<disp-formula id="E17"><label>(15)</label><mml:math id="M27"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>R</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>a</mml:mi><mml:mi>l</mml:mi><mml:mi>l</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mi>F</mml:mi><mml:mi>N</mml:mi></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<disp-formula id="E18"><label>(16)</label><mml:math id="M28"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>F</mml:mi><mml:mn>1</mml:mn><mml:mo>-</mml:mo><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mn>2</mml:mn><mml:mo>&#x000B7;</mml:mo><mml:mi>P</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mi>i</mml:mi><mml:mi>o</mml:mi><mml:mi>n</mml:mi><mml:mo>&#x000B7;</mml:mo><mml:mi>R</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>a</mml:mi><mml:mi>l</mml:mi><mml:mi>l</mml:mi></mml:mrow><mml:mrow><mml:mi>P</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mi>i</mml:mi><mml:mi>o</mml:mi><mml:mi>n</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mi>R</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:mi>a</mml:mi><mml:mi>l</mml:mi><mml:mi>l</mml:mi></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>The average recognition time evaluates the real-time performance of the model by averaging the per-sample recognition times.</p>
<disp-formula id="E19"><label>(17)</label><mml:math id="M29"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>T</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:mfrac><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msub><mml:mrow><mml:mi>t</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mo>.</mml:mo><mml:mo>.</mml:mo><mml:mo>.</mml:mo><mml:mo>,</mml:mo><mml:mi>n</mml:mi></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
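<p>Equations 13-17 map directly onto code; a minimal sketch:</p>

```python
def classification_metrics(tp, fp, fn, tn):
    """Compute Accuracy, Precision, Recall, and F1-Score from the
    confusion-matrix counts, as in Equations 13-16."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

def average_time(times_ms):
    """Average per-sample recognition time T, as in Equation 17."""
    return sum(times_ms) / len(times_ms)
```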
<p>To optimize the parameters of the LightGBM classifier model, an enumeration method was used to search for optimal parameter combinations of different numbers of leaves and learning rates, thereby determining the best values of hyperparameters. The influence of model hyperparameters on the classifier&#x00027;s accuracy is shown in <xref ref-type="table" rid="T1">Table 1</xref>. It can be seen from the table that the highest classification accuracy is achieved with the parameters of a maximum number of leaves of 50 and a learning rate of 0.01. Therefore, this set of hyperparameters will be used for subsequent model training and testing.</p>
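<p>The enumeration search over the number of leaves and the learning rate can be sketched generically as below. The evaluate function here is a hypothetical stand-in for training and scoring a LightGBM model, preloaded with a few scores shaped like Table 1 rather than real measurements.</p>

```python
from itertools import product

def grid_search(evaluate, num_leaves_grid, lr_grid):
    """Exhaustively evaluate every (num_leaves, learning_rate) pair
    and return the best-scoring combination with its score."""
    best_params, best_score = None, float("-inf")
    for leaves, lr in product(num_leaves_grid, lr_grid):
        score = evaluate(leaves, lr)
        if score > best_score:
            best_params, best_score = (leaves, lr), score
    return best_params, best_score

# Hypothetical scorer: a lookup shaped like Table 1 (peak at 50 leaves, lr 0.01).
table = {(50, 0.01): 97.78, (100, 0.01): 97.28, (50, 0.005): 97.56}
evaluate = lambda leaves, lr: table.get((leaves, lr), 90.0)
best, score = grid_search(evaluate, [50, 100], [0.01, 0.005])
```

<p>In practice, evaluate would fit a LightGBM classifier with the given hyperparameters and return validation accuracy.</p>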
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p>Model accuracy results under different hyperparameters.</p></caption>
<table frame="box" rules="all">
<thead>
<tr>
<th/>
<th valign="top" align="center" colspan="6"><bold>Number of leaves</bold></th>
</tr>
<tr>
<th valign="top" align="left"><bold>LR</bold></th>
<th valign="top" align="center"><bold>10</bold></th>
<th valign="top" align="center"><bold>20</bold></th>
<th valign="top" align="center"><bold>30</bold></th>
<th valign="top" align="center"><bold>40</bold></th>
<th valign="top" align="center"><bold>50</bold></th>
<th valign="top" align="center"><bold>100</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">0.01</td>
<td valign="top" align="center">94.98</td>
<td valign="top" align="center">95.68</td>
<td valign="top" align="center">96.39</td>
<td valign="top" align="center">96.45</td>
<td valign="top" align="center">97.78</td>
<td valign="top" align="center">97.28</td>
</tr>
<tr>
<td valign="top" align="left">0.005</td>
<td valign="top" align="center">94.15</td>
<td valign="top" align="center">95.54</td>
<td valign="top" align="center">96.44</td>
<td valign="top" align="center">96.78</td>
<td valign="top" align="center">97.56</td>
<td valign="top" align="center">97.16</td>
</tr>
<tr>
<td valign="top" align="left">0.001</td>
<td valign="top" align="center">92.82</td>
<td valign="top" align="center">94.45</td>
<td valign="top" align="center">95.38</td>
<td valign="top" align="center">95.87</td>
<td valign="top" align="center">96.45</td>
<td valign="top" align="center">97.05</td>
</tr>
<tr>
<td valign="top" align="left">0.0005</td>
<td valign="top" align="center">92.68</td>
<td valign="top" align="center">93.87</td>
<td valign="top" align="center">94.96</td>
<td valign="top" align="center">95.22</td>
<td valign="top" align="center">96.35</td>
<td valign="top" align="center">97.04</td>
</tr>
<tr>
<td valign="top" align="left">0.0001</td>
<td valign="top" align="center">92.69</td>
<td valign="top" align="center">94.21</td>
<td valign="top" align="center">94.54</td>
<td valign="top" align="center">95.12</td>
<td valign="top" align="center">96.15</td>
<td valign="top" align="center">96.85</td>
</tr></tbody>
</table>
</table-wrap>
<p>To verify the generalization ability of the model, the test set data, which were not involved in cross-validation training, were used for model inspection; the resulting model accuracy is shown in <xref ref-type="table" rid="T2">Table 2</xref>.</p>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p>Five-fold cross-validation model testing accuracy.</p></caption>
<table frame="box" rules="all">
<thead>
<tr>
<th valign="top" align="left"><bold>Indicator parameters</bold></th>
<th valign="top" align="center"><bold>Acc/%</bold></th>
<th valign="top" align="center"><bold>Pre/%</bold></th>
<th valign="top" align="center"><bold>Re/%</bold></th>
<th valign="top" align="center"><bold>F1/%</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Fold 1</td>
<td valign="top" align="center">97.56 &#x000B1; 0.31</td>
<td valign="top" align="center">97.19 &#x000B1; 0.28</td>
<td valign="top" align="center">97.38 &#x000B1; 0.24</td>
<td valign="top" align="center">97.43 &#x000B1; 0.21</td>
</tr>
<tr>
<td valign="top" align="left">Fold 2</td>
<td valign="top" align="center">97.35 &#x000B1; 0.27</td>
<td valign="top" align="center">97.23 &#x000B1; 0.22</td>
<td valign="top" align="center">97.24 &#x000B1; 0.21</td>
<td valign="top" align="center">97.31 &#x000B1; 0.19</td>
</tr>
<tr>
<td valign="top" align="left">Fold 3</td>
<td valign="top" align="center">97.27 &#x000B1; 0.23</td>
<td valign="top" align="center">97.16 &#x000B1; 0.17</td>
<td valign="top" align="center">97.25 &#x000B1; 0.18</td>
<td valign="top" align="center">97.18 &#x000B1; 0.20</td>
</tr>
<tr>
<td valign="top" align="left">Fold 4</td>
<td valign="top" align="center">97.34 &#x000B1; 0.28</td>
<td valign="top" align="center">97.21 &#x000B1; 0.23</td>
<td valign="top" align="center">97.32 &#x000B1; 0.25</td>
<td valign="top" align="center">97.26 &#x000B1; 0.22</td>
</tr>
<tr>
<td valign="top" align="left">Fold 5</td>
<td valign="top" align="center">97.58 &#x000B1; 0.26</td>
<td valign="top" align="center">97.46 &#x000B1; 0.21</td>
<td valign="top" align="center">97.48 &#x000B1; 0.23</td>
<td valign="top" align="center">97.43 &#x000B1; 0.22</td>
</tr>
<tr>
<td valign="top" align="left">Average</td>
<td valign="top" align="center">97.42 &#x000B1; 0.27</td>
<td valign="top" align="center">97.25 &#x000B1; 0.22</td>
<td valign="top" align="center">97.33 &#x000B1; 0.22</td>
<td valign="top" align="center">97.33 &#x000B1; 0.21</td>
</tr></tbody>
</table>
</table-wrap>
<p>Combined with <xref ref-type="table" rid="T2">Table 2</xref>, it can be seen that under 5-fold cross-validation the accuracy of the proposed model on each fold of the test set remains above 97.27%, with an average recognition accuracy of 97.42%. The fifth fold achieves the highest accuracy, while the third fold has the lowest. The model therefore exhibits good stability, and its accuracy on the test set shows that it can distinguish the different categories, further verifying its effectiveness. To fully exploit the temporal features in the training set and assess the model&#x00027;s generalization ability, the trained model is applied to the test set for evaluation, and the confusion matrix shown in <xref ref-type="fig" rid="F8">Figure 8</xref> is obtained. The confusion matrix intuitively reflects the classification performance of the model for each category, where HC denotes healthy controls and MDD denotes major depressive disorder patients.</p>
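<p>The 5-fold protocol can be sketched with a simple contiguous-fold index generator; this is a generic sketch of k-fold splitting, not the authors&#x00027; exact split code.</p>

```python
def kfold_indices(n, k=5):
    """Yield (train_idx, test_idx) index lists for k contiguous folds
    over n samples; every sample appears in exactly one test fold."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test_idx = list(range(start, start + size))
        train_idx = list(range(0, start)) + list(range(start + size, n))
        yield train_idx, test_idx
        start += size
```

<p>Per-fold accuracies as in Table 2 are then obtained by training on train_idx and scoring on test_idx for each fold.</p>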
<fig position="float" id="F8">
<label>Figure 8</label>
<caption><p>Classification confusion matrix.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-19-1651762-g0008.tif">
<alt-text>Confusion matrix comparing predicted and true classes labeled HC and MDD. Top-left shows 97.38% of HC correctly predicted as HC, top-right 2.62% of HC predicted as MDD. Bottom-left shows 2.58% of MDD predicted as HC, bottom-right 97.42% of MDD correctly predicted as MDD.</alt-text>
</graphic>
</fig>
<p>As shown in <xref ref-type="fig" rid="F8">Figure 8</xref>, the recognition accuracy of each class in the test set is at least 97.38%. During learning, the model must extract and classify a large number of complex, interleaved features, and similar features in some samples may lead to misjudgments; nevertheless, the overall recognition accuracy remains high. The experimental results show that the model can accurately determine the emotional state of the population, providing a universal judgment basis for related applications. Therefore, the effectiveness of using EEG signals for depression recognition is verified.</p>
</sec>
<sec>
<title>3.3 Comparative experiments</title>
<p>To further verify the effectiveness and superiority of the proposed method, this subsection conducts horizontal comparisons against algorithms that have achieved excellent results in emotion recognition and pattern recognition in recent years. The comparative methods cover both machine learning and deep learning approaches, all trained and tested on the same dataset. For dataset division, a sliding window of 1 s was applied to the 90-s data samples, yielding 4,950 samples. The dataset is segmented at 1-s intervals mainly because many factors ultimately affect the occurrence of MUS, a large proportion of which are caused by somatization and depression. This paper discusses only the technical means for preliminary screening of depression: with just one second of data, the method can effectively determine whether an individual suffers from depression, thereby realizing front-end triage of undifferentiated disorders, quickly screening depressed patients from normal individuals, and improving diagnostic efficiency. However, because adjacent sliding windows share a certain feature similarity, the conventional 7:3 random division used in machine learning could easily place samples in the test set that resemble samples in the training set. Therefore, we divided the 4,950 samples along the time dimension, with the first 50% of the samples as the training set and the last 50% as the test set. From the training set, 70% was used for training and 30% for validation. To ensure the effectiveness of the model, all input sample data underwent VMD decomposition and sample entropy feature extraction. In this study, we used a laptop with an Intel i5-12400F &#x00040;2.5 GHz CPU and an NVIDIA RTX 3060 GPU as the hardware environment; the software environment was Python with PyTorch 2.2.2. The algorithm comparison results are shown in <xref ref-type="table" rid="T3">Table 3</xref>.</p>
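<p>The temporal split just described (first 50% for training, last 50% for testing, then 70/30 train/validation within the training pool) might be implemented as follows; the helper name and the choice of a contiguous validation tail are assumptions for illustration.</p>

```python
def temporal_split(samples, val_frac=0.3):
    """Split time-ordered sliding-window samples: first half -> train
    pool, second half -> test set; then carve the last val_frac of the
    train pool off as a validation set, keeping temporal order."""
    n = len(samples)
    pool, test = samples[: n // 2], samples[n // 2 :]
    n_val = int(len(pool) * val_frac)
    train, val = pool[: len(pool) - n_val], pool[len(pool) - n_val :]
    return train, val, test

# 4,950 one-second windows, ordered in time.
train, val, test = temporal_split(list(range(4950)))
```

<p>Splitting along time rather than at random keeps overlapping neighboring windows out of both the training and test sets at once.</p>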
<table-wrap position="float" id="T3">
<label>Table 3</label>
<caption><p>Comparison results of accuracy performance of different recognition methods.</p></caption>
<table frame="box" rules="all">
<thead>
<tr>
<th valign="top" align="left"><bold>Method</bold></th>
<th valign="top" align="center"><bold>Accuracy/%</bold></th>
<th valign="top" align="center"><bold>Precision/%</bold></th>
<th valign="top" align="center"><bold>Recall/%</bold></th>
<th valign="top" align="center"><bold>F1-Score/%</bold></th>
<th valign="top" align="center"><bold>T/ms</bold></th>
<th valign="top" align="center"><bold>Para/M</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">SVM</td>
<td valign="top" align="center">89.29 &#x000B1; 0.01</td>
<td valign="top" align="center">89.71 &#x000B1; 0.01</td>
<td valign="top" align="center">89.32 &#x000B1; 0.01</td>
<td valign="top" align="center">89.19 &#x000B1; 0.01</td>
<td valign="top" align="center">1.86</td>
<td valign="top" align="center">0.24</td>
</tr>
<tr>
<td valign="top" align="left">RF</td>
<td valign="top" align="center">90.68 &#x000B1; 0.02</td>
<td valign="top" align="center">90.67 &#x000B1; 0.02</td>
<td valign="top" align="center">90.68 &#x000B1; 0.02</td>
<td valign="top" align="center">90.71 &#x000B1; 0.02</td>
<td valign="top" align="center">1.52</td>
<td valign="top" align="center">0.25</td>
</tr>
<tr>
<td valign="top" align="left">LightGBM</td>
<td valign="top" align="center">93.78 &#x000B1; 0.02</td>
<td valign="top" align="center">93.62 &#x000B1; 0.03</td>
<td valign="top" align="center">93.72 &#x000B1; 0.01</td>
<td valign="top" align="center">93.64 &#x000B1; 0.02</td>
<td valign="top" align="center">0.06</td>
<td valign="top" align="center">0.02</td>
</tr>
<tr>
<td valign="top" align="left">2CNN-LSTM</td>
<td valign="top" align="center">95.23 &#x000B1; 0.24</td>
<td valign="top" align="center">95.18 &#x000B1; 0.35</td>
<td valign="top" align="center">95.26 &#x000B1; 0.24</td>
<td valign="top" align="center">95.13 &#x000B1; 0.36</td>
<td valign="top" align="center">2.52</td>
<td valign="top" align="center">2.74</td>
</tr>
<tr>
<td valign="top" align="left">3CNN-LSTM</td>
<td valign="top" align="center">97.12 &#x000B1; 0.34</td>
<td valign="top" align="center">97.15 &#x000B1; 0.32</td>
<td valign="top" align="center">97.12 &#x000B1; 0.32</td>
<td valign="top" align="center">97.10 &#x000B1; 0.36</td>
<td valign="top" align="center">2.82</td>
<td valign="top" align="center">3.18</td>
</tr>
<tr>
<td valign="top" align="left">2CNN-BiLSTM</td>
<td valign="top" align="center">93.89 &#x000B1; 0.44</td>
<td valign="top" align="center">93.77 &#x000B1; 0.46</td>
<td valign="top" align="center">93.87 &#x000B1; 0.38</td>
<td valign="top" align="center">93.69 &#x000B1; 0.48</td>
<td valign="top" align="center">2.36</td>
<td valign="top" align="center">2.87</td>
</tr>
<tr>
<td valign="top" align="left">3CNN-BiLSTM</td>
<td valign="top" align="center">97.72 &#x000B1; 0.54</td>
<td valign="top" align="center">97.77 &#x000B1; 0.59</td>
<td valign="top" align="center">97.72 &#x000B1; 0.54</td>
<td valign="top" align="center">97.73 &#x000B1; 0.56</td>
<td valign="top" align="center">2.90</td>
<td valign="top" align="center">3.38</td>
</tr>
<tr>
<td valign="top" align="left">MACNN</td>
<td valign="top" align="center">96.78 &#x000B1; 0.34</td>
<td valign="top" align="center">96.53 &#x000B1; 0.32</td>
<td valign="top" align="center">96.51 &#x000B1; 0.21</td>
<td valign="top" align="center">96.47 &#x000B1; 0.33</td>
<td valign="top" align="center">2.80</td>
<td valign="top" align="center">3.15</td>
</tr>
<tr>
<td valign="top" align="left">Our method</td>
<td valign="top" align="center">97.42 &#x000B1; 0.27</td>
<td valign="top" align="center">97.25 &#x000B1; 0.22</td>
<td valign="top" align="center">97.33 &#x000B1; 0.22</td>
<td valign="top" align="center">97.33 &#x000B1; 0.21</td>
<td valign="top" align="center">2.23</td>
<td valign="top" align="center">0.32</td>
</tr></tbody>
</table>
</table-wrap>
<p>For the CNN-LSTM model, we adopted a network structure in which a 3-layer CNN is connected in series with an LSTM. Here, the &#x0201C;3 layers&#x0201D; refer to 3 modules, each comprising a 3 &#x000D7; 3 convolution, regularization, a ReLU layer, and a global pooling layer, as shown in <xref ref-type="fig" rid="F4">Figure 4b</xref>. We used the Adam optimizer with a learning rate of 0.001, a total of 100 epochs, a batch size of 64, and the cross-entropy loss function. For the CNN-BiLSTM model, we only made the LSTM bidirectional; the remaining parameters are the same as those of the CNN-LSTM model. The experimental results show that the proposed method achieves the best overall classification performance: its average recognition accuracy of 97.42% exceeds that of all comparative algorithms except 3CNN-BiLSTM, while its 0.32 M parameters are far fewer than those of the other deep learning algorithms, demonstrating its lightweight advantage.</p>
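<p>A sketch of the 3-module CNN-LSTM baseline in PyTorch is given below. The channel widths (16/32/64), the LSTM hidden size of 64, the use of 1-D convolutions over the multichannel EEG windows, and local rather than per-module global pooling are all illustrative assumptions, not the authors&#x00027; exact configuration.</p>

```python
import torch
import torch.nn as nn

class CNNLSTM(nn.Module):
    """Three conv modules (conv -> batch norm -> ReLU -> pooling)
    feeding an LSTM, classified from the last time step."""
    def __init__(self, in_channels=12, n_classes=2):
        super().__init__()
        blocks, c_in = [], in_channels
        for c_out in (16, 32, 64):          # assumed widths
            blocks += [
                nn.Conv1d(c_in, c_out, kernel_size=3, padding=1),
                nn.BatchNorm1d(c_out),       # "regularization" layer
                nn.ReLU(),
                nn.MaxPool1d(2),             # local pooling (assumption)
            ]
            c_in = c_out
        self.cnn = nn.Sequential(*blocks)
        self.lstm = nn.LSTM(input_size=64, hidden_size=64, batch_first=True)
        self.fc = nn.Linear(64, n_classes)

    def forward(self, x):                    # x: (batch, channels, time)
        f = self.cnn(x)                      # (batch, 64, time // 8)
        f = f.transpose(1, 2)                # (batch, time // 8, 64)
        out, _ = self.lstm(f)
        return self.fc(out[:, -1])           # logits from last step
```

<p>The BiLSTM variants would pass bidirectional=True to the LSTM and double the classifier input width; training would then use Adam with a learning rate of 0.001 and cross-entropy loss, as stated above.</p>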
<p>Among the machine learning methods, reliance on manually extracted features inevitably causes partial loss of EEG information, directly limiting the accuracy of traditional algorithms such as SVM and RF. However, LightGBM, with its efficient decision tree mechanism, achieves high classification accuracy while keeping the model parameters extremely small under the same EEG feature input, reflecting the effectiveness of lightweight models in feature utilization.</p>
<p>Among the deep learning methods, the 3-layer CNN-LSTM and 3-layer CNN-BiLSTM models improve classification accuracy over traditional machine learning methods by fusing the temporal features of EEG signals, which verifies the critical impact of temporal information on classification performance. The training loss is shown in <xref ref-type="fig" rid="F9">Figure 9</xref>, from which it can be seen that the Multi-Attention Convolutional Neural Network (MACNN) converges more slowly than both CNN-LSTM and the proposed algorithm, and its final accuracy is also lower than that of the proposed algorithm; in addition, the MACNN algorithm takes longer per sample. The proposed algorithm further breaks through the limitation of a single feature dimension via a multi-scale feature extraction strategy. In horizontal comparison, although its accuracy is slightly lower than that of 3CNN-BiLSTM, it is stronger in terms of model lightweightness and computation time. In summary, the proposed method shows significant advantages across three dimensions, classification accuracy, parameter count, and recognition time, providing a more practical solution for EEG-based depression classification.</p>
<fig position="float" id="F9">
<label>Figure 9</label>
<caption><p>Training loss curves of CNN-LSTM, CNN-BiLSTM and LightGBM.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-19-1651762-g0009.tif">
<alt-text>Line graph comparing loss over 100 epochs for four models: CNN-LSTM, CNN-BiLSTM, MACNN, and LightGBM. The y-axis represents loss, ranging from 0 to 5, while the x-axis represents epochs. All models show a downward trend, and it can be seen that the LightGBM model has the smallest final loss.</alt-text>
</graphic>
</fig>
<p>A one-way analysis of variance was used to measure the level of significant difference between the comparative methods and the proposed method. As shown in <xref ref-type="fig" rid="F10">Figure 10</xref>, there are significant differences (<italic>p</italic> &#x02264; 0.001) between the proposed method and the other comparative methods. These comparison algorithms achieve lower recognition accuracy than the proposed method on the same dataset, with only 3CNN-BiLSTM being slightly higher, by 0.3%. However, as can be seen from <xref ref-type="table" rid="T3">Table 3</xref>, the computation time of 3CNN-BiLSTM is 2.90 ms, whereas that of the proposed algorithm is only 2.23 ms. Considering both accuracy and timeliness, the proposed algorithm is effective for this classification task.</p>
<fig position="float" id="F10">
<label>Figure 10</label>
<caption><p>Significance testing and comparative results of different recognition methods. Significance markers denote statistical differences (paired <italic>t</italic>-test): <sup>&#x0002A;&#x0002A;&#x0002A;</sup><italic>p</italic> &#x0003C; 0.001.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fnins-19-1651762-g0010.tif">
<alt-text>Bar chart comparing the accuracy percentages of various machine learning methods: SVM, RF, LightGBM, 2CNN-LSTM, 3CNN-LSTM, 2CNN-BiLSTM, 3CNN-BiLSTM, and &#x0201C;Our method.&#x0201D; All methods show high accuracy, with &#x0201C;Our method&#x0201D; performing slightly better. Significance levels are indicated above the bars.</alt-text>
</graphic>
</fig>
</sec>
<sec>
<title>3.4 Ablation and feature selection experiments</title>
<p>To further screen the influence of features and verify the effectiveness of each layer in the proposed model, we conducted ablation experiments, and the results are shown in <xref ref-type="table" rid="T4">Table 4</xref>.</p>
<table-wrap position="float" id="T4">
<label>Table 4</label>
<caption><p>Results of ablation experiments and feature selection.</p></caption>
<table frame="box" rules="all">
<thead>
<tr>
<th valign="top" align="left"><bold>Indicator parameters</bold></th>
<th valign="top" align="center"><bold>Acc/%</bold></th>
<th valign="top" align="center"><bold>Pre/%</bold></th>
<th valign="top" align="center"><bold>Re/%</bold></th>
<th valign="top" align="center"><bold>F1/%</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">lightGBM &#x0002B;RawData</td>
<td valign="top" align="center">93.78 &#x000B1; 0.02</td>
<td valign="top" align="center">93.62 &#x000B1; 0.03</td>
<td valign="top" align="center">93.72 &#x000B1; 0.01</td>
<td valign="top" align="center">93.64 &#x000B1; 0.02</td>
</tr>
<tr>
<td valign="top" align="left">lightGBM &#x0002B;VMD</td>
<td valign="top" align="center">95.27 &#x000B1; 0.25</td>
<td valign="top" align="center">95.32 &#x000B1; 0.22</td>
<td valign="top" align="center">95.18 &#x000B1; 0.20</td>
<td valign="top" align="center">95.23 &#x000B1; 0.18</td>
</tr>
<tr>
<td valign="top" align="left">lightGBM &#x0002B;VMD &#x0002B; Sample entropy</td>
<td valign="top" align="center">97.42 &#x000B1; 0.27</td>
<td valign="top" align="center">97.25 &#x000B1; 0.22</td>
<td valign="top" align="center">97.33 &#x000B1; 0.22</td>
<td valign="top" align="center">97.33 &#x000B1; 0.21</td>
</tr>
<tr>
<td valign="top" align="left">lightGBM &#x0002B;VMD &#x0002B; RMS</td>
<td valign="top" align="center">96.58 &#x000B1; 0.18</td>
<td valign="top" align="center">96.37 &#x000B1; 0.21</td>
<td valign="top" align="center">96.41 &#x000B1; 0.15</td>
<td valign="top" align="center">96.53 &#x000B1; 0.16</td>
</tr>
<tr>
<td valign="top" align="left">lightGBM &#x0002B;VMD &#x0002B; PSD</td>
<td valign="top" align="center">96.28 &#x000B1; 0.16</td>
<td valign="top" align="center">96.18 &#x000B1; 0.17</td>
<td valign="top" align="center">96.16 &#x000B1; 0.14</td>
<td valign="top" align="center">96.13 &#x000B1; 0.12</td>
</tr></tbody>
</table>
</table-wrap>
<p>In terms of dataset division, to avoid high feature similarity between training and test data caused by the sliding windows, we divided the entire sample set into two parts at a 5:5 ratio: the first 50% of the samples form the training set and the latter 50% the test set. Meanwhile, 70% of the training set was used for model training and the remaining 30% for validation. First, we input the original EEG data, without any feature extraction, into the LightGBM model, achieving an accuracy of 93.78%; this reflects the inherent difference between patients and healthy people in the binary classification data. Then, we added VMD for modal decomposition with <italic>K</italic> = 4, so each channel yielded 4 modal vectors and the resulting 12-dimensional features were input into the model for classification; repeated measurements showed a classification accuracy of 95.27%. Finally, after adding sample entropy, the overall accuracy increased by a further 2.15%, which proves the effectiveness of the sample entropy feature.</p>
<p>To further verify the effectiveness of the features, we compared three common features: sample entropy, RMS, and PSD. After VMD decomposition into 12-dimensional vector data, we applied a sliding window with a length of 20 data points, extracted the corresponding feature vectors, and input them into the model for recognition. Tests showed that the sample entropy feature performed slightly better than the RMS and PSD features, confirming the suitability of sample entropy for this task.</p>
</sec>
</sec>
<sec sec-type="discussion" id="s4">
<title>4 Discussion</title>
<p>Generalized undifferentiated symptoms, or medically unexplained symptoms (MUS), refer to pain, fatigue, gastrointestinal, cardiovascular, and other symptoms that cannot be fully explained by medical examination and are very common in the elderly population. Combined with case analysis, we found that these medically unexplainable phenomena often occur in middle-aged and elderly people and are mainly caused by mental illnesses, with depression being the most common condition.</p>
<p>Currently, the main medical diagnostic tool for depression is psychological scales, which are highly subjective. Additionally, patients&#x00027; resistance to psychological scales leads to inaccurate judgments. Therefore, EEG signals, as a more objective evaluation criterion, have been widely used in the diagnosis of mental diseases. Conventional detection using 128-channel EEG signals provides relatively complete data features but cannot meet real-time detection requirements. Considering the strong correlation between the prefrontal lobe and emotional processes as well as mental illnesses, three electrodes (Fp1, Fpz, and Fp2) were selected on the prefrontal lobe for measurement. This significantly reduces data volume and improves calculation speed, but the collected data may suffer from decreased judgment accuracy due to incomplete features. Therefore, a new algorithm is needed to improve classification accuracy.</p>
<p>In terms of feature extraction, we chose to decompose EEG signals using VMD and optimized parameters through enumeration to obtain five groups of components as shown in the figures. Since the frequency distribution of IMF5 differs significantly from the remaining components, only IMF1-IMF4 were selected as the main IMF components for feature extraction. Power spectral density was calculated for these five groups of components, revealing that effective feature components are stored in a small number of IMF components. Therefore, we selected the first four components as input features, resulting in a total of 12 groups of feature vectors from three channels. Sample entropy was calculated for these feature vectors to obtain the input features for the model.</p>
<p>After inputting the features into the LightGBM classification model, we considered the impact of the learning rate and the number of leaves on classification accuracy. As shown in <xref ref-type="table" rid="T1">Table 1</xref>, the model achieved the highest classification accuracy with a maximum of 50 leaves and a learning rate of 0.01. To further verify the model&#x00027;s generalization ability, <xref ref-type="table" rid="T2">Table 2</xref> shows that under 5-fold cross-validation the accuracy of the proposed model on each fold of the test set remained above 97.27%, with an average recognition accuracy of 97.42%; the fifth fold achieved the highest accuracy, and the third fold the lowest. Therefore, the proposed model exhibits good stability, and its accuracy on the test set shows that it can distinguish the different categories, further verifying the model&#x00027;s effectiveness.</p>
<p>In the comparative experiments, we selected mainstream machine learning and deep learning algorithms for comparison. The experimental results in <xref ref-type="table" rid="T3">Table 3</xref> show that the proposed method achieves the best classification performance, with an average recognition accuracy of 97.42% and a total time consumption of 2.23 ms. Considering both timeliness and accuracy, it is superior to other algorithms. Although deep learning models have slightly higher accuracy, their complexity leads to longer time consumption. Therefore, to realize engineering applications, the lightweight algorithm proposed in this paper has high application value.</p>
<p>Therefore, the classification model proposed in this paper balances accuracy and real-time performance, and is superior to other common depression detection algorithms, providing a solid foundation for the application of EEG signals in depression emotion detection. In this paper, 1 s of EEG data is used for pre-triage of patients with Medically Unexplained Symptoms to rule out psychological factors such as somatization and depression, which can effectively improve the efficiency of medical diagnosis. Meanwhile, the algorithm proposed in this paper enhances the real-time performance of detection. Although individual differences may lead to a slight decrease in the accuracy of the algorithm, as a front-end module for pre-triage, it provides a solution for EEG signal devices in real-time depression detection and pre-triage of MUS patients.</p>
</sec>
<sec sec-type="conclusions" id="s5">
<title>5 Conclusions</title>
<p>MUS is one of the emerging fields in current research. Among middle-aged and elderly patients, MUS symptoms are most often caused by depression. However, because the symptoms do not meet the international diagnostic criteria for somatized depression, doctors cannot make an effective judgment on depression. This may delay treatment, thereby exacerbating the depression and endangering patients&#x00027; lives. Many scholars currently hope to determine whether a person suffers from depression through EEG signals. However, the complexity of EEG signals, their susceptibility to noise pollution, the large number of channels required for acquisition, and the long computation time all limit the application of EEG in depression diagnosis. To improve the applicability of EEG in the diagnosis of depression, this paper proposes a lightweight model for diagnosing depression using three-channel electroencephalogram (EEG) signals. The signal is decomposed by variational mode decomposition (VMD), and the number of intrinsic mode functions (IMFs) is determined by power spectrum analysis, thereby enhancing the feature dimension of the model. Sample entropy is used to extract features from the decomposed signals, and a classification accuracy of 97.42% is finally achieved. Under 5-fold cross-validation, the model is significantly superior to other traditional algorithms, demonstrating good generalization ability.</p>
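<p>Sample entropy, the feature used above, can be sketched in pure Python as follows. This is a generic SampEn implementation following the standard Richman&#x02013;Moorman definition (Chebyshev distance, self-matches excluded), not the authors&#x00027; code; the default tolerance r = 0.2 &#x000D7; standard deviation is a common convention, not a value taken from this paper.</p>

```python
import math

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r): negative log of the conditional probability that two
    subsequences similar for m points remain similar at m + 1 points."""
    n = len(x)
    if r is None:  # common convention: tolerance = 0.2 * standard deviation
        mu = sum(x) / n
        r = 0.2 * math.sqrt(sum((v - mu) ** 2 for v in x) / n)

    def matches(length):
        # Count template pairs within tolerance r; the same n - m start
        # indices are used for both lengths so the two counts are comparable.
        tpl = [x[i:i + length] for i in range(n - m)]
        c = 0
        for i in range(len(tpl)):
            for j in range(i + 1, len(tpl)):
                if max(abs(a - b) for a, b in zip(tpl[i], tpl[j])) <= r:
                    c += 1
        return c

    b, a = matches(m), matches(m + 1)
    return float("inf") if a == 0 or b == 0 else -math.log(a / b)
```

<p>A perfectly regular sequence yields SampEn of 0, while irregular sequences yield larger values; in the pipeline above, one such value would be computed per IMF of each 1 s EEG segment.</p>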
<p>The fast detection algorithm proposed in this paper uses only 3 channels; in pursuing high timeliness, it acquires only a small amount of data containing few EEG features. To nevertheless achieve high classification accuracy, we use the VMD algorithm to decompose the 3-channel data into 12 dimensions and apply sample entropy for feature extraction, increasing the feature dimension of the signal. This strategy breaks through the limitation of a single feature dimension, achieves the best recognition performance in horizontal comparison, and balances model stability with the requirement of being lightweight. The algorithm therefore provides a solution for real-time depression monitoring with EEG signal equipment.</p>
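<p>The 3-channel-to-12-dimension feature construction described above can be sketched as follows. Only the shape of the pipeline (3 channels &#x000D7; 4 IMFs = 12 features) is taken from this paper: the decomposition shown is a deliberately trivial placeholder standing in for a real VMD implementation, and the variance function is a stand-in for sample entropy.</p>

```python
def placeholder_vmd(signal, k=4):
    """Stand-in for VMD: splits a signal into k strided sub-series.
    A real pipeline would substitute a proper VMD implementation here."""
    return [signal[i::k] for i in range(k)]

def variance(x):
    """Stand-in complexity measure; substitute sample entropy in practice."""
    mu = sum(x) / len(x)
    return sum((v - mu) ** 2 for v in x) / len(x)

def feature_vector(channels, k=4, feature=variance, decompose=placeholder_vmd):
    """3 channels x k IMFs -> one feature per IMF -> 3*k-dimensional vector."""
    return [feature(imf) for ch in channels for imf in decompose(ch, k)]
```

<p>The resulting 12-dimensional vector is what the classifier consumes.</p>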
</sec>
</body>
<back>
<sec sec-type="data-availability" id="s6">
<title>Data availability statement</title>
<p>The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author.</p>
</sec>
<sec sec-type="ethics-statement" id="s7">
<title>Ethics statement</title>
<p>The studies involving humans were approved by Second Hospital of Lanzhou University, China. The studies were conducted in accordance with the local legislation and institutional requirements. The participants provided their written informed consent to participate in this study.</p>
</sec>
<sec sec-type="author-contributions" id="s8">
<title>Author contributions</title>
<p>XG: Conceptualization, Data curation, Funding acquisition, Investigation, Methodology, Writing &#x02013; original draft. ZG: Data curation, Formal analysis, Software, Writing &#x02013; original draft. TX: Supervision, Validation, Writing &#x02013; review &#x00026; editing.</p>
</sec>
<sec sec-type="funding-information" id="s9">
<title>Funding</title>
<p>The author(s) declare that no financial support was received for the research and/or publication of this article.</p>
</sec>
<ack><p>We would like to express our gratitude to the Second Hospital of Lanzhou University for providing the public EEG dataset, as well as all volunteers and staff who have contributed to the dataset.</p>
</ack>
<sec sec-type="COI-statement" id="conf1">
<title>Conflict of interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec sec-type="ai-statement" id="s10">
<title>Generative AI statement</title>
<p>The author(s) declare that no Gen AI was used in the creation of this manuscript.</p>
<p>Any alternative text (alt text) provided alongside figures in this article has been generated by Frontiers with the support of artificial intelligence and reasonable efforts have been made to ensure accuracy, including review by the authors wherever possible. If you identify any issues, please contact us.</p>
</sec>
<sec sec-type="disclaimer" id="s11">
<title>Publisher&#x00027;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Akbari</surname> <given-names>H.</given-names></name> <name><surname>Sadiq</surname> <given-names>M.</given-names></name> <name><surname>Payan</surname> <given-names>M.</given-names></name> <name><surname>Esmaili</surname> <given-names>S.</given-names></name> <name><surname>Baghri</surname> <given-names>H.</given-names></name> <name><surname>Bagheri</surname> <given-names>H.</given-names></name></person-group> (<year>2021</year>). <article-title>Depression detection based on geometrical features extracted from SODP shape of EEG signals and binary PSO</article-title>. <source>Traitement Du Signal</source> <volume>38</volume>, <fpage>13</fpage>&#x02013;<lpage>26</lpage>. <pub-id pub-id-type="doi">10.18280/ts.380102</pub-id><pub-id pub-id-type="pmid">36359630</pub-id></citation></ref>
<ref id="B2">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Alhalaseh</surname> <given-names>R.</given-names></name> <name><surname>Alasasfeh</surname> <given-names>S.</given-names></name></person-group> (<year>2020</year>). <article-title>Machine-learning-based emotion recognition system using EEG signals</article-title>. <source>Computers</source> <volume>9</volume>:<fpage>95</fpage>. <pub-id pub-id-type="doi">10.3390/computers9040095</pub-id></citation>
</ref>
<ref id="B3">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Aydemir</surname> <given-names>E.</given-names></name> <name><surname>Tuncer</surname> <given-names>T.</given-names></name> <name><surname>Dogan</surname> <given-names>S.</given-names></name> <name><surname>Gururajan</surname> <given-names>R.</given-names></name> <name><surname>Acharya</surname> <given-names>U.</given-names></name></person-group> (<year>2021</year>). <article-title>Automated major depressive disorder detection using melamine pattern with EEG signals</article-title>. <source>Appl. Intell</source>. <volume>51</volume>, <fpage>6449</fpage>&#x02013;<lpage>6466</lpage>. <pub-id pub-id-type="doi">10.1007/s10489-021-02426-y</pub-id></citation>
</ref>
<ref id="B4">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bachmann</surname> <given-names>M.</given-names></name> <name><surname>Paeske</surname> <given-names>L.</given-names></name> <name><surname>Kalev</surname> <given-names>K.</given-names></name> <name><surname>Aarma</surname> <given-names>K.</given-names></name> <name><surname>Lehtmets</surname> <given-names>A.</given-names></name> <name><surname>Oopik</surname> <given-names>P.</given-names></name> <etal/></person-group>. (<year>2018</year>). <article-title>Methods for classifying depression in single channel EEG using linear and nonlinear signal analysis</article-title>. <source>Comput. Methods Programs Biomed</source>. <volume>155</volume>, <fpage>11</fpage>&#x02013;<lpage>17</lpage>. <pub-id pub-id-type="doi">10.1016/j.cmpb.2017.11.023</pub-id><pub-id pub-id-type="pmid">29512491</pub-id></citation></ref>
<ref id="B5">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Barsky</surname> <given-names>A.</given-names></name> <name><surname>Orav</surname> <given-names>E.</given-names></name> <name><surname>Bates</surname> <given-names>D.</given-names></name></person-group> (<year>2005</year>). <article-title>Somatization increases medical utilization and costs independent of psychiatric and medical comorbidity</article-title>. <source>Arch. Gen. Psychiatry</source> <volume>62</volume>, <fpage>903</fpage>&#x02013;<lpage>910</lpage>. <pub-id pub-id-type="doi">10.1001/archpsyc.62.8.903</pub-id><pub-id pub-id-type="pmid">16061768</pub-id></citation></ref>
<ref id="B6">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Berger</surname> <given-names>C.</given-names></name> <name><surname>Dueck</surname> <given-names>A.</given-names></name> <name><surname>Perin</surname> <given-names>F.</given-names></name> <name><surname>Wunsch</surname> <given-names>K.</given-names></name> <name><surname>Buchmann</surname> <given-names>J.</given-names></name> <name><surname>Kolch</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2021</year>). <article-title>Brain arousal as measured by EEG-assessment differs between children and adolescents with attention-deficit/hyperactivity disorder (ADHD) and depression</article-title>. <source>Front. Psychiatry</source> <volume>12</volume>:<fpage>633880</fpage>. <pub-id pub-id-type="doi">10.3389/fpsyt.2021.633880</pub-id><pub-id pub-id-type="pmid">34777030</pub-id></citation></ref>
<ref id="B7">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cai</surname> <given-names>H.</given-names></name> <name><surname>Chen</surname> <given-names>Y.</given-names></name> <name><surname>Han</surname> <given-names>J.</given-names></name> <name><surname>Zhang</surname> <given-names>X.</given-names></name> <name><surname>Hu</surname> <given-names>B.</given-names></name></person-group> (<year>2018</year>). <article-title>Study on feature selection methods for depression detection using three-electrode EEG data</article-title>. <source>Interdiscip. Sci. Comput. Life Sci</source>. <volume>10</volume>, <fpage>558</fpage>&#x02013;<lpage>565</lpage>. <pub-id pub-id-type="doi">10.1007/s12539-018-0292-5</pub-id><pub-id pub-id-type="pmid">29728983</pub-id></citation></ref>
<ref id="B8">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cai</surname> <given-names>H.</given-names></name> <name><surname>Gao</surname> <given-names>Y.</given-names></name> <name><surname>Sun</surname> <given-names>S.</given-names></name> <name><surname>Li</surname> <given-names>N.</given-names></name> <name><surname>Hu</surname> <given-names>B.</given-names></name></person-group> (<year>2020</year>). <article-title>MODMA dataset: a multi-modal open dataset for mental-disorder analysis</article-title>. <source>arXiv preprint</source> arXiv:2002.09283.</citation>
</ref>
<ref id="B9">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cao</surname> <given-names>Z.</given-names></name> <name><surname>Lin</surname> <given-names>C.</given-names></name> <name><surname>Ding</surname> <given-names>W.</given-names></name> <name><surname>Chen</surname> <given-names>M.</given-names></name> <name><surname>Li</surname> <given-names>C.</given-names></name> <name><surname>Su</surname> <given-names>T.</given-names></name></person-group> (<year>2019</year>). <article-title>Identifying ketamine responses in treatment-resistant depression using a wearable forehead EEG</article-title>. <source>IEEE Trans. Biomed. Eng</source>. <volume>66</volume>, <fpage>1668</fpage>&#x02013;<lpage>1679</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2018.2877651</pub-id><pub-id pub-id-type="pmid">30369433</pub-id></citation></ref>
<ref id="B10">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Chen</surname> <given-names>F.</given-names></name> <name><surname>Zhao</surname> <given-names>L.</given-names></name> <name><surname>Li</surname> <given-names>B.</given-names></name> <name><surname>Yang</surname> <given-names>L.</given-names></name></person-group> (<year>2020</year>). <article-title>Depression evaluation based on prefrontal EEG signals in resting state using fuzzy measure entropy</article-title>. <source>Physiol. Meas</source>. <volume>41</volume>:<fpage>95007</fpage>. <pub-id pub-id-type="doi">10.1088/1361-6579/abb144</pub-id><pub-id pub-id-type="pmid">33021227</pub-id></citation></ref>
<ref id="B11">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Claassen-van Dessel</surname> <given-names>N.</given-names></name> <name><surname>van der Wouden</surname> <given-names>J.</given-names></name> <name><surname>Hoekstra</surname> <given-names>T.</given-names></name> <name><surname>Dekker</surname> <given-names>J.</given-names></name> <name><surname>van der Horst</surname> <given-names>H.</given-names></name></person-group> (<year>2018</year>). <article-title>The 2-year course of medically unexplained physical symptoms (MUPS) in terms of symptom severity and functional status: results of the PROSPECTS cohort study</article-title>. <source>J. Psychosom. Res</source>. <volume>104</volume>, <fpage>76</fpage>&#x02013;<lpage>87</lpage>. <pub-id pub-id-type="doi">10.1016/j.jpsychores.2017.11.012</pub-id><pub-id pub-id-type="pmid">29275789</pub-id></citation></ref>
<ref id="B12">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Creed</surname> <given-names>F.</given-names></name> <name><surname>Barsky</surname> <given-names>A.</given-names></name></person-group> (<year>2004</year>). <article-title>A systematic review of the epidemiology of somatisation disorder and hypochondriasis</article-title>. <source>J. Psychosom. Res</source>. <volume>56</volume>, <fpage>391</fpage>&#x02013;<lpage>408</lpage>. <pub-id pub-id-type="doi">10.1016/S0022-3999(03)00622-6</pub-id><pub-id pub-id-type="pmid">15094023</pub-id></citation></ref>
<ref id="B13">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>&#x0010C;uki&#x00107;</surname> <given-names>M.</given-names></name> <name><surname>Stoki&#x00107;</surname> <given-names>M.</given-names></name> <name><surname>Simi&#x00107;</surname> <given-names>S.</given-names></name> <name><surname>Pokrajac</surname> <given-names>D.</given-names></name></person-group> (<year>2020</year>). <article-title>The successful discrimination of depression from EEG could be attributed to proper feature extraction and not to a particular classification method</article-title>. <source>Cogn. Neurodyn</source>. <volume>14</volume>, <fpage>443</fpage>&#x02013;<lpage>455</lpage>. <pub-id pub-id-type="doi">10.1007/s11571-020-09581-x</pub-id><pub-id pub-id-type="pmid">32655709</pub-id></citation></ref>
<ref id="B14">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>De la Salle</surname> <given-names>S.</given-names></name> <name><surname>Choueiry</surname> <given-names>J.</given-names></name> <name><surname>Shah</surname> <given-names>D.</given-names></name> <name><surname>Bowers</surname> <given-names>H.</given-names></name> <name><surname>McIntosh</surname> <given-names>J.</given-names></name> <name><surname>Ilivitsky</surname> <given-names>V.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>Effects of ketamine on resting-state EEG activity and their relationship to perceptual/dissociative symptoms in healthy humans</article-title>. <source>Front. Pharmacol</source>. <volume>7</volume>:<fpage>348</fpage>. <pub-id pub-id-type="doi">10.3389/fphar.2016.00348</pub-id><pub-id pub-id-type="pmid">27729865</pub-id></citation></ref>
<ref id="B15">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>de Waal</surname> <given-names>M.</given-names></name> <name><surname>Arnold</surname> <given-names>I.</given-names></name> <name><surname>Eekhof</surname> <given-names>J.</given-names></name> <name><surname>van Hemert</surname> <given-names>A.</given-names></name></person-group> (<year>2004</year>). <article-title>Somatoform disorders in general practice: prevalence, functional impairment and comorbidity with anxiety and depressive disorders</article-title>. <source>Br. J. Psychiatry</source> <volume>184</volume>, <fpage>470</fpage>&#x02013;<lpage>476</lpage>. <pub-id pub-id-type="doi">10.1192/bjp.184.6.470</pub-id><pub-id pub-id-type="pmid">15172939</pub-id></citation></ref>
<ref id="B16">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>DiSantostefano</surname> <given-names>J.</given-names></name></person-group> (<year>2009</year>). <article-title>International classification of diseases 10th revision (ICD-10)</article-title>. <source>J. Nurse Practit.</source> <volume>5</volume>, <fpage>56</fpage>&#x02013;<lpage>57</lpage>. <pub-id pub-id-type="doi">10.1016/j.nurpra.2008.09.020</pub-id></citation>
</ref>
<ref id="B17">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Dragomiretskiy</surname> <given-names>K.</given-names></name> <name><surname>Zosso</surname> <given-names>D.</given-names></name></person-group> (<year>2013</year>). <article-title>Variational mode decomposition</article-title>. <source>IEEE Trans. Signal Process</source>. <volume>62</volume>, <fpage>531</fpage>&#x02013;<lpage>544</lpage>. <pub-id pub-id-type="doi">10.1109/TSP.2013.2288675</pub-id></citation>
</ref>
<ref id="B18">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Ehman</surname> <given-names>E. C.</given-names></name> <name><surname>Johnson</surname> <given-names>G. B.</given-names></name> <name><surname>Villanueva-Meyer</surname> <given-names>J. E.</given-names></name> <name><surname>Cha</surname> <given-names>S.</given-names></name> <name><surname>Leynes</surname> <given-names>A. P.</given-names></name> <name><surname>Larson</surname> <given-names>P. E. Z.</given-names></name> <etal/></person-group>. (<year>2017</year>). <article-title>PET/MRI: where might it replace pet/ct?</article-title> <source>J. Magn. Reson. Imaging</source> <volume>46</volume>, <fpage>1247</fpage>&#x02013;<lpage>1262</lpage>. <pub-id pub-id-type="doi">10.1002/jmri.25711</pub-id><pub-id pub-id-type="pmid">28370695</pub-id></citation></ref>
<ref id="B19">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>El-Dahshan</surname> <given-names>E.</given-names></name> <name><surname>Bassiouni</surname> <given-names>M.</given-names></name> <name><surname>Khare</surname> <given-names>S.</given-names></name> <name><surname>Tan</surname> <given-names>R.</given-names></name> <name><surname>Acharya</surname> <given-names>U.</given-names></name></person-group> (<year>2024</year>). <article-title>ExHyptNet: an explainable diagnosis of hypertension using EfficientNet with PPG signals</article-title>. <source>Expert Syst. Appl</source>. <volume>239</volume>:<fpage>122388</fpage>. <pub-id pub-id-type="doi">10.1016/j.eswa.2023.122388</pub-id></citation>
</ref>
<ref id="B20">
<citation citation-type="book"><person-group person-group-type="author"><name><surname>Fan</surname> <given-names>Y.</given-names></name> <name><surname>Yu</surname> <given-names>R.</given-names></name> <name><surname>Li</surname> <given-names>J.</given-names></name> <name><surname>Zhu</surname> <given-names>J.</given-names></name> <name><surname>Li</surname> <given-names>X.</given-names></name></person-group> (<year>2020</year>). <article-title>&#x0201C;EEG-based mild depression recognition using multi-kernel convolutional and spatial-temporal feature,&#x0201D;</article-title> in <source>2020 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)</source> (<publisher-loc>Seoul, Korea</publisher-loc>: <publisher-name>IEEE</publisher-name>), <fpage>1777</fpage>&#x02013;<lpage>1784</lpage>. <pub-id pub-id-type="doi">10.1109/BIBM49941.2020.9313499</pub-id></citation>
</ref>
<ref id="B21">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Feldmann</surname> <given-names>L.</given-names></name> <name><surname>Piechaczek</surname> <given-names>C.</given-names></name> <name><surname>Grunewald</surname> <given-names>B.</given-names></name> <name><surname>Pehl</surname> <given-names>V.</given-names></name> <name><surname>Bartling</surname> <given-names>J.</given-names></name> <name><surname>Frey</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2018</year>). <article-title>Resting frontal EEG asymmetry in adolescents with major depression: impact of disease state and comorbid anxiety disorder</article-title>. <source>Clin. Neurophysiol</source>. <volume>129</volume>, <fpage>2577</fpage>&#x02013;<lpage>2585</lpage>. <pub-id pub-id-type="doi">10.1016/j.clinph.2018.09.028</pub-id><pub-id pub-id-type="pmid">30415151</pub-id></citation></ref>
<ref id="B22">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Fink</surname> <given-names>P.</given-names></name> <name><surname>Hansen</surname> <given-names>M.</given-names></name> <name><surname>Oxhoj</surname> <given-names>M.</given-names></name></person-group> (<year>2004</year>). <article-title>The prevalence of somatoform disorders among internal medical inpatients</article-title>. <source>J. Psychosom. Res</source>. <volume>56</volume>, <fpage>413</fpage>&#x02013;<lpage>418</lpage>. <pub-id pub-id-type="doi">10.1016/S0022-3999(03)00624-X</pub-id><pub-id pub-id-type="pmid">15094025</pub-id></citation></ref>
<ref id="B23">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Grieve</surname> <given-names>P.</given-names></name> <name><surname>Fifer</surname> <given-names>W.</given-names></name> <name><surname>Cousy</surname> <given-names>N.</given-names></name> <name><surname>Monk</surname> <given-names>C.</given-names></name> <name><surname>Stark</surname> <given-names>R.</given-names></name> <name><surname>Gingrich</surname> <given-names>J.</given-names></name> <etal/></person-group>. (<year>2019</year>). <article-title>Neonatal infant EEG bursts are altered by prenatal maternal depression and serotonin selective reuptake inhibitor use</article-title>. <source>Clin. Neurophysiol</source>. <volume>130</volume>, <fpage>2019</fpage>&#x02013;<lpage>2025</lpage>. <pub-id pub-id-type="doi">10.1016/j.clinph.2019.08.021</pub-id><pub-id pub-id-type="pmid">31539768</pub-id></citation></ref>
<ref id="B24">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Gureje</surname> <given-names>O.</given-names></name> <name><surname>Reed</surname> <given-names>G.</given-names></name></person-group> (<year>2016</year>). <article-title>Bodily distress disorder in ICD-11: problems and prospects</article-title>. <source>World Psychiatry</source> <volume>15</volume>, <fpage>291</fpage>&#x02013;<lpage>292</lpage>. <pub-id pub-id-type="doi">10.1002/wps.20353</pub-id><pub-id pub-id-type="pmid">27717252</pub-id></citation></ref>
<ref id="B25">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Harris</surname> <given-names>A.</given-names></name> <name><surname>Orav</surname> <given-names>E.</given-names></name> <name><surname>Bates</surname> <given-names>D.</given-names></name> <name><surname>Barsky</surname> <given-names>A.</given-names></name></person-group> (<year>2009</year>). <article-title>Somatization increases disability independent of comorbidity</article-title>. <source>J. Gen. Intern. Med</source>. <volume>24</volume>, <fpage>155</fpage>&#x02013;<lpage>161</lpage>. <pub-id pub-id-type="doi">10.1007/s11606-008-0845-0</pub-id><pub-id pub-id-type="pmid">19031038</pub-id></citation></ref>
<ref id="B26">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hasanzadeh</surname> <given-names>F.</given-names></name> <name><surname>Mohebbi</surname> <given-names>M.</given-names></name> <name><surname>Rostami</surname> <given-names>R.</given-names></name></person-group> (<year>2020</year>). <article-title>Graph theory analysis of directed functional brain networks in major depressive disorder based on EEG signal</article-title>. <source>J. Neural Eng</source>. <volume>17</volume>:<fpage>26010</fpage>. <pub-id pub-id-type="doi">10.1088/1741-2552/ab7613</pub-id><pub-id pub-id-type="pmid">32053813</pub-id></citation></ref>
<ref id="B27">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Huijbregts</surname> <given-names>K.</given-names></name> <name><surname>van der Feltz-Cornelis</surname> <given-names>C.</given-names></name> <name><surname>van Marwijk</surname> <given-names>H.</given-names></name> <name><surname>de Jong</surname> <given-names>F.</given-names></name> <name><surname>van der Windt</surname> <given-names>D.</given-names></name> <name><surname>Beekman</surname> <given-names>A.</given-names></name></person-group> (<year>2010</year>). <article-title>Negative association of concomitant physical symptoms with the course of major depressive disorder: a systematic review</article-title>. <source>J. Psychosom. Res</source>. <volume>68</volume>, <fpage>511</fpage>&#x02013;<lpage>519</lpage>. <pub-id pub-id-type="doi">10.1016/j.jpsychores.2009.11.009</pub-id><pub-id pub-id-type="pmid">20488267</pub-id></citation></ref>
<ref id="B28">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hung</surname> <given-names>C.</given-names></name> <name><surname>Liu</surname> <given-names>C.</given-names></name> <name><surname>Yang</surname> <given-names>C.</given-names></name></person-group> (<year>2019</year>). <article-title>Persistent depressive disorder has long-term negative impacts on depression, anxiety, and somatic symptoms at 10-year followup among patients with major depressive disorder</article-title>. <source>J. Affect. Disord</source>. <volume>243</volume>, <fpage>255</fpage>&#x02013;<lpage>261</lpage>. <pub-id pub-id-type="doi">10.1016/j.jad.2018.09.068</pub-id><pub-id pub-id-type="pmid">30248637</pub-id></citation></ref>
<ref id="B29">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Klooster</surname> <given-names>D.</given-names></name> <name><surname>Voetterl</surname> <given-names>H.</given-names></name> <name><surname>Baeken</surname> <given-names>C.</given-names></name> <name><surname>Arns</surname> <given-names>M.</given-names></name></person-group> (<year>2023</year>). <article-title>Evaluating robustness of brain stimulation biomarkers for depression: a systematic review of MRI and EEG studies</article-title>. <source>Biol. Psychiatry</source>. <volume>95</volume>, <fpage>553</fpage>&#x02013;<lpage>563</lpage>. <pub-id pub-id-type="doi">10.1016/j.biopsych.2023.09.009</pub-id><pub-id pub-id-type="pmid">37734515</pub-id></citation></ref>
<ref id="B30">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Kurita</surname> <given-names>G.</given-names></name> <name><surname>Sjogren</surname> <given-names>P.</given-names></name> <name><surname>Juel</surname> <given-names>K.</given-names></name> <name><surname>Hojsted</surname> <given-names>J.</given-names></name> <name><surname>Ekholm</surname> <given-names>O.</given-names></name></person-group> (<year>2012</year>). <article-title>The burden of chronic pain: a cross-sectional survey focussing on diseases, immigration, and opioid use</article-title>. <source>Pain</source> <volume>153</volume>, <fpage>2332</fpage>&#x02013;<lpage>2338</lpage>. <pub-id pub-id-type="doi">10.1016/j.pain.2012.07.023</pub-id><pub-id pub-id-type="pmid">22959600</pub-id></citation></ref>
<ref id="B31">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Leiknes</surname> <given-names>K.</given-names></name> <name><surname>Finset</surname> <given-names>A.</given-names></name> <name><surname>Moum</surname> <given-names>T.</given-names></name> <name><surname>Sandanger</surname> <given-names>I.</given-names></name></person-group> (<year>2006</year>). <article-title>Methodological issues concerning lifetime medically unexplained and medically explained symptoms of the composite international diagnostic interview: a prospective 11-year followup study</article-title>. <source>J. Psychosom. Res</source>. <volume>61</volume>, <fpage>169</fpage>&#x02013;<lpage>179</lpage>. <pub-id pub-id-type="doi">10.1016/j.jpsychores.2006.01.007</pub-id><pub-id pub-id-type="pmid">16880019</pub-id></citation></ref>
<ref id="B32">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Leiknes</surname> <given-names>K.</given-names></name> <name><surname>Finset</surname> <given-names>A.</given-names></name> <name><surname>Moum</surname> <given-names>T.</given-names></name> <name><surname>Sandanger</surname> <given-names>I.</given-names></name></person-group> (<year>2007</year>). <article-title>Course and predictors of medically unexplained pain symptoms in the general population</article-title>. <source>J. Psychosom. Res</source>. <volume>62</volume>, <fpage>119</fpage>&#x02013;<lpage>128</lpage>. <pub-id pub-id-type="doi">10.1016/j.jpsychores.2006.08.009</pub-id><pub-id pub-id-type="pmid">17270569</pub-id></citation></ref>
<ref id="B33">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Malviya</surname> <given-names>L.</given-names></name> <name><surname>Mal</surname> <given-names>S.</given-names></name></person-group> (<year>2023</year>). <article-title>CIS feature selection based dynamic ensemble selection model for human stress detection from EEG signals</article-title>. <source>Cluster Comput</source>. <volume>2</volume>, <fpage>1</fpage>&#x02013;<lpage>15</lpage>. <pub-id pub-id-type="doi">10.1007/s10586-023-04008-8</pub-id></citation>
</ref>
<ref id="B34">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mayou</surname> <given-names>R.</given-names></name> <name><surname>Kirmayer</surname> <given-names>L.</given-names></name> <name><surname>Simon</surname> <given-names>G.</given-names></name> <name><surname>Kroenke</surname> <given-names>K.</given-names></name> <name><surname>Sharpe</surname> <given-names>M.</given-names></name></person-group> (<year>2005</year>). <article-title>Somatoform disorders: time for a new approach in DSM-V</article-title>. <source>Am. J. Psychiatry</source> <volume>162</volume>, <fpage>847</fpage>&#x02013;<lpage>855</lpage>. <pub-id pub-id-type="doi">10.1176/appi.ajp.162.5.847</pub-id><pub-id pub-id-type="pmid">15863783</pub-id></citation></ref>
<ref id="B35">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>McFarlane</surname> <given-names>A.</given-names></name> <name><surname>Ellis</surname> <given-names>N.</given-names></name> <name><surname>Barton</surname> <given-names>C.</given-names></name> <name><surname>Browne</surname> <given-names>D.</given-names></name> <name><surname>Van Hooff</surname> <given-names>M.</given-names></name></person-group> (<year>2008</year>). <article-title>The conundrum of medically unexplained symptoms: questions to consider</article-title>. <source>Psychosomatics</source> <volume>49</volume>, <fpage>369</fpage>&#x02013;<lpage>377</lpage>. <pub-id pub-id-type="doi">10.1176/appi.psy.49.5.369</pub-id><pub-id pub-id-type="pmid">18794504</pub-id></citation></ref>
<ref id="B36">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mergl</surname> <given-names>R.</given-names></name> <name><surname>Seidscheck</surname> <given-names>I.</given-names></name> <name><surname>Allgaier</surname> <given-names>A.</given-names></name> <name><surname>Moller</surname> <given-names>H.</given-names></name> <name><surname>Hegerl</surname> <given-names>U.</given-names></name> <name><surname>Henkel</surname> <given-names>V.</given-names></name></person-group> (<year>2007</year>). <article-title>Depressive, anxiety, and somatoform disorders in primary care: prevalence and recognition</article-title>. <source>Depress. Anxiety</source> <volume>24</volume>, <fpage>185</fpage>&#x02013;<lpage>195</lpage>. <pub-id pub-id-type="doi">10.1002/da.20192</pub-id><pub-id pub-id-type="pmid">16900465</pub-id></citation></ref>
<ref id="B37">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mohammadi</surname> <given-names>Y.</given-names></name> <name><surname>Moradi</surname> <given-names>M.</given-names></name></person-group> (<year>2021</year>). <article-title>Prediction of depression severity scores based on functional connectivity and complexity of the EEG signal</article-title>. <source>Clin. EEG Neurosci</source>. <volume>52</volume>, <fpage>52</fpage>&#x02013;<lpage>60</lpage>. <pub-id pub-id-type="doi">10.1177/1550059420965431</pub-id><pub-id pub-id-type="pmid">33040603</pub-id></citation></ref>
<ref id="B38">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Murphy</surname> <given-names>M.</given-names></name> <name><surname>Whitton</surname> <given-names>A. E.</given-names></name> <name><surname>Deccy</surname> <given-names>S.</given-names></name> <name><surname>Ironside</surname> <given-names>M. L.</given-names></name> <name><surname>Rutherford</surname> <given-names>A.</given-names></name> <name><surname>Beltzer</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2020</year>). <article-title>Abnormalities in electroencephalographic microstates are state and trait markers of major depressive disorder</article-title>. <source>Neuropsychopharmacology</source> <volume>45</volume>, <fpage>2030</fpage>&#x02013;<lpage>2037</lpage>. <pub-id pub-id-type="doi">10.1038/s41386-020-0749-1</pub-id><pub-id pub-id-type="pmid">32590838</pub-id></citation></ref>
<ref id="B39">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nusslock</surname> <given-names>R.</given-names></name> <name><surname>Shackman</surname> <given-names>A.</given-names></name> <name><surname>McMenamin</surname> <given-names>B.</given-names></name> <name><surname>Greischar</surname> <given-names>L.</given-names></name> <name><surname>Davidson</surname> <given-names>R.</given-names></name> <name><surname>Kovacs</surname> <given-names>M.</given-names></name></person-group> (<year>2018</year>). <article-title>Comorbid anxiety moderates the relationship between depression history and prefrontal EEG asymmetry</article-title>. <source>Psychophysiology</source> <volume>55</volume>:<fpage>e12953</fpage>. <pub-id pub-id-type="doi">10.1111/psyp.12953</pub-id><pub-id pub-id-type="pmid">28755454</pub-id></citation></ref>
<ref id="B40">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Panier</surname> <given-names>L.</given-names></name> <name><surname>Bruder</surname> <given-names>G.</given-names></name> <name><surname>Svob</surname> <given-names>C.</given-names></name> <name><surname>Wickramaratne</surname> <given-names>P.</given-names></name> <name><surname>Gameroff</surname> <given-names>M.</given-names></name> <name><surname>Weissman</surname> <given-names>M.</given-names></name> <etal/></person-group>. (<year>2020</year>). <article-title>Predicting depression symptoms in families at risk for depression: interrelations of posterior EEG alpha and religion/spirituality</article-title>. <source>J. Affect. Disord</source>. <volume>274</volume>, <fpage>969</fpage>&#x02013;<lpage>976</lpage>. <pub-id pub-id-type="doi">10.1016/j.jad.2020.05.084</pub-id><pub-id pub-id-type="pmid">32664041</pub-id></citation></ref>
<ref id="B41">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Richman</surname> <given-names>J.</given-names></name></person-group> (<year>2011</year>). <article-title>Multivariate neighborhood sample entropy: a method for data reduction and prediction of complex data</article-title>. <source>Methods Enzymol</source>. <volume>487</volume>, <fpage>397</fpage>&#x02013;<lpage>408</lpage>. <pub-id pub-id-type="doi">10.1016/B978-0-12-381270-4.00013-5</pub-id><pub-id pub-id-type="pmid">21187232</pub-id></citation></ref>
<ref id="B42">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Sharpe</surname> <given-names>M.</given-names></name> <name><surname>Mayou</surname> <given-names>R.</given-names></name> <name><surname>Walker</surname> <given-names>J.</given-names></name></person-group> (<year>2006</year>). <article-title>Bodily symptoms: new approaches to classification</article-title>. <source>J. Psychosom. Res</source>. <volume>60</volume>, <fpage>35</fpage>&#x02013;<lpage>36</lpage>. <pub-id pub-id-type="doi">10.1016/j.jpsychores.2006.01.020</pub-id><pub-id pub-id-type="pmid">16581358</pub-id></citation></ref>
<ref id="B43">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Simon</surname> <given-names>G.</given-names></name> <name><surname>VonKorff</surname> <given-names>M.</given-names></name> <name><surname>Piccinelli</surname> <given-names>M.</given-names></name> <name><surname>Fullerton</surname> <given-names>C.</given-names></name> <name><surname>Ormel</surname> <given-names>J.</given-names></name></person-group> (<year>1999</year>). <article-title>An international study of the relation between somatic symptoms and depression</article-title>. <source>N. Engl. J. Med</source>. <volume>341</volume>, <fpage>1329</fpage>&#x02013;<lpage>1335</lpage>. <pub-id pub-id-type="doi">10.1056/NEJM199910283411801</pub-id><pub-id pub-id-type="pmid">10536124</pub-id></citation></ref>
<ref id="B44">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Siuly</surname> <given-names>S.</given-names></name> <name><surname>Khare</surname> <given-names>S. K.</given-names></name> <name><surname>Kabir</surname> <given-names>E.</given-names></name> <name><surname>Sadiq</surname> <given-names>M. T.</given-names></name> <name><surname>Wang</surname> <given-names>H.</given-names></name></person-group> (<year>2024</year>). <article-title>An efficient Parkinson&#x00027;s disease detection framework: leveraging time-frequency representation and AlexNet convolutional neural network</article-title>. <source>Comput. Biol. Med</source>. <volume>174</volume>:<fpage>108462</fpage>. <pub-id pub-id-type="doi">10.1016/j.compbiomed.2024.108462</pub-id><pub-id pub-id-type="pmid">38599069</pub-id></citation></ref>
<ref id="B45">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Steinbrecher</surname> <given-names>N.</given-names></name> <name><surname>Koerber</surname> <given-names>S.</given-names></name> <name><surname>Frieser</surname> <given-names>D.</given-names></name> <name><surname>Hiller</surname> <given-names>W.</given-names></name></person-group> (<year>2011</year>). <article-title>The prevalence of medically unexplained symptoms in primary care</article-title>. <source>Psychosomatics</source> <volume>52</volume>, <fpage>263</fpage>&#x02013;<lpage>271</lpage>. <pub-id pub-id-type="doi">10.1016/j.psym.2011.01.007</pub-id><pub-id pub-id-type="pmid">21565598</pub-id></citation></ref>
<ref id="B46">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wang</surname> <given-names>B.</given-names></name> <name><surname>Kang</surname> <given-names>Y.</given-names></name> <name><surname>Huo</surname> <given-names>D.</given-names></name> <name><surname>Chen</surname> <given-names>D.</given-names></name> <name><surname>Song</surname> <given-names>W.</given-names></name> <name><surname>Zhang</surname> <given-names>F.</given-names></name></person-group> (<year>2023</year>). <article-title>Depression signal correlation identification from different EEG channels based on CNN feature extraction</article-title>. <source>Psychiatry Res. Neuroimaging</source> <volume>328</volume>:<fpage>111582</fpage>. <pub-id pub-id-type="doi">10.1016/j.pscychresns.2022.111582</pub-id><pub-id pub-id-type="pmid">36565553</pub-id></citation></ref>
<ref id="B47">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wittchen</surname> <given-names>H.</given-names></name> <name><surname>Jacobi</surname> <given-names>F.</given-names></name> <name><surname>Rehm</surname> <given-names>J.</given-names></name> <name><surname>Gustavsson</surname> <given-names>A.</given-names></name> <name><surname>Svensson</surname> <given-names>M.</given-names></name> <name><surname>Jonsson</surname> <given-names>B.</given-names></name> <etal/></person-group>. (<year>2011</year>). <article-title>The size and burden of mental disorders and other disorders of the brain in Europe 2010</article-title>. <source>Eur. Neuropsychopharmacol</source>. <volume>21</volume>, <fpage>655</fpage>&#x02013;<lpage>679</lpage>. <pub-id pub-id-type="doi">10.1016/j.euroneuro.2011.07.018</pub-id><pub-id pub-id-type="pmid">21896369</pub-id></citation></ref>
<ref id="B48">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>Z.</given-names></name> <name><surname>Meng</surname> <given-names>Q.</given-names></name> <name><surname>Jin</surname> <given-names>L.</given-names></name> <name><surname>Wang</surname> <given-names>H.</given-names></name> <name><surname>Hou</surname> <given-names>H.</given-names></name></person-group> (<year>2024</year>). <article-title>A novel EEG-based graph convolution network for depression detection: incorporating secondary subject partitioning and attention mechanism</article-title>. <source>Expert Syst. Appl</source>. <volume>239</volume>:<fpage>122356</fpage>. <pub-id pub-id-type="doi">10.1016/j.eswa.2023.122356</pub-id></citation>
</ref>
<ref id="B49">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhu</surname> <given-names>J.</given-names></name> <name><surname>Wang</surname> <given-names>Y.</given-names></name> <name><surname>La</surname> <given-names>R.</given-names></name> <name><surname>Zhan</surname> <given-names>J.</given-names></name> <name><surname>Niu</surname> <given-names>J.</given-names></name> <name><surname>Zeng</surname> <given-names>S.</given-names></name> <etal/></person-group>. (<year>2019</year>). <article-title>Multimodal mild depression recognition based on EEG-EM synchronization acquisition network</article-title>. <source>IEEE Access</source> <volume>7</volume>, <fpage>28196</fpage>&#x02013;<lpage>28210</lpage>. <pub-id pub-id-type="doi">10.1109/ACCESS.2019.2901950</pub-id></citation>
</ref>
<ref id="B50">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zuchowicz</surname> <given-names>U.</given-names></name> <name><surname>Wozniak-Kwasniewska</surname> <given-names>A.</given-names></name> <name><surname>Szekely</surname> <given-names>D.</given-names></name> <name><surname>Olejarczyk</surname> <given-names>E.</given-names></name> <name><surname>David</surname> <given-names>O.</given-names></name></person-group> (<year>2019</year>). <article-title>EEG phase synchronization in persons with depression subjected to transcranial magnetic stimulation</article-title>. <source>Front. Neurosci</source>. <volume>12</volume>:<fpage>1037</fpage>. <pub-id pub-id-type="doi">10.3389/fnins.2018.01037</pub-id><pub-id pub-id-type="pmid">30692906</pub-id></citation></ref>
</ref-list>
</back>
</article>