<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" article-type="research-article">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Big Data</journal-id>
<journal-title>Frontiers in Big Data</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Big Data</abbrev-journal-title>
<issn pub-type="epub">2624-909X</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fdata.2021.752406</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Big Data</subject>
<subj-group>
<subject>Original Research</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Runoff Forecasting Using Machine-Learning Methods: Case Study in the Middle Reaches of Xijiang River</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name><surname>Xiao</surname> <given-names>Lu</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name><surname>Zhong</surname> <given-names>Ming</given-names></name>
<xref ref-type="aff" rid="aff1"><sup>1</sup></xref>
<xref ref-type="aff" rid="aff2"><sup>2</sup></xref>
<xref ref-type="corresp" rid="c001"><sup>&#x0002A;</sup></xref>
<uri xlink:href="http://loop.frontiersin.org/people/1329063/overview"/>
</contrib>
<contrib contrib-type="author">
<name><surname>Zha</surname> <given-names>Dawei</given-names></name>
<xref ref-type="aff" rid="aff3"><sup>3</sup></xref>
</contrib>
</contrib-group>
<aff id="aff1"><sup>1</sup><institution>Department of Land Resources and Environment, School of Geography and Planning, Sun Yat-sen University</institution>, <addr-line>Guangzhou</addr-line>, <country>China</country></aff>
<aff id="aff2"><sup>2</sup><institution>Southern Marine Science and Engineering Guangdong Laboratory (Zhuhai)</institution>, <addr-line>Zhuhai</addr-line>, <country>China</country></aff>
<aff id="aff3"><sup>3</sup><institution>Pearl River Water Resources Research Institute</institution>, <addr-line>Guangzhou</addr-line>, <country>China</country></aff>
<author-notes>
<fn fn-type="edited-by"><p>Edited by: Naijun Zhou, University of Maryland, College Park, United States</p></fn>
<fn fn-type="edited-by"><p>Reviewed by: Zihan Lin, Michigan State University, United States; Xiaoming Guo, Henan University, China</p></fn>
<corresp id="c001">&#x0002A;Correspondence: Ming Zhong <email>zhongm37&#x00040;mail.sysu.edu.cn</email></corresp>
<fn fn-type="other" id="fn001"><p>This article was submitted to Data-driven Climate Sciences, a section of the journal Frontiers in Big Data</p></fn></author-notes>
<pub-date pub-type="epub">
<day>04</day>
<month>02</month>
<year>2022</year>
</pub-date>
<pub-date pub-type="collection">
<year>2021</year>
</pub-date>
<volume>4</volume>
<elocation-id>752406</elocation-id>
<history>
<date date-type="received">
<day>03</day>
<month>08</month>
<year>2021</year>
</date>
<date date-type="accepted">
<day>17</day>
<month>12</month>
<year>2021</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#x000A9; 2022 Xiao, Zhong and Zha.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Xiao, Zhong and Zha</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/"><p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p></license> </permissions>
<abstract>
<p>Runoff forecasting is useful for flood early warning and water resource management. In this study, backpropagation (BP) neural network, generalized regression neural network (GRNN), extreme learning machine (ELM), and wavelet neural network (WNN) models were employed, and a high-accuracy runoff forecasting model was developed at Wuzhou station in the middle reaches of Xijiang River. The GRNN model was selected as the optimal runoff forecasting model and was also used to predict the streamflow and water level by considering the flood propagation time. Results show that (1) the GRNN model presents the best performance for streamflow forecasting at the 7-day lead time; (2) the WNN model shows the highest accuracy for water level forecasting at the 7-day lead time; (3) the GRNN model performs well in runoff forecasting when flood propagation time is considered, increasing the Qualification Rate (<italic>QR</italic>) of the mean streamflow and water level forecasts to 98.36 and 82.74%, respectively, and provides a scientific explanation for the underestimation of peak streamflow and water level. This research proposes a high-accuracy runoff forecasting model using machine learning, which would improve early warning capabilities for floods and droughts; the results also lay an important foundation for mid- and long-term runoff forecasting.</p></abstract>
<kwd-group>
<kwd>streamflow</kwd>
<kwd>water level</kwd>
<kwd>forecast</kwd>
<kwd>machine learning</kwd>
<kwd>wavelet neural network (WNN)</kwd>
<kwd>generalized regression neural network (GRNN)</kwd>
</kwd-group>
<counts>
<fig-count count="7"/>
<table-count count="4"/>
<equation-count count="16"/>
<ref-count count="30"/>
<page-count count="11"/>
<word-count count="6134"/>
</counts>
</article-meta>
</front>
<body>
<sec sec-type="intro" id="s1">
<title>Introduction</title>
<p>Runoff forecasting is the foundation of water resource management, deployment, and efficient utilization. It is of great significance to reservoir operation, water resource emergency scheduling, hydro-power generation, and irrigation management decisions (Niu et al., <xref ref-type="bibr" rid="B15">2018</xref>). River runoff is sensitive to various factors, such as catchment response times and the accuracy of meteorological forecasts, which vary in time and carry uncertainty (Lima et al., <xref ref-type="bibr" rid="B11">2016</xref>). Accurate forecasting becomes even more difficult when extreme climatic events occur. Hydrological models provide important support for runoff forecasting by simulating and forecasting the runoff process from the perspective of physical mechanisms. However, driving a hydrological model relies on a large amount of meteorological data and watershed characteristic parameters as input; the forecasting process is relatively complicated, and its accuracy is limited by the accuracy and completeness of the data (Nourani, <xref ref-type="bibr" rid="B16">2017</xref>).</p>
<p>With the evolution of big data, runoff forecasting methods have become increasingly diversified, and research interest in artificial intelligence based on big data has surged. Compared with traditional hydrological models, machine-learning models offer high accuracy, high efficiency, and convenient application, so they have been widely used in runoff forecasting and have achieved good forecasting results. The major machine-learning models applied to runoff forecasting include artificial neural networks (ANNs), support vector machine (SVM), support vector regression (SVR), and neuro-fuzzy models (Mosavi et al., <xref ref-type="bibr" rid="B13">2018</xref>). Badrzadeh et al. (<xref ref-type="bibr" rid="B2">2015</xref>) applied four different types of ANNs to forecast real-time floods at Casino station on Richmond River, Australia. Tongal and Booij (<xref ref-type="bibr" rid="B21">2018</xref>) developed a simulation framework by coupling a baseflow separation method to three machine-learning methods and discussed in detail the performance of the models in simulating and forecasting streamflow with regard to model types, input structures, and catchment dynamics. Shortridge et al. (<xref ref-type="bibr" rid="B20">2016</xref>) utilized multiple regression and machine-learning approaches to simulate monthly streamflow in five highly seasonal rivers in the highlands of Ethiopia and compared their performance in terms of predictive accuracy, error structure and bias, model interpretability, and uncertainty when faced with extreme climate conditions. Guo et al. (<xref ref-type="bibr" rid="B6">2011</xref>) proposed an improved SVM model with adaptive insensitive factors to predict monthly streamflow. Yaseen et al. (<xref ref-type="bibr" rid="B25">2016</xref>) explored the potential of the extreme learning machine (ELM) method for forecasting monthly streamflow discharge rates in the Tigris River, Iraq, and ELM showed better forecasting performance than SVR and generalized regression neural network (GRNN) models.</p>
<p>The Xijiang River is the longest mainstream of the Pearl River. To investigate the runoff mechanism in the context of climate change, many studies have been conducted on projecting hydrological processes and responses in the Xijiang River basin. Wu et al. (<xref ref-type="bibr" rid="B23">2015</xref>) investigated the changes in hydrological drought frequency over the Xijiang River basin through an analysis of daily streamflow data observed at major hydrological stations along the river. Yuan et al. (<xref ref-type="bibr" rid="B27">2017</xref>) established a modeling chain framework to project future hydrological changes in the Xijiang River basin and found that extreme low flow would undergo a considerable reduction in the future, indicating that drought risk in the Xijiang River basin was expected to increase significantly. Zhu et al. (<xref ref-type="bibr" rid="B30">2019</xref>) analyzed the correlation between monthly streamflow and monthly rainfall in the Xijiang River using several correlation tests and clarified that changes in monthly discharge are still controlled by natural precipitation variations in Xijiang&#x00027;s fluvial system. With the frequent occurrence of extreme hydrological events caused by climate change and the increasing impact of human activities on natural river runoff, the hydrological process in the Xijiang River basin is becoming more random and complicated. Therefore, high-precision runoff forecasting is of great significance for grasping future flood and drought conditions of the whole Pearl River basin and for ensuring the coordination of water resources.</p>
<p>In this study, a combination of hydrological data and meteorological factors was used as input, and four different machine-learning models, including backpropagation (BP) neural network, GRNN, ELM, and wavelet neural network (WNN) models, were applied to forecast mean streamflow and water level at the 7-day lead time. Moreover, to improve forecast accuracy, the flood propagation mechanism was considered. The objectives of this study are as follows: (1) to propose a more reliable runoff forecasting model; (2) to improve the accuracy and efficiency of runoff forecasting; and (3) to explore the relationship between the flood propagation mechanism and runoff in the basin. These findings are expected to provide more accurate guidance for the early warning of floods and droughts.</p>
</sec>
<sec sec-type="materials" id="s2">
<title>Materials</title>
<sec>
<title>Study Area</title>
<p>The Xijiang River is the largest river in Guangxi, China. The total area of the river basin in Guangxi is 20.21 &#x000D7; 10<sup>4</sup> km<sup>2</sup>, accounting for 85.39% of the total land area of Guangxi. The basin area above Wuzhou station is 32.70 &#x000D7; 10<sup>4</sup> km<sup>2</sup>, accounting for 92.88% of the total area of the whole Xijiang River basin. Runoff in the basin is unevenly distributed throughout the year. The wet season lasts from April to September, during which streamflow accounts for about 78% of the annual total; the dry season lasts from October to March of the following year, during which streamflow accounts for the remaining 22%. The lowest mean monthly streamflow usually occurs between December and February, mostly in January.</p>
<p>The study area chosen in this research is the Wuxuan-Wuzhou reach of the Xijiang River in Guangxi, with a total length of about 247 km, which includes the Qianjiang, Xunjiang, and Xijiang River sections. The hydrographic stations from upstream to downstream are Wuxuan station, Dahuangjiangkou station, and Wuzhou station (<xref ref-type="fig" rid="F1">Figure 1</xref>).</p>
<fig id="F1" position="float">
<label>Figure 1</label>
<caption><p>Map of the study area.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fdata-04-752406-g0001.tif"/>
</fig>
</sec>
<sec>
<title>Data Collection</title>
<p>In this study, daily time series of mean streamflow and water level data from 2009 to 2019 measured at Wuxuan station, Dahuangjiangkou station, and Wuzhou station were utilized. The Wuzhou meteorological data were collected from the China Meteorological Data Service Center (<ext-link ext-link-type="uri" xlink:href="http://data.cma.cn">http://data.cma.cn</ext-link>), which include daily precipitation (P), average air pressure (PRS), average temperature, mean water vapor pressure (WVP), mean relative humidity (RH), and maximum wind speed (U<sub>max</sub>). After preprocessing operations, such as interpolation, gap filling, and deletion, the data during 2009&#x02013;2017 were selected for training and the remainder during 2018&#x02013;2019 for testing.</p>
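The chronological split described above can be sketched as follows (an illustrative sketch, not the authors' preprocessing pipeline; the record layout and all numeric values are placeholders, not observed data):

```python
from datetime import date

# Hypothetical daily records: (date, mean streamflow, water level).
# Values are illustrative placeholders only.
records = [
    (date(2009, 1, 1), 4370.0, 5.21),
    (date(2017, 12, 31), 5120.0, 6.04),
    (date(2018, 6, 15), 21400.0, 14.88),
    (date(2019, 7, 2), 18900.0, 13.52),
]

# Chronological split: 2009-2017 for training, 2018-2019 for testing.
train = [r for r in records if 2009 <= r[0].year <= 2017]
test = [r for r in records if 2018 <= r[0].year <= 2019]
```

A chronological (rather than random) split is the natural choice here, since it mimics operational forecasting: the model is trained only on data available before the test period.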
</sec>
</sec>
<sec sec-type="methods" id="s3">
<title>Methodology</title>
<p>The ANN is a technique that imitates signal processing in the human brain. It is used as a black-box model with the ability to learn non-linear relationships between system inputs and outputs. ANNs can efficiently deal with correlation problems when the physical processes are not understood or are very complex (Pliego Marugan et al., <xref ref-type="bibr" rid="B18">2018</xref>). The generalization capability of an ANN allows it to process unseen data quickly and simply after learning from a few measured data sets. The BP neural network, GRNN, ELM, and WNN models belong to four different types of ANNs (Elsheikh et al., <xref ref-type="bibr" rid="B5">2019</xref>; Lee et al., <xref ref-type="bibr" rid="B9">2019</xref>). These models have been widely used in runoff forecasting, which demonstrates their applicability to accurate prediction (Modaresi et al., <xref ref-type="bibr" rid="B12">2018</xref>; Mosavi et al., <xref ref-type="bibr" rid="B13">2018</xref>; Yaseen et al., <xref ref-type="bibr" rid="B24">2018</xref>; Zhang et al., <xref ref-type="bibr" rid="B29">2018</xref>; Pradhan et al., <xref ref-type="bibr" rid="B19">2020</xref>).</p>
<sec>
<title>Backpropagation Neural Network</title>
<p>Backpropagation neural network is a kind of multi-layer forward neural network based on BP. As a typical machine-learning algorithm of the ANN family, the BP neural network architecture includes an input layer, hidden layers, and an output layer. Each layer is composed of several neurons (nodes), and the output value of each node is determined by its input value, transfer function, and threshold value. The learning process of the BP neural network includes two stages (Bisoyi et al., <xref ref-type="bibr" rid="B3">2019</xref>): information forward propagation and error <bold>back propagation</bold>. In the forward propagation stage, the input information is transmitted from the input layer through the hidden layers to the output layer, and after the transfer-function operation the output value is compared with the expected value. If there is an error, the error propagates back along the original connection path; the weights of the neurons in each layer are modified layer by layer to reduce the error, and this loop continues until the output meets the accuracy requirements (Hameed et al., <xref ref-type="bibr" rid="B7">2017</xref>).</p>
<p>The node number of the hidden layer can be determined by Zhang et al. (<xref ref-type="bibr" rid="B29">2018</xref>):</p>
<disp-formula id="E1"><label>(1)</label><mml:math id="M1"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>l</mml:mi><mml:mo>&#x0003C;</mml:mo><mml:msqrt><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>m</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mi>n</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>&#x0002B;</mml:mo><mml:mi>a</mml:mi></mml:mrow></mml:msqrt></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>where <italic>l</italic> is the node number of the hidden layer; <italic>m</italic> is the node number of the output layer; <italic>n</italic> is the node number of the input layer; and <italic>a</italic> is a constant taking any value from 1 to 10. The optimal value of <italic>l</italic> is determined by trial calculation. The training parameters were assigned as follows: the learning rate was set to 0.01, the maximum number of training steps to 5,000, and the minimum error to 10<sup>&#x02212;5</sup>.</p>
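The training loop described above can be sketched as follows (a minimal illustrative implementation, not the authors' code): a one-hidden-layer network trained by gradient descent with the stated learning rate, step limit, and error tolerance.

```python
import numpy as np

def train_bp(X, y, n_hidden, lr=0.01, max_steps=5000, tol=1e-5, seed=0):
    """Minimal one-hidden-layer BP regression network: forward
    propagation, then error backpropagation with layer-by-layer
    weight updates, looping until the error tolerance is met."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, float)
    y = np.asarray(y, float).reshape(-1, 1)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
    b2 = np.zeros(1)
    for _ in range(max_steps):
        h = np.tanh(X @ W1 + b1)               # forward: hidden layer
        err = (h @ W2 + b2) - y                # forward: output minus target
        if float(np.mean(err ** 2)) < tol:     # minimum-error stopping rule
            break
        g_out = 2.0 * err / len(X)             # backward: output-layer error
        g_h = (g_out @ W2.T) * (1.0 - h ** 2)  # backward: hidden-layer error
        W2 -= lr * h.T @ g_out; b2 -= lr * g_out.sum(axis=0)
        W1 -= lr * X.T @ g_h;  b1 -= lr * g_h.sum(axis=0)
    h = np.tanh(X @ W1 + b1)
    mse = float(np.mean(((h @ W2 + b2) - y) ** 2))
    return W1, b1, W2, b2, mse
```

For a hypothetical network with, say, n = 7 input nodes and m = 1 output node, Equation (1) gives l &lt; sqrt(7 + 1) + a, i.e., roughly 4 to 12 candidate hidden-layer sizes to compare by trial calculation.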
</sec>
<sec>
<title>Generalized Regression Neural Network</title>
<p>Generalized regression neural network is a kind of radial basis neural network, which has strong non-linear mapping ability, flexible network structure, high fault tolerance, and robustness. It is suitable for solving non-linear problems. GRNN has better performance than traditional radial basis function (RBF) networks in terms of approximation ability and learning speed. The network converges to the optimized regression surface with more samples accumulated, and also has better simulation results when processing fewer samples (Li et al., <xref ref-type="bibr" rid="B10">2013</xref>).</p>
<p>Generalized regression neural network model structure consists of the input layer, pattern layer, summation layer, and output layer. The procedure of the GRNN can be represented as (Cigizoglu and Alp, <xref ref-type="bibr" rid="B4">2006</xref>):</p>
<p>If <italic>f</italic> (<italic>x, y</italic>) represents the known joint continuous probability density function of a vector random variable <italic>x</italic> and a scalar random variable <italic>y</italic>, the conditional mean of <italic>y</italic> given <italic>X</italic> (also called the regression of <italic>y</italic> on <italic>X</italic>) is given by</p>
<disp-formula id="E2"><label>(2)</label><mml:math id="M2"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>&#x00176;</mml:mi><mml:mo>=</mml:mo><mml:mi>E</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>y</mml:mi><mml:mo>|</mml:mo><mml:mi>X</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mstyle displaystyle="true"><mml:msubsup><mml:mrow><mml:mo>&#x0222B;</mml:mo></mml:mrow><mml:mrow><mml:mo>-</mml:mo><mml:mi>&#x0221E;</mml:mi></mml:mrow><mml:mrow><mml:mi>&#x0221E;</mml:mi></mml:mrow></mml:msubsup></mml:mstyle><mml:mi>y</mml:mi><mml:mi>f</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>y</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mi>d</mml:mi><mml:mi>y</mml:mi></mml:mrow><mml:mrow><mml:mstyle displaystyle="true"><mml:msubsup><mml:mrow><mml:mo>&#x0222B;</mml:mo></mml:mrow><mml:mrow><mml:mo>-</mml:mo><mml:mi>&#x0221E;</mml:mi></mml:mrow><mml:mrow><mml:mi>&#x0221E;</mml:mi></mml:mrow></mml:msubsup></mml:mstyle><mml:mi>f</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>y</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mi>d</mml:mi><mml:mi>y</mml:mi></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>Parzen non-parametric estimation was used to estimate the density function <inline-formula><mml:math id="M3"><mml:mover accent="true"><mml:mrow><mml:mi>f</mml:mi></mml:mrow><mml:mo>^</mml:mo></mml:mover><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>y</mml:mi><mml:mtext>&#x000A0;</mml:mtext></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:math></inline-formula>:</p>
<disp-formula id="E3"><label>(3)</label><mml:math id="M4"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mover accent="true"><mml:mrow><mml:mi>f</mml:mi></mml:mrow><mml:mo>^</mml:mo></mml:mover><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>y</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:msup><mml:mrow><mml:mi>n</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mn>2</mml:mn><mml:mi>&#x003C0;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mfrac><mml:mrow><mml:mi>p</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:mfrac></mml:mrow></mml:msup><mml:msup><mml:mrow><mml:mi>&#x003C3;</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>&#x0002B;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:mo class="qopname">exp</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mo>-</mml:mo><mml:mfrac><mml:mrow><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>X</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>T</mml:mi></mml:mrow></mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>X</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo 
stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:msup><mml:mrow><mml:mi>&#x003C3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mtext>&#x02003;</mml:mtext><mml:mo class="qopname">exp</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mo>-</mml:mo><mml:mfrac><mml:mrow><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>Y</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:msup><mml:mrow><mml:mi>&#x003C3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>Substituting <inline-formula><mml:math id="M6"><mml:mover accent="true"><mml:mrow><mml:mi>f</mml:mi></mml:mrow><mml:mo>^</mml:mo></mml:mover><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>y</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:math></inline-formula> into Equation (2) gives:</p>
<disp-formula id="E5"><label>(4)</label><mml:math id="M7"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>&#x00176;</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mstyle displaystyle="true"><mml:msubsup><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msubsup></mml:mstyle><mml:msub><mml:mrow><mml:mi>Y</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo class="qopname">exp</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mo>-</mml:mo><mml:mfrac><mml:mrow><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>X</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>T</mml:mi></mml:mrow></mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>X</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:msup><mml:mrow><mml:mi>&#x003C3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mstyle displaystyle="true"><mml:msubsup><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msubsup></mml:mstyle><mml:mo class="qopname">exp</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mo>-</mml:mo><mml:mfrac><mml:mrow><mml:msup><mml:mrow><mml:mrow><mml:mo 
stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>X</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>T</mml:mi></mml:mrow></mml:msup><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>X</mml:mi><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>X</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:msup><mml:mrow><mml:mi>&#x003C3;</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:mfrac></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>&#x00176; (<italic>X</italic>) is the weighted average of the observed values <italic>Y</italic><sub><italic>i</italic></sub> of all samples, where the weight of each observation is the exponential of the squared Euclidean distance between the corresponding sample <italic>X</italic><sub><italic>i</italic></sub> and <italic>X</italic>. When the smoothing parameter &#x003C3; is large, &#x00176; (<italic>X</italic>) approximates the mean of all the sample dependent variables. Conversely, the smaller &#x003C3; is, the closer &#x00176; (<italic>X</italic>) is to the training samples. When the point to be predicted is included in the training sample set, the predicted value of the dependent variable will be very close to the corresponding dependent variable in the sample. However, for a point that is not included in the training samples, the prediction may be very poor.</p>
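The prediction rule in Equation (4) can be sketched in a few lines (an illustrative sketch, not the authors' implementation; the function name is hypothetical): the GRNN output is a Gaussian-kernel-weighted average of the training targets, controlled by the smoothing parameter sigma.

```python
import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma):
    """GRNN prediction per Equation (4): a Gaussian-weighted
    average of training targets with smoothing parameter sigma."""
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    X_query = np.atleast_2d(np.asarray(X_query, dtype=float))
    preds = []
    for x in X_query:
        d2 = np.sum((X_train - x) ** 2, axis=1)  # squared Euclidean distances
        w = np.exp(-d2 / (2.0 * sigma ** 2))     # pattern-layer activations
        preds.append(np.sum(w * y_train) / np.sum(w))  # summation/output layers
    return np.array(preds)
```

With a small sigma the prediction snaps to the nearest training sample; with a large sigma it approaches the mean of all targets, matching the behavior described above.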
</sec>
<sec>
<title>Extreme Learning Machine</title>
<p>Extreme learning machine is an innovative machine-learning algorithm proposed to address the deficiencies of the single-hidden-layer feedforward neural network (SLFN). The algorithm randomly generates the connection weights between the input layer and the hidden layer and the thresholds of the hidden-layer neurons, and these require no adjustment during training. A unique optimal solution can be obtained simply by setting the number of neurons in the hidden layer. Compared with traditional training methods, ELM has the advantages of fast learning speed and excellent generalization performance (Yaseen et al., <xref ref-type="bibr" rid="B26">2019</xref>; Parisouj et al., <xref ref-type="bibr" rid="B17">2020</xref>; Niu and Fen, <xref ref-type="bibr" rid="B14">2021</xref>).</p>
<p>Mathematically, the ELM model can be summarized by assuming that there are <italic>N</italic> arbitrarily different training samples {(<italic>x</italic><sub>1</sub>, <italic>y</italic><sub>1</sub>), &#x02026;, (<italic>x</italic><sub><italic>t</italic></sub>, <italic>y</italic><sub><italic>t</italic></sub>)}, <italic>t</italic> = 1, 2, &#x02026;, <italic>N</italic>, where <italic>x</italic><sub><italic>t</italic></sub> is the explanatory variable and <italic>y</italic><sub><italic>t</italic></sub> is the response variable, with <italic>x</italic><sub><italic>t</italic></sub> &#x003F5; <bold><italic>R</italic></bold><sup><italic>d</italic></sup> and <italic>y</italic><sub><italic>t</italic></sub> &#x003F5; <bold><italic>R</italic></bold>. The output of the SLFN can be expressed as (Huang et al., <xref ref-type="bibr" rid="B8">2006</xref>):</p>
<disp-formula id="E6"><label>(5)</label><mml:math id="M8"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>L</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msub><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:msub><mml:mrow><mml:mi>g</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>a</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x000B7;</mml:mo><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>b</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>z</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>t</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mo>&#x022EF;</mml:mo><mml:mspace width="0.3em" class="thinspace"/><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>N</mml:mi></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>where <italic>L</italic> is the number of hidden nodes, <italic>g</italic><sub><italic>i</italic></sub> (<italic>a</italic><sub><italic>i</italic></sub>&#x000B7; <italic>x</italic><sub><italic>i</italic></sub> &#x0002B; <italic>b</italic><sub><italic>i</italic></sub>) is the hidden-layer output function (the &#x0201C;Sigmoid&#x0201D; function was chosen in this article), <italic>a</italic><sub><italic>i</italic></sub> is the weight factor connecting the input nodes and the <italic>i</italic>th hidden node, <italic>b</italic><sub><italic>i</italic></sub> is the bias of the <italic>i</italic>th hidden node, <italic>B</italic><sub><italic>i</italic></sub> is the weight factor connecting the <italic>i</italic>th hidden node and the output node, and <italic>z</italic><sub><italic>t</italic></sub> is the output of the <italic>t</italic>th input.</p>
<p>If the feedforward neural network with <italic>L</italic> hidden nodes can approximate the <italic>N</italic> samples with zero error, there exist <italic>a</italic><sub><italic>i</italic></sub>, <italic>b</italic><sub><italic>i</italic></sub>, and <italic>B</italic><sub><italic>i</italic></sub> such that:</p>
<disp-formula id="E7"><label>(6)</label><mml:math id="M9"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>L</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msub><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:msub><mml:mrow><mml:mi>g</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>a</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x000B7;</mml:mo><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x0002B;</mml:mo><mml:msub><mml:mrow><mml:mi>b</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>y</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>t</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mo>&#x022EF;</mml:mo><mml:mspace width="0.3em" class="thinspace"/><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>N</mml:mi></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>Equation (6) can be written compactly as:</p>
<disp-formula id="E8"><label>(7)</label><mml:math id="M10"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>H</mml:mi><mml:mi>B</mml:mi><mml:mo>=</mml:mo><mml:mi>Y</mml:mi></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p><italic>H</italic> is called the hidden layer output matrix of the neural network; the <italic>i</italic>th column of <italic>H</italic> is the output of the <italic>i</italic>th hidden node with respect to inputs <italic>x</italic><sub>1</sub>, <italic>x</italic><sub>2</sub>, &#x02026;, <italic>x</italic><sub>N</sub>. In the ELM model, the input weights and hidden biases are assigned randomly, so the hidden layer output matrix <italic>H</italic> becomes a fixed matrix; training the feedforward neural network is thereby transformed into solving the least-squares problem for the output weight matrix. The minimum-norm least-squares solution of Equation (7) is:</p>
<disp-formula id="E9"><label>(8)</label><mml:math id="M11"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mover accent="true"><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mo>^</mml:mo></mml:mover><mml:mo>=</mml:mo><mml:msup><mml:mrow><mml:mi>H</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x0002B;</mml:mo></mml:mrow></mml:msup><mml:mi>Y</mml:mi></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>where <italic>H</italic><sup>&#x0002B;</sup> is the Moore-Penrose generalized inverse of the hidden layer output matrix <italic>H</italic>.</p>
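To make the ELM training procedure concrete, the following is a minimal NumPy sketch (an illustration under the description above, not the authors' code): input weights and hidden biases are drawn at random, the hidden layer output matrix <italic>H</italic> is formed with the sigmoid activation, and the output weights follow from Equation (8) via the Moore-Penrose pseudoinverse.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def elm_fit(X, Y, L, seed=0):
    """Train a single-hidden-layer ELM.

    Input weights A and hidden biases b are assigned randomly; only the
    output weights B are learned, as the minimum-norm least-squares
    solution B = H+ Y of Equation (7).
    """
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    A = rng.uniform(-1.0, 1.0, size=(n_features, L))  # random input weights a_i
    b = rng.uniform(-1.0, 1.0, size=L)                # random hidden biases b_i
    H = sigmoid(X @ A + b)                            # hidden layer output matrix H
    B = np.linalg.pinv(H) @ Y                         # Moore-Penrose solution (Eq. 8)
    return A, b, B

def elm_predict(X, A, b, B):
    """Forward pass: z = g(X A + b) B, matching Equation (5)."""
    return sigmoid(X @ A + b) @ B
```

Because no iterative tuning of the hidden layer is needed, training reduces to a single pseudoinverse computation, which is the source of ELM's speed.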
</sec>
<sec>
<title>Wavelet Neural Network</title>
<p>Wavelet neural network is a multi-layer feedforward network proposed on the basis of wavelet analysis, which integrates the merits of ANN and wavelet analysis (Zhang and Benveniste, <xref ref-type="bibr" rid="B28">1992</xref>). It not only helps avoid convergence to local optima but also accelerates learning and reduces the number of training iterations. In this study, the Morlet wavelet function is used as the mother wavelet, the BP neural network topology is taken as the basis, and the transfer function of the hidden nodes is replaced by the wavelet function. The weights from the input layer to the hidden layer and the thresholds of the hidden layer are replaced by the scaling factor and translation factor of the wavelet function, respectively (Abghari et al., <xref ref-type="bibr" rid="B1">2012</xref>; Wei et al., <xref ref-type="bibr" rid="B22">2013</xref>).</p>
<p>Assuming that there is a set of input samples <italic>x</italic><sub><italic>i</italic></sub> (<italic>i</italic> =1, 2, &#x02026;, <italic>k</italic>), the output of the hidden layer can be constructed using the equation:</p>
<disp-formula id="E10"><label>(9)</label><mml:math id="M12"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>h</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mi>h</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo stretchy="true">(</mml:mo><mml:mrow><mml:mfrac><mml:mrow><mml:mstyle displaystyle="true"><mml:msubsup><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msubsup></mml:mstyle><mml:msub><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:msub><mml:mrow><mml:mi>x</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>b</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mrow><mml:mi>a</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mfrac></mml:mrow><mml:mo stretchy="true">)</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>where <italic>h</italic>(<italic>j</italic>) is the output of the <italic>j</italic>th hidden layer node, &#x003C9;<sub><italic>ij</italic></sub> is the weight from the input layer to the hidden layer, <italic>h</italic><sub><italic>j</italic></sub> is the wavelet function, <italic>a</italic><sub><italic>j</italic></sub> is the scaling factor of the wavelet function, and <italic>b</italic><sub><italic>j</italic></sub> is the translation factor of the wavelet function.</p>
<p>The calculation formula of the output layer of WNN is:</p>
<disp-formula id="E11"><label>(10)</label><mml:math id="M13"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>y</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msub><mml:mrow><mml:mi>&#x003C9;</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mo>&#x022EF;</mml:mo><mml:mspace width="0.3em" class="thinspace"/><mml:mo>,</mml:mo><mml:mtext>&#x000A0;</mml:mtext><mml:mi>m</mml:mi></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>where &#x003C9;<sub><italic>ik</italic></sub> is the weight from the hidden layer to the output layer, <italic>h</italic>(<italic>i</italic>) is the output of the <italic>i</italic>th hidden layer node, <italic>l</italic> is the number of hidden layer nodes, and <italic>m</italic> is the number of output layer nodes.</p>
<p>A gradient learning algorithm is applied to update the weights, scaling factors, and translation factors; the WNN is trained with these optimized parameters to obtain the optimal output.</p>
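As an illustration of Equations (9) and (10), the sketch below (a hypothetical implementation, not the authors' code) computes a WNN forward pass using the Morlet mother wavelet cos(1.75<italic>t</italic>)&#x000B7;exp(&#x02212;<italic>t</italic><sup>2</sup>/2) as the hidden-node transfer function; the gradient-based updates of the weights, scaling factors, and translation factors are omitted.

```python
import numpy as np

def morlet(t):
    # Morlet mother wavelet, used here as the hidden-node transfer function
    return np.cos(1.75 * t) * np.exp(-t**2 / 2.0)

def wnn_forward(x, W1, a, b, W2):
    """One forward pass of a wavelet neural network.

    x  : input vector of length k
    W1 : (l, k) input-to-hidden weights omega_ij
    a  : (l,) scaling factors a_j of the wavelet function
    b  : (l,) translation factors b_j of the wavelet function
    W2 : (m, l) hidden-to-output weights omega_ik
    """
    t = (W1 @ x - b) / a   # scaled/translated weighted input, Eq. (9)
    h = morlet(t)          # hidden layer outputs h(j)
    return W2 @ h          # output layer values y(k), Eq. (10)
```

Training would repeatedly call this forward pass and adjust `W1`, `W2`, `a`, and `b` along the gradient of the output error, as described above.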
</sec>
<sec>
<title>Verification Model</title>
<p>To evaluate the performance of the four modeling approaches, the following statistical criteria were used:</p>
<list list-type="simple">
<list-item><p>(1) Mean absolute error (<italic>MAE</italic>):</p></list-item>
</list>
<disp-formula id="E12"><label>(11)</label><mml:math id="M14"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>M</mml:mi><mml:mi>A</mml:mi><mml:mi>E</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:mfrac><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:mo>|</mml:mo><mml:msub><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>|</mml:mo></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<list list-type="simple">
<list-item><p>(2) Deterministic coefficient (<italic>DC</italic>):</p></list-item>
</list>
<disp-formula id="E13"><label>(12)</label><mml:math id="M15"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>D</mml:mi><mml:mi>C</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>-</mml:mo><mml:mfrac><mml:mrow><mml:mstyle displaystyle="true"><mml:msubsup><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:msubsup></mml:mstyle><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mstyle displaystyle="true"><mml:msubsup><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:msubsup></mml:mstyle><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:mover accent="false" class="mml-overline"><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mo accent="true">&#x000AF;</mml:mo></mml:mover></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<list list-type="simple">
<list-item><p>(3) Correlation coefficient (<italic>R</italic><sup>2</sup>):</p></list-item>
</list>
<disp-formula id="E14"><label>(13)</label><mml:math id="M16"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:msup><mml:mrow><mml:mi>R</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:msup><mml:mrow><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mfrac><mml:mrow><mml:mstyle displaystyle="true"><mml:msubsup><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:msubsup></mml:mstyle><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:mover accent="false" class="mml-overline"><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mo accent="true">&#x000AF;</mml:mo></mml:mover></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:mover accent="false" class="mml-overline"><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mo accent="true">&#x000AF;</mml:mo></mml:mover></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:msqrt><mml:mrow><mml:mstyle displaystyle="true"><mml:msubsup><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:msubsup></mml:mstyle><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:mover accent="false" class="mml-overline"><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mo accent="true">&#x000AF;</mml:mo></mml:mover></mml:mrow><mml:mo 
stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:msqrt><mml:msqrt><mml:mrow><mml:mstyle displaystyle="true"><mml:msubsup><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:msubsup></mml:mstyle><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:mover accent="false" class="mml-overline"><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mo accent="true">&#x000AF;</mml:mo></mml:mover></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:msqrt></mml:mrow></mml:mfrac></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<list list-type="simple">
<list-item><p>(4) Mean relative error (<italic>MRE</italic>):</p></list-item>
</list>
<disp-formula id="E15"><label>(14)</label><mml:math id="M17"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>M</mml:mi><mml:mi>R</mml:mi><mml:mi>E</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:mfrac><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:mfrac><mml:mrow><mml:mo>|</mml:mo><mml:msub><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>|</mml:mo></mml:mrow><mml:mrow><mml:msub><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mfrac></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<list list-type="simple">
<list-item><p>(5) Root mean square error (<italic>RMSE</italic>):</p></list-item>
</list>
<disp-formula id="E16"><label>(15)</label><mml:math id="M18"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>R</mml:mi><mml:mi>M</mml:mi><mml:mi>S</mml:mi><mml:mi>E</mml:mi><mml:mo>=</mml:mo><mml:msqrt><mml:mrow><mml:mfrac><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:mfrac><mml:mstyle displaystyle="true"><mml:munderover accentunder="false" accent="false"><mml:mrow><mml:mo>&#x02211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi></mml:mrow></mml:munderover></mml:mstyle><mml:msup><mml:mrow><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>-</mml:mo><mml:msub><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow></mml:msqrt></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<list list-type="simple">
<list-item><p>(6) Qualification rate (<italic>QR</italic>):</p></list-item>
</list>
<disp-formula id="E17"><label>(16)</label><mml:math id="M19"><mml:mtable class="eqnarray" columnalign="left"><mml:mtr><mml:mtd><mml:mi>Q</mml:mi><mml:mi>R</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mi>n</mml:mi></mml:mrow><mml:mrow><mml:mi>m</mml:mi></mml:mrow></mml:mfrac><mml:mo>&#x000D7;</mml:mo><mml:mn>100</mml:mn><mml:mi>%</mml:mi></mml:mtd></mml:mtr></mml:mtable></mml:math></disp-formula>
<p>where <italic>O</italic><sub><italic>i</italic></sub> is the <italic>i</italic>th observed value, <italic>P</italic><sub><italic>i</italic></sub> is the model forecast for the <italic>i</italic>th sample, <italic>N</italic> is the number of samples, <inline-formula><mml:math id="M20"><mml:mover accent="false" class="mml-overline"><mml:mrow><mml:mi>O</mml:mi></mml:mrow><mml:mo accent="true">&#x000AF;</mml:mo></mml:mover><mml:mtext>&#x000A0;</mml:mtext></mml:math></inline-formula>is the average of the observed values <italic>O</italic><sub><italic>i</italic></sub>, <inline-formula><mml:math id="M21"><mml:mover accent="false" class="mml-overline"><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mo accent="true">&#x000AF;</mml:mo></mml:mover></mml:math></inline-formula> is the average of the forecasted values <italic>P</italic><sub><italic>i</italic></sub>, <italic>n</italic> is the number of qualified forecasts, and <italic>m</italic> is the total number of forecasts. The closer <italic>MAE</italic> is to 0, the better the prediction. When <italic>DC</italic> is between 0 and 1, a value closer to 1 implies higher consistency between the forecasted and observed values and hence a better model; a <italic>DC</italic> below 0 indicates an unsatisfactory forecast. The closer <italic>R</italic><sup>2</sup> is to 1, the stronger the correlation between the forecasted and observed values. The closer <italic>MRE</italic> is to 0, the better the prediction. The closer <italic>RMSE</italic> is to 0, the smaller the prediction deviation and the more reliable the model. The higher the <italic>QR</italic>, the better the model's predictions.</p>
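The six criteria of Equations (11)&#x02013;(16) can be computed directly from paired observed and forecasted series, as in the sketch below. Note that the relative-error threshold defining a &#x0201C;qualified&#x0201D; forecast for <italic>QR</italic> is an assumption here (the commonly used 20% criterion); the article does not state the criterion at this point.

```python
import numpy as np

def verify(obs, pred, qr_threshold=0.20):
    """Compute MAE, DC, R2, MRE, RMSE, and QR for one model.

    qr_threshold is an assumed relative-error bound for a "qualified"
    forecast; obs must be nonzero for MRE and QR to be defined.
    """
    obs = np.asarray(obs, dtype=float)
    pred = np.asarray(pred, dtype=float)
    err = obs - pred
    mae = np.mean(np.abs(err))                                  # Eq. (11)
    dc = 1.0 - np.sum(err**2) / np.sum((obs - obs.mean())**2)   # Eq. (12)
    r2 = np.corrcoef(obs, pred)[0, 1] ** 2                      # Eq. (13)
    mre = np.mean(np.abs(err) / obs)                            # Eq. (14)
    rmse = np.sqrt(np.mean(err**2))                             # Eq. (15)
    qr = 100.0 * np.mean(np.abs(err) / obs <= qr_threshold)     # Eq. (16)
    return {"MAE": mae, "DC": dc, "R2": r2,
            "MRE": mre, "RMSE": rmse, "QR": qr}
```

For a perfect forecast the function returns MAE = RMSE = MRE = 0, DC = <italic>R</italic><sup>2</sup> = 1, and QR = 100%.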
</sec>
</sec>
<sec sec-type="results" id="s4">
<title>Results</title>
<sec>
<title>Correlation Analysis of Input Parameters</title>
<p>The mean streamflow 7 days (<italic>Q</italic><sub><sub><italic>t</italic></sub>&#x02212;7</sub>), 10 days (<italic>Q</italic><sub><sub><italic>t</italic></sub>&#x02212;10</sub>), and 15 days (<italic>Q</italic><sub><sub><italic>t</italic></sub>&#x02212;15</sub>) earlier, the mean water level 7 days (<italic>H</italic><sub><sub><italic>t</italic></sub>&#x02212;7</sub>), 10 days (<italic>H</italic><sub><sub><italic>t</italic></sub>&#x02212;10</sub>), and 15 days (<italic>H</italic><sub><sub><italic>t</italic></sub>&#x02212;15</sub>) earlier, and <italic>P, PRS, T, WVP, RH</italic>, and <italic>U</italic><sub>max</sub> of Wuzhou station were used as input parameters of the models to predict the mean streamflow and mean water level. The Pearson correlation coefficients between the input parameters and the mean streamflow and mean water level of Wuzhou station were calculated, and the results are displayed in <xref ref-type="table" rid="T1">Table 1</xref>. The results show that the mean streamflow, water level, and meteorological factors of Wuzhou station 7, 10, and 15 days earlier are significantly correlated with the mean streamflow and water level of the current day, so runoff forecasting can reasonably be carried out by feeding these eight factors (the antecedent streamflow and water level for a given lead time together with the six meteorological factors) into the machine-learning models.</p>
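The Pearson correlation coefficient used for this input screening can be computed directly from its definition; a minimal sketch (illustrative, not the authors' code):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation between a candidate predictor x and the target y."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc = x - x.mean()
    yc = y - y.mean()
    # covariance normalized by the product of standard deviations
    return float(np.sum(xc * yc) / np.sqrt(np.sum(xc**2) * np.sum(yc**2)))
```

Each entry of Table 1 is such a coefficient between one lagged or meteorological series and the same-day streamflow or water level at Wuzhou station.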
<table-wrap position="float" id="T1">
<label>Table 1</label>
<caption><p>Correlation of input parameters with mean streamflow and mean water level of Wuzhou station.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th valign="top" align="left"><bold>Parameter</bold></th>
<th valign="top" align="center"><bold><italic>Q</italic><sub><sub><italic>t</italic></sub>&#x02013;7</sub> (m<sup>3</sup><bold>&#x000B7;</bold>s<sup><bold>&#x02212;1</bold></sup>)</bold></th>
<th valign="top" align="center"><bold><italic>H</italic><sub><sub><italic>t</italic></sub>&#x02013;7</sub> (m)</bold></th>
<th valign="top" align="center"><bold><italic>Q</italic><sub><sub><italic>t</italic></sub>&#x02013;10</sub> (m<sup>3</sup><bold>&#x000B7;</bold>s<sup><bold>&#x02212;1</bold></sup>)</bold></th>
<th valign="top" align="center"><bold><italic>H</italic><sub><sub><italic>t</italic></sub>&#x02013;10</sub> (m)</bold></th>
<th valign="top" align="center"><bold><italic>Q</italic><sub><sub><italic>t</italic></sub>&#x02013;15</sub> (m<sup>3</sup>&#x000B7;s<sup><bold>&#x02212;1</bold></sup>)</bold></th>
<th valign="top" align="center"><bold><italic>H</italic><sub><sub><italic>t</italic></sub>&#x02013;15</sub> (m)</bold></th>
<th valign="top" align="center"><bold><italic>P</italic> (mm)</bold></th>
<th valign="top" align="center"><bold><italic>PRS</italic> (hPa)</bold></th>
<th valign="top" align="center"><bold><italic>T</italic> (<bold>&#x000B0;</bold>C)</bold></th>
<th valign="top" align="center"><bold><italic>WVP</italic> (hPa)</bold></th>
<th valign="top" align="center"><bold><italic>RH</italic> (%)</bold></th>
<th valign="top" align="center"><bold><italic>U</italic><sub>max</sub> (m&#x000B7;s <sup><bold>&#x02212;1</bold></sup>)</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Streamflow</td>
<td valign="top" align="center">0.670&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.698&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.614&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.648&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.565&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.605&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.244&#x0002A;&#x0002A;</td>
<td valign="top" align="center">&#x02212;0.541&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.473&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.571&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.316&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.237&#x0002A;&#x0002A;</td>
</tr>
<tr>
<td valign="top" align="left">Water level</td>
<td valign="top" align="center">0.704&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.755&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.653&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.705&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.605&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.659&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.247&#x0002A;&#x0002A;</td>
<td valign="top" align="center">&#x02212;0.603&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.532&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.637&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.353&#x0002A;&#x0002A;</td>
<td valign="top" align="center">0.253&#x0002A;&#x0002A;</td>
</tr>
</tbody>
</table>
<table-wrap-foot>
<p><italic>Pearson correlation coefficient between 0.8 and 1.0, very strong correlation; 0.6&#x02013;0.8, strong correlation; 0.4&#x02013;0.6, moderate correlation; 0.2&#x02013;0.4, weak correlation; 0.0&#x02013;0.2, very weak correlation or no correlation. &#x0201C;&#x0002A;&#x0002A;&#x0201D; denotes a significant correlation at the 0.01 level and &#x0201C;&#x0002A;&#x0201D; denotes a significant correlation at the 0.05 level</italic>.</p>
</table-wrap-foot>
</table-wrap>
</sec>
<sec>
<title>Determination of the Lead Time</title>
<p>To select a mid-long-term runoff forecasting lead time with satisfactory forecast accuracy, the performance of the machine-learning models (i.e., the BP, GRNN, ELM, and WNN models) in forecasting daily mean streamflow and water level at the 7-, 10-, and 15-day lead times was compared. Performance indices in <xref ref-type="table" rid="T2">Tables 2</xref>, <xref ref-type="table" rid="T3">3</xref> show that the forecast results of mean streamflow and water level at the 7-day lead time are better than those at the 10- and 15-day lead times. Taking the forecast results of the BP neural network as an example, the <italic>MAE</italic> values of mean streamflow for the 7-, 10-, and 15-day lead times are 1,772.7856, 1,934.0324, and 2,098.2541 m<sup>3</sup>&#x000B7;s<sup>&#x02212;1</sup>, respectively; <italic>DC</italic> values are 0.2081, 0.0951, and &#x02212;0.2322, respectively; <italic>R</italic><sup>2</sup> values are 0.5224, 0.4541, and 0.4333, respectively; <italic>MRE</italic> values are 0.2630, 0.2995, and 0.3715, respectively; <italic>RMSE</italic> values are 3,036.2640, 3,268.3675, and 3,304.2589 m<sup>3</sup>&#x000B7;s<sup>&#x02212;1</sup>, respectively; and <italic>QR</italic> values are 64.88, 68.86, and 53.50%, respectively. The <italic>MAE</italic> values of mean water level for the 7-, 10-, and 15-day lead times are 1.3460, 1.4400, and 1.5976 m, respectively; <italic>DC</italic> values are 0.5205, 0.3759, and 0.2415, respectively; <italic>R</italic><sup>2</sup> values are 0.6365, 0.5816, and 0.5125, respectively; <italic>MRE</italic> values are 0.2100, 0.2313, and 0.2712, respectively; <italic>RMSE</italic> values are 1.8897, 1.9926, and 2.1758 m, respectively; and <italic>QR</italic> values are 67.22, 63.24, and 60.77%, respectively. This indicates that the longer the lead time, the more the model is affected by the uncertainty of the input parameters. Therefore, this article discusses the runoff forecasting results for the 7-day lead time.</p>
<table-wrap position="float" id="T2">
<label>Table 2</label>
<caption><p>Performance indices of Wuzhou station mean streamflow forecast in the 7-, 10-, and 15-day lead time.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th/>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><bold>MAE (m</bold><sup><bold>3</bold></sup><bold>&#x000B7;</bold><bold>s</bold><sup><bold>&#x02212;1</bold></sup><bold>)</bold></th>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><bold>DC</bold></th>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><bold>R</bold><sup><bold>2</bold></sup></th>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><bold>MRE</bold></th>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><bold>RMSE (m</bold><sup><bold>3</bold></sup><bold>&#x000B7;</bold><bold>s</bold><sup><bold>&#x02212;1</bold></sup><bold>)</bold></th>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><bold>QR (%)</bold></th>
</tr>
<tr>
<th/>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">BP</td>
<td valign="top" align="center">1,772.7856</td>
<td valign="top" align="center">1,934.0324</td>
<td valign="top" align="center">2,098.2541</td>
<td valign="top" align="center">0.2081</td>
<td valign="top" align="center">0.0951</td>
<td valign="top" align="center">&#x02212;0.2322</td>
<td valign="top" align="center">0.5224</td>
<td valign="top" align="center">0.4541</td>
<td valign="top" align="center">0.4333</td>
<td valign="top" align="center">0.2630</td>
<td valign="top" align="center">0.2995</td>
<td valign="top" align="center">0.3715</td>
<td valign="top" align="center">3,036.2640</td>
<td valign="top" align="center">3,268.3675</td>
<td valign="top" align="center">3,304.2589</td>
<td valign="top" align="center">64.88</td>
<td valign="top" align="center">68.86</td>
<td valign="top" align="center">53.50</td>
</tr>
<tr>
<td valign="top" align="left">GRNN</td>
<td valign="top" align="center">1,783.1870</td>
<td valign="top" align="center">1,888.3839</td>
<td valign="top" align="center">2,067.9450</td>
<td valign="top" align="center">0.5082</td>
<td valign="top" align="center">0.4338</td>
<td valign="top" align="center">0.3811</td>
<td valign="top" align="center">0.5138</td>
<td valign="top" align="center">0.4497</td>
<td valign="top" align="center">0.3848</td>
<td valign="top" align="center">0.2657</td>
<td valign="top" align="center">0.2767</td>
<td valign="top" align="center">0.3451</td>
<td valign="top" align="center">3,066.2337</td>
<td valign="top" align="center">3,290.1598</td>
<td valign="top" align="center">3,439.9025</td>
<td valign="top" align="center">71.06</td>
<td valign="top" align="center">75.31</td>
<td valign="top" align="center">61.45</td>
</tr>
<tr>
<td valign="top" align="left">ELM</td>
<td valign="top" align="center">1,940.0608</td>
<td valign="top" align="center">2,102.5341</td>
<td valign="top" align="center">2,165.5322</td>
<td valign="top" align="center">0.4763</td>
<td valign="top" align="center">0.4122</td>
<td valign="top" align="center">0.3705</td>
<td valign="top" align="center">0.5008</td>
<td valign="top" align="center">0.4420</td>
<td valign="top" align="center">0.4051</td>
<td valign="top" align="center">0.3156</td>
<td valign="top" align="center">0.3696</td>
<td valign="top" align="center">0.3741</td>
<td valign="top" align="center">3,164.0892</td>
<td valign="top" align="center">3,352.1745</td>
<td valign="top" align="center">3,469.1480</td>
<td valign="top" align="center">56.24</td>
<td valign="top" align="center">51.03</td>
<td valign="top" align="center">55.28</td>
</tr>
<tr>
<td valign="top" align="left">WNN</td>
<td valign="top" align="center">1,778.2273</td>
<td valign="top" align="center">2,129.8711</td>
<td valign="top" align="center">2,097.3720</td>
<td valign="top" align="center">0.4940</td>
<td valign="top" align="center">0.3928</td>
<td valign="top" align="center">0.3556</td>
<td valign="top" align="center">0.5003</td>
<td valign="top" align="center">0.3932</td>
<td valign="top" align="center">0.3688</td>
<td valign="top" align="center">0.2951</td>
<td valign="top" align="center">0.3705</td>
<td valign="top" align="center">0.3576</td>
<td valign="top" align="center">3,110.2468</td>
<td valign="top" align="center">3,407.3199</td>
<td valign="top" align="center">3,510.0276</td>
<td valign="top" align="center">59.81</td>
<td valign="top" align="center">50.62</td>
<td valign="top" align="center">56.10</td>
</tr>
</tbody>
</table>
</table-wrap>
<table-wrap position="float" id="T3">
<label>Table 3</label>
<caption><p>Performance indices of Wuzhou station mean water level forecast in the 7-, 10-, and 15-day lead time.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th/>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><italic><bold>MAE</bold></italic> <bold>(m)</bold></th>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><italic><bold>DC</bold></italic></th>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><bold><italic><bold>R</bold></italic><sup><bold>2</bold></sup></bold></th>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><italic><bold>MRE</bold></italic></th>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><italic><bold>RMSE</bold></italic> <bold>(m)</bold></th>
<th valign="top" align="center" colspan="3" style="border-bottom: thin solid #000000;"><italic><bold>QR</bold></italic> <bold>(%)</bold></th>
</tr>
<tr>
<th/>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
<th valign="top" align="center"><bold>7-day</bold></th>
<th valign="top" align="center"><bold>10-day</bold></th>
<th valign="top" align="center"><bold>15-day</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">BP</td>
<td valign="top" align="center">1.3460</td>
<td valign="top" align="center">1.4400</td>
<td valign="top" align="center">1.5976</td>
<td valign="top" align="center">0.5205</td>
<td valign="top" align="center">0.3759</td>
<td valign="top" align="center">0.2415</td>
<td valign="top" align="center">0.6365</td>
<td valign="top" align="center">0.5816</td>
<td valign="top" align="center">0.5125</td>
<td valign="top" align="center">0.2100</td>
<td valign="top" align="center">0.2313</td>
<td valign="top" align="center">0.2712</td>
<td valign="top" align="center">1.8897</td>
<td valign="top" align="center">1.9926</td>
<td valign="top" align="center">2.1758</td>
<td valign="top" align="center">67.22</td>
<td valign="top" align="center">63.24</td>
<td valign="top" align="center">60.77</td>
</tr>
<tr>
<td valign="top" align="left">GRNN</td>
<td valign="top" align="center">1.3499</td>
<td valign="top" align="center">1.4714</td>
<td valign="top" align="center">1.5761</td>
<td valign="top" align="center">0.6077</td>
<td valign="top" align="center">0.5458</td>
<td valign="top" align="center">0.4713</td>
<td valign="top" align="center">0.6167</td>
<td valign="top" align="center">0.5588</td>
<td valign="top" align="center">0.4781</td>
<td valign="top" align="center">0.2178</td>
<td valign="top" align="center">0.2409</td>
<td valign="top" align="center">0.2614</td>
<td valign="top" align="center">1.9068</td>
<td valign="top" align="center">2.0518</td>
<td valign="top" align="center">2.2138</td>
<td valign="top" align="center">66.12</td>
<td valign="top" align="center">63.65</td>
<td valign="top" align="center">65.98</td>
</tr>
<tr>
<td valign="top" align="left">ELM</td>
<td valign="top" align="center">1.3590</td>
<td valign="top" align="center">1.4842</td>
<td valign="top" align="center">1.5871</td>
<td valign="top" align="center">0.5948</td>
<td valign="top" align="center">0.5637</td>
<td valign="top" align="center">0.4687</td>
<td valign="top" align="center">0.6162</td>
<td valign="top" align="center">0.5823</td>
<td valign="top" align="center">0.5001</td>
<td valign="top" align="center">0.2142</td>
<td valign="top" align="center">0.2505</td>
<td valign="top" align="center">0.2729</td>
<td valign="top" align="center">1.9379</td>
<td valign="top" align="center">2.0111</td>
<td valign="top" align="center">2.2192</td>
<td valign="top" align="center">65.71</td>
<td valign="top" align="center">60.08</td>
<td valign="top" align="center">60.22</td>
</tr>
<tr>
<td valign="top" align="left">WNN</td>
<td valign="top" align="center">1.2725</td>
<td valign="top" align="center">1.5224</td>
<td valign="top" align="center">1.5850</td>
<td valign="top" align="center">0.6401</td>
<td valign="top" align="center">0.5255</td>
<td valign="top" align="center">0.4644</td>
<td valign="top" align="center">0.6412</td>
<td valign="top" align="center">0.5483</td>
<td valign="top" align="center">0.4662</td>
<td valign="top" align="center">0.2017</td>
<td valign="top" align="center">0.2632</td>
<td valign="top" align="center">0.2589</td>
<td valign="top" align="center">1.8264</td>
<td valign="top" align="center">2.0972</td>
<td valign="top" align="center">2.2282</td>
<td valign="top" align="center">69.68</td>
<td valign="top" align="center">63.79</td>
<td valign="top" align="center">64.61</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec>
<title>Streamflow Forecast Results of Wuzhou Station by Machine-Learning Model</title>
<p>The forecasting accuracy indices of the four daily mean streamflow forecasting methods for Wuzhou station (i.e., the BP, GRNN, ELM, and WNN models) in the 7-day lead time are presented in <xref ref-type="table" rid="T2">Table 2</xref>. All four models show a certain forecasting ability for the mean streamflow of Wuzhou station, with <italic>MAE</italic> ranging between 1,772 and 1,941 m<sup>3</sup>&#x000B7;s<sup>&#x02212;1</sup>, <italic>DC</italic> between 0.20 and 0.50, <italic>R</italic><sup>2</sup> between 0.50 and 0.53, <italic>MRE</italic> between 0.26 and 0.32, <italic>RMSE</italic> between 3,036 and 3,165 m<sup>3</sup>&#x000B7;s<sup>&#x02212;1</sup>, and <italic>QR</italic> between 56 and 72%. The <italic>R</italic><sup>2</sup> values of all models exceed 0.50 and differ only slightly, but GRNN attains the highest <italic>DC</italic> and <italic>QR</italic> values (<italic>DC</italic> = 0.5082, <italic>QR</italic> &#x0003E; 70%) together with smaller <italic>MAE, MRE</italic>, and <italic>RMSE</italic> errors, suggesting the best forecasting performance among the four.</p>
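As a minimal sketch of how the six evaluation indices above (MAE, DC, R², MRE, RMSE, QR) may be computed, assuming DC denotes the Nash-Sutcliffe deterministic coefficient and that QR counts forecasts whose relative error falls within a permissible threshold (the 20% default here is an assumption, not a value stated in this section):

```python
import numpy as np

def forecast_indices(obs, sim, permissible=0.20):
    """Evaluation indices used in the comparison. DC is taken as the
    Nash-Sutcliffe coefficient; the 20% permissible relative error
    defining QR is an assumed threshold."""
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    err = sim - obs
    rel = np.abs(err) / np.abs(obs)          # relative error per forecast
    mae = float(np.mean(np.abs(err)))
    rmse = float(np.sqrt(np.mean(err ** 2)))
    mre = float(np.mean(rel))
    # Nash-Sutcliffe: 1 minus error variance over observed variance.
    dc = float(1.0 - np.sum(err ** 2) / np.sum((obs - obs.mean()) ** 2))
    r2 = float(np.corrcoef(obs, sim)[0, 1] ** 2)
    qr = float(100.0 * np.mean(rel <= permissible))  # qualification rate, %
    return {"MAE": mae, "DC": dc, "R2": r2, "MRE": mre, "RMSE": rmse, "QR": qr}
```

A perfect forecast gives DC = 1 and QR = 100%, while DC can become negative when the model is worse than predicting the observed mean.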
<p>To examine the forecast performance of each model in more detail, scatter plots of the linear regression between forecasted and observed streamflow (<xref ref-type="fig" rid="F2">Figure 2</xref>) and hydrographs (<xref ref-type="fig" rid="F3">Figure 3</xref>) are displayed. As <xref ref-type="fig" rid="F2">Figures 2</xref>, <xref ref-type="fig" rid="F3">3</xref> show, the four models perform better for low flows, whereas medium flows tend to be overestimated and high flows underestimated. Evidently, these models struggle to predict extreme peak flows validly, possibly because extreme flood events occurred with low probability during the study period, so the models could not learn such events well.</p>
<fig id="F2" position="float">
<label>Figure 2</label>
<caption><p>Scatter plots of observed and simulated mean streamflow.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fdata-04-752406-g0002.tif"/>
</fig>
<fig id="F3" position="float">
<label>Figure 3</label>
<caption><p>Hydrographs of observed and simulated mean streamflow.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fdata-04-752406-g0003.tif"/>
</fig>
</sec>
<sec>
<title>Water Level Forecast Results of Wuzhou Station by Machine-Learning Model</title>
<p>The BP, GRNN, ELM, and WNN models were also used to forecast the daily mean water level of Wuzhou station. The forecasting accuracy indices in the 7-day lead time are presented in <xref ref-type="table" rid="T3">Table 3</xref>. In general, the four models forecast the mean water level more accurately than the streamflow, with <italic>MAE</italic> ranging between 1.27 and 1.36 m, <italic>DC</italic> between 0.52 and 0.65, <italic>R</italic><sup>2</sup> between 0.61 and 0.65, <italic>MRE</italic> between 0.20 and 0.22, <italic>RMSE</italic> between 1.82 and 1.94 m, and <italic>QR</italic> between 65 and 70%. The WNN model shows the smallest errors and the highest <italic>DC</italic>, <italic>R</italic><sup>2</sup>, and <italic>QR</italic> values, which are 0.6401, 0.6412, and 69.68%, respectively.</p>
<p><xref ref-type="fig" rid="F4">Figures 4</xref>, <xref ref-type="fig" rid="F5">5</xref> illustrate the scatter plots of the linear regression between forecasted and observed water levels, together with the hydrographs. Similar to the streamflow forecast results, the four models perform better for medium and low water levels but significantly underestimate high water levels.</p>
<fig id="F4" position="float">
<label>Figure 4</label>
<caption><p>Scatter plots of observed and simulated mean water level.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fdata-04-752406-g0004.tif"/>
</fig>
<fig id="F5" position="float">
<label>Figure 5</label>
<caption><p>Hydrographs of observed and simulated mean water level.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fdata-04-752406-g0005.tif"/>
</fig>
</sec>
<sec>
<title>Forecast Results of Considering the Flood Propagation Time</title>
<p>According to the analysis in Sections Determination of Lead Time, Streamflow Forecast Results of Wuzhou Station by Machine-Learning Model, and Water Level Forecast Results of Wuzhou Station by Machine-Learning Model, deviations remain when the streamflow and water level of Wuzhou station are forecast from meteorological and corresponding hydrological data in the 7-day lead time; in particular, the flood peak flow and water level are underestimated in extreme flood events, and the forecast flood peak lags behind the observed one. To improve the forecasting accuracy, the streamflow and water level of Wuzhou station on the current day were predicted by exploiting the flood propagation relationship between the upstream and downstream stations.</p>
<p>Wuxuan, Dahuangjiangkou, and Wuzhou stations are important hydrological control stations on the mainstream of the Xijiang River. Based on the observed flood data of each station over a series of years, the distance between Wuxuan station and Dahuangjiangkou station is about 104 km, with a flood propagation time of about 12 h, and the distance between Dahuangjiangkou station and Wuzhou station is about 143 km, with a flood propagation time of about 30 h. Therefore, the streamflow and water level data recorded 2 days earlier at Wuxuan station and 1 day earlier at Dahuangjiangkou station were selected as input parameters to forecast the data of Wuzhou station.</p>
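The lagged-input construction described above can be sketched as follows; the function and variable names are illustrative, and aligning the daily series by date is assumed to have been done beforehand:

```python
import numpy as np

def build_lagged_inputs(wuxuan, dahuangjiangkou, wuzhou, lag_wx=2, lag_dhjk=1):
    """Pair each Wuzhou value on day t with the Wuxuan record from
    t-2 days and the Dahuangjiangkou record from t-1 days, matching
    the reach propagation times rounded up to whole days."""
    start = max(lag_wx, lag_dhjk)
    X = [[wuxuan[t - lag_wx], dahuangjiangkou[t - lag_dhjk]]
         for t in range(start, len(wuzhou))]
    y = [wuzhou[t] for t in range(start, len(wuzhou))]
    return np.asarray(X, dtype=float), np.asarray(y, dtype=float)
```

The first `max(lag_wx, lag_dhjk)` days are dropped because no complete upstream predictor pair exists for them.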
<p>The preceding analysis indicates that GRNN predicts more precisely than the other models, so GRNN was used for further research. The forecast results are shown in <xref ref-type="table" rid="T4">Table 4</xref>. GRNN gives a satisfying forecast of both streamflow and water level, with <italic>DC</italic> of 0.8884 and 0.9099, <italic>R</italic><sup>2</sup> of 0.9228 and 0.9169, and <italic>QR</italic> of 98.36 and 82.74%, respectively. Observed and forecasted streamflow and water level values from the GRNN models are shown in <xref ref-type="fig" rid="F6">Figures 6</xref>, <xref ref-type="fig" rid="F7">7</xref>. There is a significant linear correlation between the forecasted and observed results, and the accuracy for high flows and high water levels is improved using hydrological data alone as input. By exploiting the flood propagation time between upstream and downstream stations, this method is both accurate and convenient to apply; its shortcoming is that the lead time is too short to satisfy mid- to long-term forecasting at present.</p>
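A GRNN reduces to Gaussian-kernel (Nadaraya-Watson) regression over the stored training patterns, with a single smoothing parameter. A minimal sketch follows; it is not the authors' exact implementation, and the spread value and any input normalization are assumptions:

```python
import numpy as np

class GRNN:
    """Minimal generalized regression neural network: a Gaussian
    pattern layer over the training samples followed by a
    summation/output layer. sigma is the single smoothing parameter,
    normally tuned by cross-validation."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma

    def fit(self, X, y):
        # A GRNN has no iterative training; it memorizes the patterns.
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y, dtype=float)
        return self

    def predict(self, Xq):
        Xq = np.asarray(Xq, dtype=float)
        # Squared Euclidean distance of each query to every training pattern.
        d2 = ((Xq[:, None, :] - self.X[None, :, :]) ** 2).sum(axis=-1)
        w = np.exp(-d2 / (2.0 * self.sigma ** 2))  # pattern-layer activations
        return (w @ self.y) / w.sum(axis=1)        # weighted average of targets
```

Fed with the two lagged upstream series as inputs, such a model mirrors the Wuxuan-Dahuangjiangkou-Wuzhou setup in spirit; a small sigma makes predictions interpolate the nearest stored pattern, a large one smooths toward the overall mean.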
<table-wrap position="float" id="T4">
<label>Table 4</label>
<caption><p>Performance indices of Wuxuan&#x02013;Dahuangjiangkou&#x02013;Wuzhou stations mean streamflow and water level forecast by GRNN.</p></caption>
<table frame="hsides" rules="groups">
<thead><tr>
<th/>
<th valign="top" align="center"><bold>MAE</bold></th>
<th valign="top" align="center"><bold>DC</bold></th>
<th valign="top" align="center"><bold>R<sup>2</sup></bold></th>
<th valign="top" align="center"><bold>MRE</bold></th>
<th valign="top" align="center"><bold>RMSE</bold></th>
<th valign="top" align="center"><bold>QR (%)</bold></th>
</tr>
</thead>
<tbody>
<tr>
<td valign="top" align="left">Mean streamflow</td>
<td valign="top" align="center">895.9491 (m<sup>3</sup>&#x000B7;s<sup>&#x02212;1</sup>)</td>
<td valign="top" align="center">0.8884</td>
<td valign="top" align="center">0.9228</td>
<td valign="top" align="center">0.1302</td>
<td valign="top" align="center">1,459.9038 (m<sup>3</sup>&#x000B7;s<sup>&#x02212;1</sup>)</td>
<td valign="top" align="center">98.36</td>
</tr>
<tr>
<td valign="top" align="left">Mean water level</td>
<td valign="top" align="center">0.7117 (m)</td>
<td valign="top" align="center">0.9099</td>
<td valign="top" align="center">0.9169</td>
<td valign="top" align="center">0.1346</td>
<td valign="top" align="center">0.9134 (m)</td>
<td valign="top" align="center">82.74</td>
</tr>
</tbody>
</table>
</table-wrap>
<fig id="F6" position="float">
<label>Figure 6</label>
<caption><p>Scatter plots of Wuxuan&#x02013;Dahuangjiangkou&#x02013;Wuzhou stations mean streamflow and water level observed and simulated by GRNN.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fdata-04-752406-g0006.tif"/>
</fig>
<fig id="F7" position="float">
<label>Figure 7</label>
<caption><p>Hydrograph of Wuxuan&#x02013;Dahuangjiangkou&#x02013;Wuzhou stations mean streamflow <bold>(A)</bold> and water level <bold>(B)</bold> observed and simulated by GRNN. GRNN, generalized regression neural network.</p></caption>
<graphic mimetype="image" mime-subtype="tiff" xlink:href="fdata-04-752406-g0007.tif"/>
</fig>
</sec>
</sec>
<sec sec-type="conclusions" id="s5">
<title>Conclusion</title>
<p>In this study, four different machine-learning methods, BP, GRNN, ELM, and WNN, were utilized to forecast the mean streamflow and water level. Taking Wuzhou station of the Xijiang River as a case study, the performances of the different models were compared. Furthermore, accounting for the flood propagation time, the streamflow and water level data of the upstream Wuxuan and Dahuangjiangkou stations were used as input parameters for runoff forecasting. The major findings are as follows:</p>
<list list-type="order">
<list-item><p>The GRNN model performs best on the streamflow forecasting of Wuzhou station in the 7-day lead time, with <italic>DC</italic> = 0.5082, <italic>R</italic><sup>2</sup> = 0.5138, and <italic>QR</italic> = 71.06%.</p></list-item>
<list-item><p>The WNN model shows the best prediction performance on the water level of Wuzhou station in the 7-day lead time, with <italic>DC</italic> = 0.6401, <italic>R</italic><sup>2</sup> = 0.6412, and <italic>QR</italic> = 69.68%. The overall prediction results meet the accuracy requirement (&#x0003E;60.0%), but it remains difficult to predict extreme events accurately.</p></list-item>
<list-item><p>Considering the relationship between upstream and downstream flood propagation, the accuracy of the machine-learning method improves significantly. The GRNN model achieved streamflow forecasts with <italic>MAE</italic> of 895.9491 m<sup>3</sup>&#x000B7;s<sup>&#x02212;1</sup>, <italic>DC</italic> of 0.8884, <italic>R</italic><sup>2</sup> of 0.9228, <italic>MRE</italic> of 0.1302, <italic>RMSE</italic> of 1,459.9038 m<sup>3</sup>&#x000B7;s<sup>&#x02212;1</sup>, and <italic>QR</italic> of 98.36%, and water level forecasts with <italic>MAE</italic> of 0.7117 m, <italic>DC</italic> of 0.9099, <italic>R</italic><sup>2</sup> of 0.9169, <italic>MRE</italic> of 0.1346, <italic>RMSE</italic> of 0.9134 m, and <italic>QR</italic> of 82.74%. This approach effectively solves the problem of underestimating high flows and high water levels.</p></list-item>
</list>
<p>Several aspects of this study can still be improved. Optimizing the structure of the machine-learning models to improve forecasting efficiency and accuracy, and extending the lead time of runoff forecasting that exploits the upstream&#x02013;downstream flood propagation relationship, remain subjects for further research.</p>
</sec>
<sec sec-type="data-availability" id="s6">
<title>Data Availability Statement</title>
<p>The original contributions presented in the study are included in the article/supplementary material, further inquiries can be directed to the corresponding author/s.</p>
</sec>
<sec id="s7">
<title>Author Contributions</title>
<p>MZ: supervision and project administration. LX: model development. DZ: data collection and processing. All authors were involved in the production and writing of the manuscript. All authors contributed to the article and approved the submitted version.</p>
</sec>
<sec sec-type="funding-information" id="s8">
<title>Funding</title>
<p>The research was funded by the National Key Research and Development Program of China (Grant No. 2021YFC3001000), and the Innovation Group Project of Southern Marine Science and Engineering Guangdong Laboratory (Zhuhai) (Grant No. 311021018).</p>
</sec>
<sec sec-type="COI-statement" id="conf1">
<title>Conflict of Interest</title>
<p>The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec sec-type="disclaimer" id="s9">
<title>Publisher&#x00027;s Note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec> </body>
<back>

<ref-list>
<title>References</title>
<ref id="B1">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Abghari</surname> <given-names>H.</given-names></name> <name><surname>Ahmadi</surname> <given-names>H.</given-names></name> <name><surname>Besharat</surname> <given-names>S.</given-names></name> <name><surname>Rezaverdinejad</surname> <given-names>V.</given-names></name></person-group> (<year>2012</year>). <article-title>Prediction of Daily Pan Evaporation using Wavelet Neural Networks</article-title>. <source>Water Resour. Manag.</source> <volume>26</volume>, <fpage>3639</fpage>&#x02013;<lpage>3652</lpage>. <pub-id pub-id-type="doi">10.1007/s11269-012-0096-z</pub-id></citation>
</ref>
<ref id="B2">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Badrzadeh</surname> <given-names>H.</given-names></name> <name><surname>Sarukkalige</surname> <given-names>R.</given-names></name> <name><surname>Jayawardena</surname> <given-names>A. W.</given-names></name></person-group> (<year>2015</year>). <article-title>Hourly runoff forecasting for flood risk management: application of various computational intelligence models</article-title>. <source>J. Hydrol.</source> <volume>529</volume>, <fpage>1633</fpage>&#x02013;<lpage>1643</lpage>. <pub-id pub-id-type="doi">10.1016/j.jhydrol.2015.07.057</pub-id></citation>
</ref>
<ref id="B3">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Bisoyi</surname> <given-names>N.</given-names></name> <name><surname>Gupta</surname> <given-names>H.</given-names></name> <name><surname>Padhy</surname> <given-names>N. P.</given-names></name> <name><surname>Chakrapani</surname> <given-names>G. J.</given-names></name></person-group> (<year>2019</year>). <article-title>Prediction of daily sediment discharge using a back propagation neural network training algorithm: a case study of the Narmada River, India</article-title>. <source>Int. J. Sediment Res.</source> <volume>34</volume>, <fpage>125</fpage>&#x02013;<lpage>135</lpage>. <pub-id pub-id-type="doi">10.1016/j.ijsrc.2018.10.010</pub-id></citation>
</ref>
<ref id="B4">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Cigizoglu</surname> <given-names>H. K.</given-names></name> <name><surname>Alp</surname> <given-names>M.</given-names></name></person-group> (<year>2006</year>). <article-title>Generalized regression neural network in modelling river sediment yield</article-title>. <source>Adv. Eng. Softw.</source> <volume>37</volume>, <fpage>63</fpage>&#x02013;<lpage>68</lpage>. <pub-id pub-id-type="doi">10.1016/j.advengsoft.2005.05.002</pub-id></citation>
</ref>
<ref id="B5">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Elsheikh</surname> <given-names>A. H.</given-names></name> <name><surname>Sharshir</surname> <given-names>S. W.</given-names></name> <name><surname>Abd Elaziz</surname> <given-names>M.</given-names></name> <name><surname>Kabeel</surname> <given-names>A. E.</given-names></name> <name><surname>Wang</surname> <given-names>G.</given-names></name> <etal/></person-group>. (<year>2019</year>). <article-title>Modeling of solar energy systems using artificial neural network: a comprehensive review</article-title>. <source>Sol. Energy</source> <volume>180</volume>, <fpage>622</fpage>&#x02013;<lpage>639</lpage>. <pub-id pub-id-type="doi">10.1016/j.solener.2019.01.037</pub-id></citation>
</ref>
<ref id="B6">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Guo</surname> <given-names>J.</given-names></name> <name><surname>Zhou</surname> <given-names>J.</given-names></name> <name><surname>Qin</surname> <given-names>H.</given-names></name> <name><surname>Zou</surname> <given-names>Q.</given-names></name> <name><surname>Li</surname> <given-names>Q.</given-names></name></person-group> (<year>2011</year>). <article-title>Monthly streamflow forecasting based on improved support vector machine model</article-title>. <source>Expert Syst. Appl.</source> <volume>38</volume>, <fpage>13073</fpage>&#x02013;<lpage>13081</lpage>. <pub-id pub-id-type="doi">10.1016/j.eswa.2011.04.114</pub-id></citation>
</ref>
<ref id="B7">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Hameed</surname> <given-names>M.</given-names></name> <name><surname>Sharqi</surname> <given-names>S. S.</given-names></name> <name><surname>Yaseen</surname> <given-names>Z. M.</given-names></name> <name><surname>Afan</surname> <given-names>H. A.</given-names></name> <name><surname>Hussain</surname> <given-names>A.</given-names></name> <etal/></person-group>. (<year>2017</year>). <article-title>Application of artificial intelligence (AI) techniques in water quality index prediction: a case study in tropical region, Malaysia</article-title>. <source>Neural Comput. Appl.</source> <volume>28</volume>, <fpage>S893</fpage>&#x02013;<lpage>S905</lpage>. <pub-id pub-id-type="doi">10.1007/s00521-016-2404-7</pub-id></citation>
</ref>
<ref id="B8">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Huang</surname> <given-names>G.</given-names></name> <name><surname>Zhu</surname> <given-names>Q.</given-names></name> <name><surname>Siew</surname> <given-names>C.</given-names></name></person-group> (<year>2006</year>). <article-title>Extreme learning machine: theory and applications</article-title>. <source>Neurocomputing</source> <volume>70</volume>, <fpage>489</fpage>&#x02013;<lpage>501</lpage>. <pub-id pub-id-type="doi">10.1016/j.neucom.2005.12.126</pub-id></citation>
</ref>
<ref id="B9">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lee</surname> <given-names>S.</given-names></name> <name><surname>Lee</surname> <given-names>K.</given-names></name> <name><surname>Yoon</surname> <given-names>H.</given-names></name></person-group> (<year>2019</year>). <article-title>Using artificial neural network models for groundwater level forecasting and assessment of the relative impacts of influencing factors</article-title>. <source>Hydrogeol. J.</source> <volume>27</volume>, <fpage>567</fpage>&#x02013;<lpage>579</lpage>. <pub-id pub-id-type="doi">10.1007/s10040-018-1866-3</pub-id></citation>
</ref>
<ref id="B10">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Li</surname> <given-names>H.</given-names></name> <name><surname>Guo</surname> <given-names>S.</given-names></name> <name><surname>Li</surname> <given-names>C.</given-names></name> <name><surname>Sun</surname> <given-names>J.</given-names></name></person-group> (<year>2013</year>). <article-title>A hybrid annual power load forecasting model based on generalized regression neural network with fruit fly optimization algorithm</article-title>. <source>Knowl-Based Syst.</source> <volume>37</volume>, <fpage>378</fpage>&#x02013;<lpage>387</lpage>. <pub-id pub-id-type="doi">10.1016/j.knosys.2012.08.015</pub-id></citation>
</ref>
<ref id="B11">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Lima</surname> <given-names>A. R.</given-names></name> <name><surname>Cannon</surname> <given-names>A. J.</given-names></name> <name><surname>Hsieh</surname> <given-names>W. W.</given-names></name></person-group> (<year>2016</year>). <article-title>Forecasting daily streamflow using online sequential extreme learning machines</article-title>. <source>J. Hydrol.</source> <volume>537</volume>, <fpage>431</fpage>&#x02013;<lpage>443</lpage>. <pub-id pub-id-type="doi">10.1016/j.jhydrol.2016.03.017</pub-id></citation>
</ref>
<ref id="B12">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Modaresi</surname> <given-names>F.</given-names></name> <name><surname>Araghinejad</surname> <given-names>S.</given-names></name> <name><surname>Ebrahimi</surname> <given-names>K.</given-names></name></person-group> (<year>2018</year>). <article-title>A comparative assessment of artificial neural network, generalized regression neural network, least-square support vector regression, and K-Nearest neighbor regression for monthly streamflow forecasting in linear and nonlinear conditions</article-title>. <source>Water Resour. Manag.</source> <volume>32</volume>, <fpage>243</fpage>&#x02013;<lpage>258</lpage>. <pub-id pub-id-type="doi">10.1007/s11269-017-1807-2</pub-id></citation>
</ref>
<ref id="B13">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Mosavi</surname> <given-names>A.</given-names></name> <name><surname>Ozturk</surname> <given-names>P.</given-names></name> <name><surname>Chau</surname> <given-names>K.</given-names></name></person-group> (<year>2018</year>). <article-title>Flood prediction using machine learning models: Literature review</article-title>. <source>Water</source> <volume>10</volume>:<fpage>1536</fpage>. <pub-id pub-id-type="doi">10.3390/w10111536</pub-id></citation>
</ref>
<ref id="B14">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Niu</surname> <given-names>W.</given-names></name> <name><surname>Fen</surname> <given-names>Z.</given-names></name></person-group> (<year>2021</year>). <article-title>Evaluating the performances of several artificial intelligence methods in forecasting daily streamflow time series for sustainable water resources management</article-title>. <source>Sustain. Cities Soc.</source> <volume>64</volume>:<fpage>102562</fpage>. <pub-id pub-id-type="doi">10.1016/j.scs.2020.102562</pub-id></citation>
</ref>
<ref id="B15">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Niu</surname> <given-names>W.</given-names></name> <name><surname>Feng</surname> <given-names>Z.</given-names></name> <name><surname>Cheng</surname> <given-names>C.</given-names></name> <name><surname>Zhou</surname> <given-names>J.</given-names></name></person-group> (<year>2018</year>). <article-title>Forecasting daily runoff by extreme learning machine based on quantum-behaved particle swarm optimization</article-title>. <source>J. Hydrol. Eng.</source> <volume>23</volume>:<fpage>04018002</fpage>. <pub-id pub-id-type="doi">10.1061/(ASCE)HE.1943-5584.0001625</pub-id></citation>
</ref>
<ref id="B16">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Nourani</surname> <given-names>V.</given-names></name></person-group> (<year>2017</year>). <article-title>An emotional ANN (EANN) approach to modeling rainfall-runoff process</article-title>. <source>J. Hydrol.</source> <volume>544</volume>, <fpage>267</fpage>&#x02013;<lpage>277</lpage>. <pub-id pub-id-type="doi">10.1016/j.jhydrol.2016.11.033</pub-id></citation>
</ref>
<ref id="B17">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Parisouj</surname> <given-names>P.</given-names></name> <name><surname>Mohebzadeh</surname> <given-names>H.</given-names></name> <name><surname>Lee</surname> <given-names>T.</given-names></name></person-group> (<year>2020</year>). <article-title>Employing machine learning algorithms for streamflow prediction: a case study of four river basins with different climatic zones in the United States</article-title>. <source>Water Resour. Manag.</source> <volume>34</volume>, <fpage>4113</fpage>&#x02013;<lpage>4131</lpage>. <pub-id pub-id-type="doi">10.1007/s11269-020-02659-5</pub-id></citation>
</ref>
<ref id="B18">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pliego Marugan</surname> <given-names>A.</given-names></name> <name><surname>Garcia Marquez</surname> <given-names>F. P.</given-names></name> <name><surname>Pinar Perez</surname> <given-names>J. M.</given-names></name> <name><surname>Ruiz-Hernandez</surname> <given-names>D.</given-names></name></person-group> (<year>2018</year>). <article-title>A survey of artificial neural network in wind energy systems</article-title>. <source>Appl. Energ.</source> <volume>228</volume>, <fpage>1822</fpage>&#x02013;<lpage>1836</lpage>. <pub-id pub-id-type="doi">10.1016/j.apenergy.2018.07.084</pub-id></citation>
</ref>
<ref id="B19">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Pradhan</surname> <given-names>P.</given-names></name> <name><surname>Tingsanchali</surname> <given-names>T.</given-names></name> <name><surname>Shrestha</surname> <given-names>S.</given-names></name></person-group> (<year>2020</year>). <article-title>Evaluation of soil and water assessment tool and artificial neural network models for hydrologic simulation in different climatic regions of Asia</article-title>. <source>Sci. Total Environ.</source> <volume>701</volume>:<fpage>134308</fpage>. <pub-id pub-id-type="doi">10.1016/j.scitotenv.2019.134308</pub-id><pub-id pub-id-type="pmid">31704397</pub-id></citation></ref>
<ref id="B20">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Shortridge</surname> <given-names>J. E.</given-names></name> <name><surname>Guikema</surname> <given-names>S. D.</given-names></name> <name><surname>Zaitchik</surname> <given-names>B. F.</given-names></name></person-group> (<year>2016</year>). <article-title>Machine learning methods for empirical streamflow simulation: a comparison of model accuracy, interpretability, and uncertainty in seasonal watersheds</article-title>. <source>Hydrol. Earth Syst. Sci.</source> <volume>20</volume>, <fpage>2611</fpage>&#x02013;<lpage>2628</lpage>. <pub-id pub-id-type="doi">10.5194/hess-20-2611-2016</pub-id></citation>
</ref>
<ref id="B21">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Tongal</surname> <given-names>H.</given-names></name> <name><surname>Booij</surname> <given-names>M. J.</given-names></name></person-group> (<year>2018</year>). <article-title>Simulation and forecasting of streamflows using machine learning models coupled with base flow separation</article-title>. <source>J. Hydrol.</source> <volume>564</volume>, <fpage>266</fpage>&#x02013;<lpage>282</lpage>. <pub-id pub-id-type="doi">10.1016/j.jhydrol.2018.07.004</pub-id></citation>
</ref>
<ref id="B22">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wei</surname> <given-names>S.</given-names></name> <name><surname>Yang</surname> <given-names>H.</given-names></name> <name><surname>Song</surname> <given-names>J.</given-names></name> <name><surname>Abbaspour</surname> <given-names>K.</given-names></name> <name><surname>Xu</surname> <given-names>Z.</given-names></name></person-group> (<year>2013</year>). <article-title>A wavelet-neural network hybrid modelling approach for estimating and predicting river monthly flows</article-title>. <source>Hydrol. Sci. J.</source> <volume>58</volume>, <fpage>374</fpage>&#x02013;<lpage>389</lpage>. <pub-id pub-id-type="doi">10.1080/02626667.2012.754102</pub-id></citation>
</ref>
<ref id="B23">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Wu</surname> <given-names>Z.</given-names></name> <name><surname>Lin</surname> <given-names>Q.</given-names></name> <name><surname>Lu</surname> <given-names>G.</given-names></name> <name><surname>He</surname> <given-names>H.</given-names></name> <name><surname>Qu</surname> <given-names>J. J.</given-names></name></person-group> (<year>2015</year>). <article-title>Analysis of hydrological drought frequency for the Xijiang River Basin in South China using observed streamflow data</article-title>. <source>Nat. Hazards</source> <volume>77</volume>, <fpage>1655</fpage>&#x02013;<lpage>1677</lpage>. <pub-id pub-id-type="doi">10.1007/s11069-015-1668-z</pub-id></citation>
</ref>
<ref id="B24">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yaseen</surname> <given-names>Z. M.</given-names></name> <name><surname>Allawi</surname> <given-names>M. F.</given-names></name> <name><surname>Yousif</surname> <given-names>A. A.</given-names></name> <name><surname>Jaafar</surname> <given-names>O.</given-names></name> <name><surname>Hamzah</surname> <given-names>F. M.</given-names></name> <etal/></person-group>. (<year>2018</year>). <article-title>Non-tuned machine learning approach for hydrological time series forecasting</article-title>. <source>Neural Comput. Appl.</source> <volume>30</volume>, <fpage>1479</fpage>&#x02013;<lpage>1491</lpage>. <pub-id pub-id-type="doi">10.1007/s00521-016-2763-0</pub-id></citation>
</ref>
<ref id="B25">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yaseen</surname> <given-names>Z. M.</given-names></name> <name><surname>Jaafar</surname> <given-names>O.</given-names></name> <name><surname>Deo</surname> <given-names>R. C.</given-names></name> <name><surname>Kisi</surname> <given-names>O.</given-names></name> <name><surname>Adamowski</surname> <given-names>J.</given-names></name> <etal/></person-group>. (<year>2016</year>). <article-title>Stream-flow forecasting using extreme learning machines: a case study in a semi-arid region in Iraq</article-title>. <source>J. Hydrol.</source> <volume>542</volume>, <fpage>603</fpage>&#x02013;<lpage>614</lpage>. <pub-id pub-id-type="doi">10.1016/j.jhydrol.2016.09.035</pub-id></citation>
</ref>
<ref id="B26">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yaseen</surname> <given-names>Z. M.</given-names></name> <name><surname>Sulaiman</surname> <given-names>S. O.</given-names></name> <name><surname>Deo</surname> <given-names>R. C.</given-names></name> <name><surname>Chau</surname> <given-names>K.</given-names></name></person-group> (<year>2019</year>). <article-title>An enhanced extreme learning machine model for river flow forecasting: state-of-the-art, practical applications in water resource engineering area and future research direction</article-title>. <source>J. Hydrol.</source> <volume>569</volume>, <fpage>387</fpage>&#x02013;<lpage>408</lpage>. <pub-id pub-id-type="doi">10.1016/j.jhydrol.2018.11.069</pub-id></citation>
</ref>
<ref id="B27">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Yuan</surname> <given-names>F.</given-names></name> <name><surname>Zhao</surname> <given-names>C.</given-names></name> <name><surname>Jiang</surname> <given-names>Y.</given-names></name> <name><surname>Ren</surname> <given-names>L.</given-names></name> <name><surname>Shan</surname> <given-names>H.</given-names></name> <etal/></person-group>. (<year>2017</year>). <article-title>Evaluation on uncertainty sources in projecting hydrological changes over the Xijiang River basin in South China</article-title>. <source>J. Hydrol.</source> <volume>554</volume>, <fpage>434</fpage>&#x02013;<lpage>450</lpage>. <pub-id pub-id-type="doi">10.1016/j.jhydrol.2017.08.034</pub-id></citation>
</ref>
<ref id="B28">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>Q.</given-names></name> <name><surname>Benveniste</surname> <given-names>A.</given-names></name></person-group> (<year>1992</year>). <article-title>Wavelet networks</article-title>. <source>IEEE Trans. Neural Netw.</source> <volume>3</volume>, <fpage>889</fpage>&#x02013;<lpage>898</lpage>. <pub-id pub-id-type="doi">10.1109/72.165591</pub-id><pub-id pub-id-type="pmid">18276486</pub-id></citation></ref>
<ref id="B29">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhang</surname> <given-names>Z.</given-names></name> <name><surname>Zhang</surname> <given-names>Q.</given-names></name> <name><surname>Singh</surname> <given-names>V. P.</given-names></name></person-group> (<year>2018</year>). <article-title>Univariate streamflow forecasting using commonly used data-driven models: literature review and case study</article-title>. <source>Hydrol. Sci. J.</source> <volume>63</volume>, <fpage>1091</fpage>&#x02013;<lpage>1111</lpage>. <pub-id pub-id-type="doi">10.1080/02626667.2018.1469756</pub-id></citation>
</ref>
<ref id="B30">
<citation citation-type="journal"><person-group person-group-type="author"><name><surname>Zhu</surname> <given-names>Y.</given-names></name> <name><surname>Jiang</surname> <given-names>J.</given-names></name> <name><surname>Huang</surname> <given-names>C.</given-names></name> <name><surname>Chen</surname> <given-names>Y. D.</given-names></name> <name><surname>Zhang</surname> <given-names>Q.</given-names></name></person-group> (<year>2019</year>). <article-title>Applications of multiscale change point detections to monthly stream flow and rainfall in Xijiang River in southern China, part I: correlation and variance</article-title>. <source>Theor. Appl. Climatol.</source> <volume>136</volume>, <fpage>237</fpage>&#x02013;<lpage>248</lpage>. <pub-id pub-id-type="doi">10.1007/s00704-018-2480-y</pub-id></citation>
</ref>
</ref-list>
</back>
</article>