<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="editorial" dtd-version="2.3" xml:lang="EN">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Plant Sci.</journal-id>
<journal-title>Frontiers in Plant Science</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Plant Sci.</abbrev-journal-title>
<issn pub-type="epub">1664-462X</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="doi">10.3389/fpls.2024.1368694</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Plant Science</subject>
<subj-group>
<subject>Editorial</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Editorial: Remote sensing for field-based crop phenotyping</article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Liu</surname>
<given-names>Jiangang</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<uri xlink:href="https://loop.frontiersin.org/people/429108"/>
<role content-type="https://credit.niso.org/contributor-roles/conceptualization/"/>
<role content-type="https://credit.niso.org/contributor-roles/funding-acquisition/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-original-draft/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-review-editing/"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname>Zhou</surname>
<given-names>Zhenjiang</given-names>
</name>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<uri xlink:href="https://loop.frontiersin.org/people/2071907"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-review-editing/"/>
</contrib>
<contrib contrib-type="author" corresp="yes">
<name>
<surname>Li</surname>
<given-names>Bo</given-names>
</name>
<xref ref-type="aff" rid="aff3">
<sup>3</sup>
</xref>
<xref ref-type="author-notes" rid="fn001">
<sup>*</sup>
</xref>
<uri xlink:href="https://loop.frontiersin.org/people/2072211"/>
<role content-type="https://credit.niso.org/contributor-roles/conceptualization/"/>
<role content-type="https://credit.niso.org/contributor-roles/resources/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-original-draft/"/>
<role content-type="https://credit.niso.org/contributor-roles/writing-review-editing/"/>
</contrib>
</contrib-group>
<aff id="aff1">
<sup>1</sup>
<institution>State Key Laboratory of Vegetable Biobreeding, Institute of Vegetables and Flowers, Chinese Academy of Agricultural Sciences</institution>, <addr-line>Beijing</addr-line>, <country>China</country>
</aff>
<aff id="aff2">
<sup>2</sup>
<institution>College of Biosystems Engineering and Food Science, Zhejiang University</institution>, <addr-line>Hangzhou, Zhejiang</addr-line>, <country>China</country>
</aff>
<aff id="aff3">
<sup>3</sup>
<institution>Syngenta, Jealott&#x2019;s Hill International Research Centre</institution>, <addr-line>Bracknell</addr-line>, <country>United Kingdom</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>Edited and Reviewed by: Peng Chen, Anhui University, China</p>
</fn>
<fn fn-type="corresp" id="fn001">
<p>*Correspondence: Bo Li, <email xlink:href="mailto:bo.li-1@syngenta.com">bo.li-1@syngenta.com</email>
</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>26</day>
<month>01</month>
<year>2024</year>
</pub-date>
<pub-date pub-type="collection">
<year>2024</year>
</pub-date>
<volume>15</volume>
<elocation-id>1368694</elocation-id>
<history>
<date date-type="received">
<day>11</day>
<month>01</month>
<year>2024</year>
</date>
<date date-type="accepted">
<day>16</day>
<month>01</month>
<year>2024</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#xa9; 2024 Liu, Zhou and Li</copyright-statement>
<copyright-year>2024</copyright-year>
<copyright-holder>Liu, Zhou and Li</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.</p>
</license>
</permissions>
<related-article id="RA1" related-article-type="commentary-article" xlink:href="https://www.frontiersin.org/research-topics/49674" ext-link-type="uri">Editorial on the Research Topic <article-title>Remote sensing for field-based crop phenotyping</article-title>
</related-article>
<kwd-group>
<kwd>smart agriculture</kwd>
<kwd>remote sensing</kwd>
<kwd>crop phenotyping</kwd>
<kwd>computer vision</kwd>
<kwd>machine learning</kwd>
<kwd>data fusion</kwd>
</kwd-group>
<counts>
<fig-count count="0"/>
<table-count count="0"/>
<equation-count count="0"/>
<ref-count count="0"/>
<page-count count="3"/>
<word-count count="1371"/>
</counts>
<custom-meta-wrap>
<custom-meta>
<meta-name>section-in-acceptance</meta-name>
<meta-value>Sustainable and Intelligent Phytoprotection</meta-value>
</custom-meta>
</custom-meta-wrap>
</article-meta>
</front>
<body>
<p>With the global population predicted to exceed 9.6 billion by 2050, and food demand anticipated to increase by between 60 and 100%, sustainable and resilient agricultural production with minimal environmental impact is crucial, particularly in the context of global climate change. Breeding and identifying crop varieties that combine high productivity with adaptation to specific environmental conditions requires considerable effort to assess crop phenotypic traits (e.g., LAI, plant height, biomass, and yield), which contribute to stable productivity gains and the efficient use of resources. Traditional methods for determining crop phenotypic traits rely mainly on destructive field sampling and hand-held instrument measurements, which are time-consuming and of limited representativeness. Remote sensing provides a novel solution for quantifying the structural and functional traits of field crops in a timely, rapid, non-invasive, and efficient manner. With the development of sensors and diversified algorithms, a range of crop phenotypic traits can now be determined in a high-throughput manner, including morphological parameters, spectral and textural characteristics, physiological traits, and responses to abiotic/biotic stresses under different environments.</p>
<p>This Research Topic presents 15 research articles and 2 reviews offering insight into recent advances in crop phenotyping using remote sensing. Together they address several high-priority challenges, including comparing ground-based handheld and aerial remote sensing systems, estimating plant biophysical parameters with innovative traits, fusing imagery data, and investigating the impact of experimental design on the performance of trait extraction.</p>
<p>The current state-of-the-art high-throughput phenotyping platforms, covering both ground-based and aerial systems and their integrated imaging sensors, such as visible light, 3D, hyperspectral, and fluorescence sensors, were reviewed by <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1219673">Cudjoe et&#xa0;al.</ext-link>, who pointed out that the lack of appropriate field phenotyping infrastructure is impeding the development of new crop cultivars with improved traits and will eventually have a negative impact on the agricultural sector and African food security. Owing to operational complexity and limited funding, efficient and low-cost high-throughput phenotyping methods remain in high demand in Africa. In addition to ground-based and aerial remote sensing, <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1214006">Lin et&#xa0;al.</ext-link> also described satellite-based remote sensing for potato yield prediction. Furthermore, strategies for potato yield prediction involving remote sensing, crop growth models (CGMs), and yield-limiting factors were discussed in depth, and the applications of different CGMs were analysed. As data from a single sensor source often limit the performance of a prediction model, <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1214006">Lin et&#xa0;al.</ext-link> proposed that multi-source data fusion and time-series data have enormous potential for future potato yield prediction. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1111575">Marzougui et&#xa0;al.</ext-link> investigated two methods for fusing unmanned aerial system (UAS) multispectral imagery with high-resolution satellite imagery for field pea yield prediction, showing improved model performance when information from multiple time points and multiple sensor sources was fused. 
Using only a UAS, <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1242948">Liu et&#xa0;al.</ext-link> fused texture features extracted from RGB imagery with spectral features from multispectral imagery to estimate the frost damage index in lettuce. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1277672">Bai et&#xa0;al.</ext-link> applied more comprehensive sensor data fusion in an existing Field-based High-Throughput Plant Phenotyping (FHTPP) system, combining morphological, spectral, thermal, and environmental features as inputs to machine learning models for estimating cover crop biomass. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1237988">Shi et&#xa0;al.</ext-link> recognised the limitation of using single-source remote sensing spectral or LiDAR waveform data for Leaf Area Index (LAI) estimation and presented another example of the advantage of data fusion. Rather than relying on an empirical model, which requires a sufficient number of field measurements and offers little explanation of the fusion mechanism, a physical geometric-optical and radiative transfer (GORT) model integrating both spectral imagery and LiDAR waveforms enhanced LAI estimation compared with using spectral or LiDAR data alone. As a critical physiological and biochemical parameter indicating crop growth status and yield potential, LAI estimation was also investigated for winter wheat and maize in two further studies by <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1272049">Zou et&#xa0;al.</ext-link> and <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1158837">Sun et&#xa0;al</ext-link>. Although both studies used a single multispectral imaging sensor mounted on a UAS, both spectral and texture features were extracted and fused as input variables for the machine learning models. 
Multiple multivariate statistical regression models were constructed and compared, and all showed high performance. All the texture features mentioned above are based on the grey level co-occurrence matrix (GLCM) method; however, texture metrics are highly sensitive to the window size and direction parameters, for which previous studies have typically applied default values. Accordingly, <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1284235">Liu et&#xa0;al.</ext-link> conducted a detailed study to determine the optimum window size and direction parameters at different growth stages of rice. As most prediction models in previous studies were developed for a specific growth stage, <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1248152">Pokhrel et&#xa0;al.</ext-link> integrated growing degree days (GDD) with vegetation indices (VIs) extracted from UAV-based multispectral imagery to estimate intercepted photosynthetically active radiation (IPARf), radiation use efficiency (RUE), and harvest index (HI), three yield-contributing physiological parameters of cotton. Incorporating GDD allowed predictions to be made reliably throughout the whole growing season.</p>
<p>Many VIs developed over the past decades correlate with crop physiological parameters; however, no previous study had investigated which VIs can be applied to predict corn yield throughout the season. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1168732">Shrestha et&#xa0;al.</ext-link> applied both correlation analysis and a random forest model to identify the VIs with the greatest consistency and highest predictive power for corn yield prediction.</p>
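The general shape of such a VI screening workflow can be sketched as follows, using synthetic band reflectances and yields with scikit-learn's RandomForestRegressor. This is an assumption-laden illustration of the technique, not the authors' data or code; the two candidate indices (NDVI, GNDVI) and all numeric values are invented.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(1)
n = 200
red = rng.uniform(0.02, 0.4, n)
nir = rng.uniform(0.3, 0.6, n)
green = rng.uniform(0.05, 0.3, n)

# Two common vegetation indices among the many candidates in the literature.
ndvi = (nir - red) / (nir + red)
gndvi = (nir - green) / (nir + green)

# Synthetic yield driven mostly by NDVI, just to exercise the workflow.
yield_t_ha = 6.0 + 4.0 * ndvi + rng.normal(0.0, 0.2, n)

X = np.column_stack([ndvi, gndvi])
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, yield_t_ha)

# Rank candidate VIs by Pearson correlation and by random forest importance.
corr = [np.corrcoef(X[:, j], yield_t_ha)[0, 1] for j in range(X.shape[1])]
importance = rf.feature_importances_
```

Running both rankings over imagery from every acquisition date, as the study does conceptually, identifies indices that stay predictive across the season.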
<p>Evaluating crop traits on a row-segment basis within plots is common in field phenotyping. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1202536">Tolley et&#xa0;al.</ext-link> conducted UAS flights, extracted crop traits using RGB, LiDAR, and VNIR (visible and near-infrared) sensors, and found significant differences between row selections; for large plot sizes, excluding the outer rows could lead to more robust model performance. This study can support long-standing principles of experimental design in both agronomy and crop breeding with remote sensing.</p>
<p>Apart from aerial platforms, ground-based systems are also popular in crop phenotyping because they offer high-resolution imaging across a wide range of wavelengths. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1260772">Tuerxun et&#xa0;al.</ext-link> used a portable spectroradiometer to obtain hyperspectral data and, after feature reduction, developed multivariate regression models for estimating chlorophyll content in jujube leaves. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1208404">He et&#xa0;al.</ext-link> applied a handheld spectrometer to generate VIs and a light quantum sensor to measure photosynthetically active radiation, which showed fairly good relationships with the dry matter of Choy Sum. Although the same VIs can be derived from both ground-based and aerial systems, <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1233892">Herr and Carter</ext-link> interestingly found poor correlation across remote sensing platforms, suggesting that data collected from different systems should not be used interchangeably.</p>
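The feature-reduction-then-regression recipe used for hyperspectral trait estimation can be sketched minimally as follows, assuming PCA plus ordinary least squares from scikit-learn on synthetic leaf spectra. The actual studies may use different reduction and regression methods; every array and coefficient here is invented for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
n_leaves, n_bands = 120, 300          # reflectance spectra, e.g. across 400-1000 nm
spectra = rng.uniform(0.05, 0.6, (n_leaves, n_bands))

# Synthetic chlorophyll values tied to two arbitrary bands plus noise.
chlorophyll = 30 + 20 * spectra[:, 50] - 15 * spectra[:, 200] + rng.normal(0, 1, n_leaves)

# Reduce hundreds of collinear bands to a handful of components before regressing,
# which is the role feature reduction plays in the hyperspectral studies above.
model = make_pipeline(PCA(n_components=10), LinearRegression())
model.fit(spectra, chlorophyll)
r2 = model.score(spectra, chlorophyll)
```

In practice, cross-validated scores and a reduction step chosen for the sensor at hand (e.g., band selection rather than PCA) would replace this toy fit.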
<p>Crop phenotyping can not only assess crop growth status but also support decisions on crop management strategy. <ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1143525">Deng et&#xa0;al.</ext-link> collected phenotypic data to support the conclusion that an umbrella-shaped trellis system largely improved the production of Donghong kiwifruit while maintaining fruit quality.</p>
<p>
<ext-link ext-link-type="uri" xlink:href="https://doi.org/10.3389/fpls.2023.1146490">Zhou et&#xa0;al.</ext-link> presented the only study of ground-based 3D model reconstruction and analysis in this Research Topic. A new algorithm was proposed for skeleton point search, which facilitated stem and leaf segmentation and, in turn, the estimation of five phenotypic parameters: plant height, stem diameter, main stem length, regional leaf length, and leaf number.</p>
<p>The studies published in this Research Topic present novel computer vision algorithms and provide new knowledge in the broad field of crop phenotyping. UAS integrated with imaging sensors have shown clear advantages for crop phenotyping. Future studies should further investigate multi-sensor and multi-temporal data fusion for more reliable crop trait estimation under various environmental conditions. As data collected from different remote sensing systems cannot be used interchangeably, standardised remote sensing pipelines for crop trait estimation should be investigated in the future to provide reusable data and reduce operational costs. We hope the studies presented in this Research Topic will help consolidate the integration of remote sensing and crop phenotyping, resulting in more reliable and affordable phenotyping tools for dynamic environments.</p>
<sec id="s1" sec-type="author-contributions">
<title>Author contributions</title>
<p>JL: Conceptualization, Funding acquisition, Writing &#x2013; original draft, Writing &#x2013; review &amp; editing. ZZ: Writing &#x2013; review &amp; editing. BL: Conceptualization, Resources, Writing &#x2013; original draft, Writing &#x2013; review &amp; editing.</p>
</sec>
</body>
<back>
<sec id="s2" sec-type="funding-information">
<title>Funding</title>
<p>The author(s) declare financial support was received for the research, authorship, and/or publication of this article. This work was supported by the National Key R&amp;D Program of China (2023YFD2302100), the National Natural Science Foundation of China (32372232), and the Key Scientific and Technological Projects of Heilongjiang Province, China (2021ZXJ05A05-03), awarded to Jiangang Liu, as well as by the Science and Technology Department of Ningxia - Key Research and Development Program (2023BCF01017).</p>
</sec>
<sec id="s3" sec-type="COI-statement">
<title>Conflict of interest</title>
<p>Author BL was employed by company Syngenta.</p>
<p>The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<sec id="s4" sec-type="disclaimer">
<title>Publisher&#x2019;s note</title>
<p>All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.</p>
</sec>
</back>
</article>