AUTHOR=Laekeman Broes, Bonte Jochem, Dermauw Wannes, Christiaens Annelies, Gobin Bruno, Van Huylenbroeck Johan, Dhooghe Emmy, Lootens Peter
TITLE=Species-level detection of thrips and whiteflies on yellow sticky traps using YOLO-based deep learning detection models
JOURNAL=Frontiers in Plant Science
VOLUME=16
YEAR=2025
URL=https://www.frontiersin.org/journals/plant-science/articles/10.3389/fpls.2025.1668795
DOI=10.3389/fpls.2025.1668795
ISSN=1664-462X
ABSTRACT=Pest insects such as thrips and whiteflies currently cause the loss of 20% - 40% of global agricultural yield. To reduce chemical pesticide use while maintaining high horticultural quality standards, early detection of pest infestations is essential. Although AI-assisted pest monitoring systems based on sticky trap images already exist, none currently enables effective species-level detection of thrips and/or whiteflies. Early species-level identification, however, would allow more targeted, species-specific control strategies, leading to reduced, localized, and more efficient pesticide application. In this study, we therefore evaluated the potential and limitations of real-time species-level detection of thrips (Frankliniella occidentalis and Echinothrips americanus) and whiteflies (Bemisia tabaci and Trialeurodes vaporariorum) on non-microscopic RGB images of yellow sticky traps using recent YOLO-based deep learning detection models. To this end, a balanced, labelled image dataset was assembled, comprising the studied pest species caught on a single type of yellow sticky trap. Various versions of the YOLO11 and YOLO-NAS detection architectures were then trained and tested on this dataset at several (digitally reduced) pixel resolutions. All models tested on the high-resolution dataset (pixel size: 5 µm) achieved species-level detection of the studied pests on an independent test dataset (mAP@50: 79% - 89% | F1@50: 74% - 87%). Even the smallest model (YOLO11n) delivered feasible macro-averaged (mAP@50: 80% | F1@50: 77%) and classwise performance scores (AP@50: 72% - 85% | F1@50: 68% - 82%). The minimum pixel resolution required for feasible species-level detection in greenhouse horticulture was identified as 80 µm for both the YOLO11n and YOLO11x models, enabling the use of modern smartphones, action cameras, or low-cost standalone camera modules. Combined with the low complexity and solid performance of the YOLO11n model, these results demonstrate the potential for feasible, real-time, automated species-level monitoring of (yellow) sticky traps in greenhouse horticulture. Future research should focus on extending this technology to additional pest species, sticky trap types, and ambient light conditions.
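
As a rough illustration of the workflow described in the abstract (not the authors' code), the sketch below shows how a YOLO11 detection model could be trained and evaluated on a labelled sticky-trap dataset with the Ultralytics Python API; the dataset file name, image size, epoch count, and batch size are assumptions chosen for illustration only, and mAP@50 is read from the validation metrics.

    # Minimal sketch, assuming a YOLO-format dataset description in sticky_traps.yaml
    # (hypothetical file) with the four studied species as classes.
    from ultralytics import YOLO

    # Start from the pretrained YOLO11 nano checkpoint (smallest model in the study).
    model = YOLO("yolo11n.pt")

    # Train on the labelled sticky-trap images; settings here are illustrative, not the paper's.
    model.train(data="sticky_traps.yaml", imgsz=1280, epochs=100, batch=16)

    # Evaluate on the held-out split; the returned metrics include mAP at IoU 0.50.
    metrics = model.val(split="test")
    print(metrics.box.map50)   # macro-averaged mAP@50 over all classes

Evaluating the same trained model on copies of the test images that have been digitally downsampled to coarser pixel sizes would mirror the resolution study reported above, but the exact resampling procedure is not specified in the abstract.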