Issue | Ciência Téc. Vitiv., Volume 40, Number 1, 2025
---|---
Page(s) | 1-9
DOI | https://doi.org/10.1051/ctv/ctv2025400101
Published online | 19 February 2025
Article
Identification of green leafhoppers (Cicadellidae) in vineyards through an automatic image acquisition system from yellow sticky traps associated with deep learning
1 MARE - Marine and Environmental Sciences Centre & ARNET - Aquatic Research Infrastructure Network Associated Laboratory and Department of Physics, Faculty of Sciences, University of Lisbon, Campo Grande, 1749-016, Lisboa, Portugal
2 CESAM_Ciências - Centre for Environmental and Marine Studies and Department of Animal Biology, Faculty of Sciences, University of Lisbon, Campo Grande, 1749-016, Lisboa, Portugal
3 Ce3C-CHANGE - Centre for Ecology, Evolution and Environmental Changes and Global Change & Sustainability Institute, Department of Animal Biology, Faculty of Sciences, University of Lisbon, 1749-016, Lisboa, Portugal
4 Department of Agronomy, Food, Natural Resources, Animals and Environment, University of Padova, viale dell'Università, 16 35020, Legnaro (PD), Italy
5 LEAF - Linking Landscape, Environment, Agriculture and Food, School of Agriculture, University of Lisbon, Tapada da Ajuda, 1349-017, Lisboa, Portugal
6 C-MAST - Centre for Mechanical and Aerospace Science and Technologies, Department of Electromechanical Engineering, University of Beira Interior, 6201-001, Covilhã, Portugal
7 CEF - Forest Research Centre, School of Agriculture, University of Lisbon, Tapada da Ajuda, 1349-017, Lisboa, Portugal
8 TERRA - Sustainable Land Use and Ecosystem Services Associated Laboratory, School of Agriculture, University of Lisbon, Tapada da Ajuda, 1349-017, Lisboa, Portugal
* Corresponding author: Tel.: +351 213653226; e-mail: jsantossilva@isa.ulisboa.pt
Received: 24 July 2024
Accepted: 28 January 2025
This work presents an innovative approach to expedite the identification of green leafhoppers by combining a deep-learning algorithm with an automatic camera system that captures high-resolution images of yellow sticky traps. Identifying and monitoring agricultural insects is crucial for implementing effective pest management strategies. Conventional insect identification and counting methods can be time-consuming and labor-intensive, urging the need for efficient and accurate automated solutions. The deep-learning algorithm, based on convolutional neural networks (CNNs), learns discriminative features of the green leafhopper species complex through training on a diverse set of insect images. The model's architecture was optimized to handle the variations in lighting conditions, angles, and orientations commonly found in field settings. To assess the algorithm's efficacy, the test images were also evaluated by human curation, and results were accounted for in terms of false positives and false negatives. The results demonstrated the algorithm's capability to accurately identify green leafhopper species, improving identification speed compared with conventional methods while maintaining a high level of precision (80%) and a harmonic mean of precision and recall (F1) of 0.85. The combination of a deep-learning algorithm and real-time data acquisition enables fast decision-making by technicians and researchers, supports the implementation of pest management strategies, and demonstrates promising potential for specific and sustainable pest monitoring, contributing to the progress of precision farming practices.
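As a quick consistency check on the reported metrics (precision 0.80, F1 0.85), the recall implied by the F1 definition can be recovered algebraically from F1 = 2PR/(P + R). This sketch is illustrative and not part of the original analysis:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reported in the abstract: precision = 0.80, F1 = 0.85.
# Rearranging F1 = 2PR / (P + R) gives R = F1 * P / (2P - F1).
precision, f1 = 0.80, 0.85
recall = f1 * precision / (2 * precision - f1)

print(f"implied recall = {recall:.3f}")  # ≈ 0.907
```

In other words, the reported precision and F1 jointly imply a recall of roughly 0.91, i.e. few green leafhoppers on the traps were missed by the detector.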
Key words: Automatic traps / pest monitoring / cicadellids / Vitis vinifera
© Proença et al., 2025
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.