Knowledge Center Catalog

Temporal Sentinel-2 imagery for wheat mapping and monitoring : Analyzing phenological stages with machine learning to improve mapping precision for small farms

Bikesh Bade

Temporal Sentinel-2 imagery for wheat mapping and monitoring : Analyzing phenological stages with machine learning to improve mapping precision for small farms - Germany : Copernicus GmbH, 2025.

Peer reviewed | Open Access

Precise mapping and tracking of wheat crops are crucial for improving agricultural management, particularly for small farms in challenging landscapes such as Nepal. This research uses temporal Sentinel-2 imagery to map wheat fields by examining phenological stages with machine learning methods, which enhances classification accuracy. Sentinel-2, a component of the European Space Agency's Copernicus program, offers high-quality multispectral images for precise monitoring of crop growth over time. Two classification models, Random Forest (RF) and Support Vector Machine (SVM), were employed to distinguish wheat from non-wheat areas. Classification accuracy was further improved by integrating in-situ data collected with Kobo Toolbox. The findings showed that Random Forest performed better than SVM, reaching 99% accuracy in training and 86% in validation, with 56% of the study region classified as wheat. RF's strong performance is due to its capacity to handle temporal and spectral intricacies, particularly in capturing the phenological cycle of crops. This research shows how machine learning, specifically Random Forest, can improve the accuracy of wheat mapping for small farms by analyzing phenological stages effectively, with plans to apply these methods to rice and maize in the future.
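The workflow summarized in the abstract, training Random Forest and SVM classifiers on temporal Sentinel-2 features to separate wheat from non-wheat pixels, can be illustrated with a minimal sketch. The sketch below uses scikit-learn with synthetic per-pixel NDVI time-series features as placeholders; the feature layout, number of acquisition dates, and labels are assumptions made for illustration, not the authors' actual data or pipeline.

# Illustrative sketch only: wheat vs. non-wheat classification from temporal
# spectral features with Random Forest and SVM. Synthetic data stands in for
# the study's Sentinel-2 stack and in-situ labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

# Hypothetical feature table: one row per pixel, one column per NDVI value
# from each Sentinel-2 acquisition date across the wheat phenological cycle.
n_pixels, n_dates = 2000, 8
X = rng.uniform(0.1, 0.9, size=(n_pixels, n_dates))
y = rng.integers(0, 2, size=n_pixels)  # 1 = wheat, 0 = non-wheat (placeholder labels)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

models = {
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM (RBF)": SVC(kernel="rbf", C=1.0, gamma="scale"),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    train_acc = accuracy_score(y_train, model.predict(X_train))
    val_acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: training accuracy {train_acc:.2f}, validation accuracy {val_acc:.2f}")

With real imagery, the feature matrix would typically be built by stacking Sentinel-2 observations across the wheat season, and the labels would come from field points such as those collected with Kobo Toolbox in the study.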


Text in English

ISSN 2194-9042 (Print), 2194-9050 (Online)

https://doi.org/10.5194/isprs-annals-X-G-2025-143-2025


Machine learning
Phenology
Small farms
Wheat


Nepal
