Knowledge Center Catalog


Temporal Sentinel-2 imagery for wheat mapping and monitoring : Analyzing phenological stages with machine learning to improve mapping precision for small farms

Material type: Article
Language: English
Publication details: Germany : Copernicus GmbH, 2025.
ISSN:
  • 2194-9042
  • 2194-9050 (Online)
In: ISPRS Annals of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Germany : Copernicus GmbH, 2025, v. G-2025, p. 143-149
Summary: Precise mapping and tracking of wheat crops are crucial to improve agricultural management, particularly for small farms in challenging landscapes such as Nepal. By utilizing temporal Sentinel-2 imagery, this research maps wheat fields by examining phenological stages using machine learning methods, which enhances classification accuracy. Sentinel-2, a component of the Copernicus program by the European Space Agency, offers high-quality multispectral images for precise monitoring of crop growth over time. Two classification models, Random Forest (RF) and Support Vector Machine (SVM), were employed to distinguish wheat from non-wheat areas. The accuracy of classification was improved by integrating in-situ data collected with Kobo Toolbox. The findings showed that Random Forest performed better than SVM, reaching 99% accuracy in training and 86% in validation, with 56% of the study region classified as wheat. RF's outstanding performance is due to its capacity to manage temporal and spectral intricacies, particularly in capturing the phenological cycle of crops. This research showcases how machine learning, specifically Random Forest, can enhance the accuracy of wheat mapping for small farms by analyzing phenological stages effectively, with plans to apply these methods to rice and maize in the future.
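
The record does not include code; the lines below are a minimal sketch, assuming scikit-learn and a purely synthetic feature matrix (array shapes, variable names, and parameter values are illustrative, not taken from the paper), of how labelled pixels with temporal Sentinel-2 features could be split into training and validation sets and passed to the two classifiers compared in the study, Random Forest and SVM.

# Minimal sketch: Random Forest vs. SVM for wheat / non-wheat classification
# on temporal Sentinel-2 features. The feature matrix here is synthetic; in
# practice each row would hold the multi-date spectral bands (or vegetation
# indices) of one labelled pixel, with labels from in-situ points such as
# those collected with Kobo Toolbox.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)

# Hypothetical stand-in for temporal Sentinel-2 features:
# 2000 labelled pixels x (10 bands x 6 acquisition dates) = 60 features.
X = rng.normal(size=(2000, 60))
y = rng.integers(0, 2, size=2000)  # 1 = wheat, 0 = non-wheat

X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)

models = {
    "Random Forest": RandomForestClassifier(n_estimators=500, random_state=42),
    "SVM (RBF)": SVC(kernel="rbf", C=1.0, gamma="scale"),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    train_acc = accuracy_score(y_train, model.predict(X_train))
    val_acc = accuracy_score(y_val, model.predict(X_val))
    print(f"{name}: training accuracy {train_acc:.2f}, validation accuracy {val_acc:.2f}")

With real data, training and validation accuracies would be compared per classifier, as the study does when reporting Random Forest's 99% training and 86% validation accuracy.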
Holdings
  Item type: Conference paper
  Current library: CIMMYT Knowledge Center: John Woolston Library
  Collection: CIMMYT Staff Publications Collection
  Status: Available
Total holds: 0

Peer review

Open Access


Text in English

Dristy Bajimaya: Not in IRS staff list but has CIMMYT affiliation

Pinjarla, B.: Not in IRS staff list but has CIMMYT affiliation

Kamal, M.: Not in IRS staff list but has CIMMYT affiliation

Centro Internacional de Mejoramiento de Maíz y Trigo (CIMMYT)

