000 03840nab|a22003257a|4500
001 65765
003 MX-TxCIM
005 20221207173120.0
008 20231s2023||||mx |||p|op||||00||0|eng|d
022 _a2352-9385
024 8 _ahttps://doi.org/10.1016/j.rsase.2022.100859
040 _aMX-TxCIM
041 _aeng
100 1 _aMollick, T.
_910702
245 1 0 _aGeospatial-based machine learning techniques for land use and land cover mapping using a high-resolution unmanned aerial vehicle image
260 _aAmsterdam (Netherlands) :
_bElsevier,
_c2023.
500 _aPeer review
500 _aReference only
520 _aBangladesh is primarily an agricultural country, where technological advancement in the agricultural sector can accelerate economic growth and ensure long-term food security. This research was conducted in the south-western coastal zone of Bangladesh, where rice is the main crop, although other crops are also grown. Land use and land cover (LULC) classification using remote sensing techniques, such as satellite or unmanned aerial vehicle (UAV) images, can forecast crop yield and can also provide information on weeds, nutrient deficiencies, diseases, etc., for monitoring and treating crops. Remotely sensed images store a digital number (DN) for each pixel, depending on the reflectance received by the sensors. Traditionally, these pixel values have been used to separate clusters and classify various objects. However, this frequently generates discontinuity within a single land cover, producing small spurious objects that degrade the classification output; this is called the salt-and-pepper effect. To smooth this discontinuity and classify land cover based on texture, shape, and neighbors, Pixel-Based Image Analysis (PBIA) and Object-Based Image Analysis (OBIA) methods use digital image classification algorithms such as Maximum Likelihood (ML), K-Nearest Neighbors (KNN), and k-means clustering. Taking into consideration the development of UAV technology and enhanced image resolution, the authors evaluated the accuracy of both the PBIA and OBIA approaches by classifying the land cover of an agricultural field. For classifying multispectral UAV images, the KNN machine learning algorithm was used for object-based supervised image classification and Maximum Likelihood (ML) classification (parametric) was used for pixel-based supervised image classification, whereas the k-means clustering technique was used for pixel-based unsupervised classification. For image analysis, the Near-infrared (NIR), Red (R), Green (G), and Blue (B) bands of a high-resolution UAV image with a ground sampling distance (GSD) of 0.0125 m were used. The study found that OBIA was 21% more accurate than PBIA, with an overall accuracy of 94.9%, and 27% more accurate in terms of Kappa statistics, with a Kappa accuracy of 93.4%. This indicates that OBIA provides better classification performance than PBIA for high-resolution UAV images. The study therefore suggests OBIA for more accurate identification of crop types and land cover, which will make crop management, agricultural monitoring, and crop yield forecasting more effective.
546 _aText in English
591 _aMollick, T. : Not in IRS staff list but has CIMMYT affiliation
650 7 _aLand cover mapping
_2AGROVOC
_928423
650 7 _aMachine learning
_2AGROVOC
_911127
650 7 _aUnmanned aerial vehicles
_2AGROVOC
_911401
651 7 _aBangladesh
_2AGROVOC
_91424
700 1 _aAzam, M.G.
_929380
700 1 _aKarim, S.
_929381
773 0 _tRemote Sensing Applications: Society and Environment
_gv. 29, art. 100859
_dAmsterdam (Netherlands) : Elsevier, 2023
_x2352-9385
942 _cJA
_n0
_2ddc
999 _c65765
_d65757