NDVI Estimation Throughout the Whole Growth Period of Multi-Crops Using RGB Images and Deep Learning

The Normalized Difference Vegetation Index (NDVI) is an important remote sensing index that is widely used to assess vegetation coverage, monitor crop growth, and predict yields. Traditional NDVI calculation methods often rely on multispectral or hyperspectral imagery, which is costly and complex to operate, limiting its applicability on small-scale farms and in developing countries. To address these limitations, this study proposes an NDVI estimation method based on low-cost RGB (red, green, and blue) UAV (unmanned aerial vehicle) imagery combined with deep learning techniques. This study utilizes field data from five major crops (cotton, rice, maize, rape, and wheat) throughout their whole growth periods. RGB images were used to extract conventional features, including color indices (CIs), texture features (TFs), and vegetation coverage, while convolutional features (CFs) were extracted using the deep learning network ResNet50 to optimize the model.
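For background, the standard NDVI definition and a typical RGB-derived color index of the kind the study uses can be sketched as follows. This is an illustrative sketch, not the paper's implementation: the Excess Green (ExG) index shown here is one common CI chosen as an example, and the function names are hypothetical.

```python
import numpy as np

def ndvi(nir, red):
    """Reference NDVI from multispectral bands: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    denom[denom == 0] = 1.0  # guard against division by zero
    return (nir - red) / denom

def excess_green(rgb):
    """Excess Green (ExG) color index, a common RGB-only greenness proxy:
    ExG = 2g - r - b, where r, g, b are sum-normalized channel values."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0  # black pixels stay at ExG = 0
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2 * g - r - b
```

Indices like ExG respond to canopy greenness but, unlike NDVI, carry no near-infrared information, which is why the study supplements such conventional features with convolutional features from ResNet50.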

The results indicate that the model, optimized with CFs, significantly enhanced NDVI estimation accuracy. Specifically, the R² values for maize, rape, and wheat over their whole growth periods reached 0.99, while those for rice and cotton were 0.96 and 0.93, respectively.

Notably, the accuracy improvement in later growth periods was most pronounced for cotton and maize, with average R² increases of 0.15 and 0.14, respectively, whereas wheat exhibited a more modest improvement of only 0.04. This method leverages deep learning to capture structural changes in crop populations, optimizing conventional image features and improving NDVI estimation accuracy.

This study presents an NDVI estimation approach applicable to the whole growth period of common crops, particularly those with significant population variations, and provides a valuable reference for estimating other vegetation indices using low-cost UAV-acquired RGB images.
