Dmitriev E., Zotov S., Melnik P. Retrieval of the composition of mixed forest stands based on the spectral and texture classification of high-resolution satellite images // Resources and Technology. 2020. №4, Vol. 17. P. 65‒79.



DOI: 10.15393/j2.art.2020.5502

Retrieval of the composition of mixed forest stands based on the spectral and texture classification of high-resolution satellite images

Dmitriev Egor, Institute of Numerical Mathematics, Russian Academy of Sciences, yegor@mail.ru
Zotov Sergey, Moscow Institute of Physics and Technology, zotov.sa@mipt.ru
Melnik Petr, Bauman Moscow State Technical University, melnik_petr@bk.ru
Key words: remote sensing, pattern recognition, thematic processing, texture analysis, high-resolution satellite images, classification of tree stands
Summary: The development of satellite equipment makes it possible to obtain multispectral and panchromatic images of high spatial resolution, which opens up new possibilities for improving the accuracy and detail of remote sensing of soil and vegetation cover through the combined use of spectral and textural features of the objects under study. In this paper, we propose a method for recognizing the species composition and age classes of mixed forest stands based on the joint processing of WorldView-2 multispectral and panchromatic satellite images. Haralick statistical features were used to describe the texture characteristics. The supervised classification was performed with a previously developed modified decoding method, which belongs to the class of ensemble classification methods. To assess the effectiveness of the proposed approach, test calculations were carried out for the joint processing of high-resolution images of a selected area of the Savvatievskoe forestry (Tver region), and the results were compared with ground forest inventory data. When interpreting the results, a group of natural factors causing discrepancies between satellite and ground information was identified.
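
The summary mentions Haralick statistical texture features computed from the panchromatic band. Below is a minimal sketch of how such gray-level co-occurrence matrix (GLCM) features can be computed with scikit-image; the window size, quantization level, and the chosen feature set are illustrative assumptions and do not reproduce the authors' exact configuration or the modified decoding classifier.

import numpy as np
from skimage.feature import graycomatrix, graycoprops

def haralick_features(panchromatic, window=32, step=32, levels=32):
    """Compute GLCM (Haralick-type) texture features over sliding windows
    of a panchromatic band. Window size, gray-level quantization and the
    feature set are illustrative assumptions, not the paper's settings."""
    # Quantize the band to a small number of gray levels to keep GLCMs compact.
    img = np.asarray(panchromatic, dtype=np.float64)
    edges = np.linspace(img.min(), img.max(), levels + 1)[1:-1]
    img = np.digitize(img, edges).astype(np.uint8)  # values in 0..levels-1

    props = ("contrast", "homogeneity", "energy", "correlation")
    rows = []
    for i in range(0, img.shape[0] - window + 1, step):
        for j in range(0, img.shape[1] - window + 1, step):
            patch = img[i:i + window, j:j + window]
            # Co-occurrence matrices at distance 1 pixel for 4 directions.
            glcm = graycomatrix(patch, distances=[1],
                                angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                                levels=levels, symmetric=True, normed=True)
            # Average each property over directions for a rotation-tolerant feature.
            rows.append([graycoprops(glcm, p).mean() for p in props])
    return np.array(rows)  # one feature vector per window

# Example: a random 8-bit tile as a stand-in for a WorldView-2 panchromatic image.
features = haralick_features(np.random.randint(0, 256, (256, 256), dtype=np.uint8))
print(features.shape)  # (n_windows, 4)

In a pipeline of the kind described in the summary, such per-window texture vectors would be concatenated with the multispectral reflectance features before being passed to the ensemble classifier.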
