Integrating Convolutional Neural Network and Multiresolution Segmentation for Land Cover and Land Use Mapping Using Satellite Imagery

Atik S. O., İpbüker C.

APPLIED SCIENCES-BASEL, vol.11, no.12, 2021 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 11 Issue: 12
  • Publication Date: 2021
  • Doi Number: 10.3390/app11125551
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Aerospace Database, Agricultural & Environmental Science Database, Applied Science & Technology Source, Communication Abstracts, INSPEC, Metadex, Directory of Open Access Journals, Civil Engineering Abstracts
  • Istanbul Technical University Affiliated: Yes


Depletion of natural resources, population growth, urban migration, and expanding drought conditions are among the reasons why environmental monitoring programs must be maintained and regularly updated. Moreover, applying artificial intelligence to Earth observation (EO) and regional land-monitoring missions remains a challenging task. In this study, land cover and land use mapping was performed using the proposed CNN-MRS model. The CNN-MRS model consists of two main steps: CNN-based land cover classification, followed by refinement of the classification with a spatial filter and multiresolution segmentation (MRS). In the first experiment, different numbers of Sentinel-2A bands and multiple patch sizes (32 × 32, 64 × 64, and 128 × 128 pixels) were used. The algorithms were evaluated in terms of overall accuracy, precision, recall, F1-score, and kappa coefficient. The proposed approach achieved the highest overall accuracy: 97.31% for the Istanbul test site and 98.44% for the Kocaeli test site. These accuracies demonstrate the effectiveness of the CNN-MRS model for land cover map production over large areas. The McNemar test was used to assess the statistical significance of differences between the models. In the second experiment, on the Zurich Summer dataset, the proposed approach achieved an overall accuracy of 92.03%. The results are compared quantitatively with those of state-of-the-art CNN models and related works.
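The abstract mentions that the McNemar test was used to compare the classifiers. A minimal sketch of that test is shown below, using the standard chi-squared statistic with continuity correction on the two disagreement counts; the counts `b` and `c` here are hypothetical, not values from the paper:

```python
def mcnemar_statistic(b: int, c: int) -> float:
    """McNemar chi-squared statistic with continuity correction.

    b: samples misclassified by model A but correctly classified by model B
    c: samples misclassified by model B but correctly classified by model A
    """
    if b + c == 0:
        return 0.0
    return (abs(b - c) - 1) ** 2 / (b + c)


# Critical value of the chi-squared distribution, 1 degree of freedom, alpha = 0.05
CHI2_CRIT_005 = 3.841

# Hypothetical disagreement counts between two classifiers on a test set
stat = mcnemar_statistic(b=10, c=2)
significant = stat > CHI2_CRIT_005
print(f"statistic = {stat:.3f}, significant = {significant}")
```

If the statistic exceeds the critical value, the two models' error patterns differ significantly at the 5% level; `statsmodels.stats.contingency_tables.mcnemar` offers an off-the-shelf alternative with exact p-values.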