Channel-spatial attention-based pan-sharpening of very high-resolution satellite images

Wang P., Sertel E.

KNOWLEDGE-BASED SYSTEMS, vol.229, 2021 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 229
  • Publication Date: 2021
  • Doi Number: 10.1016/j.knosys.2021.107324
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, Applied Science & Technology Source, Computer & Applied Sciences, INSPEC, Library and Information Science Abstracts, Library, Information Science & Technology Abstracts (LISTA)
  • Keywords: Channel attention, Spatial attention, Pan-sharpening, Remote sensing, Residual networks, Fusion
  • Istanbul Technical University Affiliated: Yes


The pan-sharpening process aims to generate a synthetic output image that preserves the spatial detail of the panchromatic input and the spectral detail of the multi-spectral input. Recently, deep learning-based methods have shown substantial success in the remote sensing field, mostly through the application of traditional Convolutional Neural Networks (CNNs). Most traditional CNN-based approaches treat all channels equally and cannot learn the correlations among them. The attention mechanism, which can learn these inter-channel correlations, has proven effective in super-resolution and object detection tasks. In this research, we introduce a novel deep learning framework, a channel-spatial attention-based method for pan-sharpening (CSAPAN), built around a densely connected residual attention module (RAM). In addition, we train our model in the high-frequency domain and up-sample the low-resolution multispectral images with the pixel shuffle method before stacking them with the panchromatic images for further feature extraction. We evaluated the proposed CSAPAN against traditional and CNN-based methods at both reduced and full resolution, obtaining satisfactory quantitative and qualitative results on Pleiades, WorldView-2, and QuickBird-2 satellite image datasets. (C) 2021 Elsevier B.V. All rights reserved.
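The two operations named in the abstract can be sketched briefly. The following is a minimal NumPy illustration, not the authors' CSAPAN implementation: `pixel_shuffle` shows the sub-pixel rearrangement used to up-sample the low-resolution multispectral features, and the two attention functions show the general idea of channel and spatial reweighting. In the paper these weights would come from learned layers; here a plain sigmoid of pooled activations stands in as a placeholder.

```python
import numpy as np

def pixel_shuffle(x, r):
    """Rearrange a (C*r*r, H, W) array into (C, H*r, W*r) (sub-pixel upsampling)."""
    c2, h, w = x.shape
    c = c2 // (r * r)
    x = x.reshape(c, r, r, h, w)
    x = x.transpose(0, 3, 1, 4, 2)               # -> (C, H, r, W, r)
    return x.reshape(c, h * r, w * r)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def channel_attention(x):
    """Scale each channel by a sigmoid of its global-average-pooled response.
    (A learned MLP would normally map the pooled vector to these weights.)"""
    weights = sigmoid(x.mean(axis=(1, 2)))       # (C,)
    return x * weights[:, None, None]

def spatial_attention(x):
    """Scale each pixel by a sigmoid of the channel-averaged response.
    (A learned convolution would normally produce this attention map.)"""
    attn = sigmoid(x.mean(axis=0, keepdims=True))  # (1, H, W)
    return x * attn

# Up-sample 4 feature channels by a factor of 2, then apply both attentions.
lr_feats = np.arange(16, dtype=np.float64).reshape(4, 2, 2)
up = pixel_shuffle(lr_feats, 2)                  # shape (1, 4, 4)
out = spatial_attention(channel_attention(up))
print(up.shape, out.shape)
```

The `pixel_shuffle` layout here follows the standard sub-pixel convolution convention (as in PyTorch's `nn.PixelShuffle`): output pixel `(h*r+i, w*r+j)` of channel `c` comes from input channel `c*r*r + i*r + j` at position `(h, w)`.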