FAUNet: Frequency Attention U-Net for Parcel Boundary Delineation in Satellite Images

Awad B., Erer I.

Remote Sensing, vol. 15, no. 21, 2023 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 15 Issue: 21
  • Publication Date: 2023
  • Doi Number: 10.3390/rs15215123
  • Journal Name: Remote Sensing
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Academic Search Premier, CAB Abstracts, Compendex, INSPEC, Veterinary Science Database, Directory of Open Access Journals
  • Keywords: attention gates, boundary delineation, edge detection, frequency attention, high-pass filtering, U-Net
  • Istanbul Technical University Affiliated: Yes


Parcel detection and boundary delineation play an important role in numerous remote sensing applications, such as yield estimation, crop type classification, and farmland management systems. Consequently, achieving accurate boundary delineation remains a prominent research area within the remote sensing literature. In this study, we propose a straightforward yet highly effective method for boundary delineation that leverages frequency attention to enhance the precision of boundary detection. Our approach, named Frequency Attention U-Net (FAUNet), builds upon the foundational and successful U-Net architecture by incorporating a frequency-based attention gate to enhance edge detection performance. Unlike many similar boundary delineation methods that employ three segmentation masks, our network employs only two, resulting in a more streamlined post-processing workflow. The essence of frequency attention lies in the integration of a frequency gate utilizing a high-pass filter. The high-pass filter output accentuates the critical high-frequency components within feature maps, thereby significantly improving edge detection performance. Comparative evaluation of FAUNet against alternative models demonstrates its superiority across various pixel-based and object-based metrics. Notably, FAUNet achieves a pixel-based precision, F1 score, and IoU of 0.9047, 0.8692, and 0.7739, respectively. In terms of object-based metrics, FAUNet demonstrates minimal over-segmentation (OS) and under-segmentation (US) errors, with values of 0.0341 and 0.1390, respectively.
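The core idea of the frequency gate described above can be illustrated with a minimal sketch: a high-pass filter (here a 3x3 Laplacian kernel, chosen as an illustrative assumption, since the abstract does not specify the exact filter) is applied to a feature map, and its response, squashed through a sigmoid, re-weights the map toward its high-frequency (edge) content. The function names and the single-channel NumPy setting are hypothetical simplifications of the multi-channel attention gate in the paper.

```python
import numpy as np

def high_pass(feat):
    """High-pass filter a 2-D feature map with a Laplacian kernel.

    The Laplacian kernel is an assumed stand-in for the paper's
    high-pass filter; any high-pass kernel would serve the sketch.
    """
    k = np.array([[0, -1, 0],
                  [-1, 4, -1],
                  [0, -1, 0]], dtype=float)
    h, w = feat.shape
    padded = np.pad(feat, 1, mode="edge")  # replicate borders
    out = np.zeros_like(feat, dtype=float)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * k)
    return out

def frequency_attention_gate(feat):
    """Re-weight a feature map by its high-frequency response.

    The sigmoid of the high-pass output acts as the attention gate:
    flat regions get a neutral weight (0.5), while pixels near edges,
    where the high-pass response is large, are emphasized.
    """
    gate = 1.0 / (1.0 + np.exp(-high_pass(feat)))
    return feat * gate
```

On a feature map containing a step edge, the gated output boosts the pixels adjacent to the edge relative to the flat interior, which is the behavior the abstract attributes to the frequency gate.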