Parameter-Efficient Fine-Tuning With Frequency Adapter for Enhanced Sea–Land Segmentation

Dongliang Ma, Likai Zhu, Fang Zhao, Yichen Xie, Ye Li, Min Liu

Research output: Contribution to journal › Article › peer-review

Abstract

Accurate sea–land segmentation (SLS) from satellite imagery is essential for monitoring coastline changes, which is of great significance for coastal regions. Recent advances in deep learning (DL), particularly convolutional neural networks (CNNs) and vision transformers (ViTs), have shown promising performance. However, these models struggle with the diverse characteristics of global coastlines and require extensive labeled data, which is often scarce. To overcome these challenges, we propose a novel approach that leverages foundation models for SLS. Our method introduces a parameter-efficient fine-tuning (PEFT) module, called Freq-Adapter, which integrates frequency representations into pretrained foundation models with minimal additional parameters. In addition, we introduce parameter-efficient continual pretraining (PECP) with self-supervised learning (SSL) on unlabeled coastal remote sensing data, allowing the model to adapt to coastal image characteristics while preserving pretrained knowledge. Furthermore, we design a lightweight detail head (LDHead) that enhances image details and edges, improving the detection of irregular sea–land boundaries. Extensive experiments on large-scale SLS datasets demonstrate the superior effectiveness and robust generalization of our method, highlighting its potential for accurate and efficient coastal monitoring.
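
The abstract does not detail Freq-Adapter's internals. The following is a minimal PyTorch sketch of one plausible reading, not the authors' implementation: a standard bottleneck adapter whose hidden features are reweighted in the Fourier domain by a learnable gate, added residually to a frozen backbone's token stream. All names (FreqAdapter, freq_gate, bottleneck) are hypothetical, and the FFT-along-tokens design is an assumption made for illustration.

```python
import torch
import torch.nn as nn


class FreqAdapter(nn.Module):
    """Illustrative bottleneck adapter with a frequency-domain gate.

    Assumption: "frequency representation" is realized by an FFT over the
    token axis with a learnable per-channel gate; the paper may differ.
    """

    def __init__(self, dim: int, bottleneck: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)  # low-rank down-projection
        self.act = nn.GELU()
        self.up = nn.Linear(bottleneck, dim)    # projection back to model dim
        # Learnable gate over Fourier components, identity (all ones) at init.
        self.freq_gate = nn.Parameter(torch.ones(bottleneck))
        # Zero-init the up-projection so the adapter starts as an identity map.
        nn.init.zeros_(self.up.weight)
        nn.init.zeros_(self.up.bias)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim), e.g., patch tokens from a ViT block.
        h = self.act(self.down(x))
        f = torch.fft.rfft(h, dim=1)            # to frequency domain (tokens)
        f = f * self.freq_gate                  # reweight frequency components
        h = torch.fft.irfft(f, n=h.shape[1], dim=1)  # back to token domain
        return x + self.up(h)                   # residual update


if __name__ == "__main__":
    adapter = FreqAdapter(dim=768)
    tokens = torch.randn(2, 196, 768)           # e.g., ViT-B/16 patch tokens
    out = adapter(tokens)
    assert out.shape == tokens.shape
    n_trainable = sum(p.numel() for p in adapter.parameters())
    print(f"adapter parameters: {n_trainable}")  # tiny vs. ~86M in ViT-B
```

In PEFT fashion, the pretrained backbone would be frozen (requires_grad = False) and only adapter parameters trained. Because the up-projection is zero-initialized, the adapter is an exact identity at the start of fine-tuning, so the frozen backbone's behavior is preserved until a useful frequency reweighting is learned.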

Original language: English
Journal: IEEE Transactions on Geoscience and Remote Sensing
Volume: 63
State: Published - 2025

Keywords

  • Continual pretraining
  • deep learning (DL)
  • foundation models
  • frequency representation
  • satellite imagery
  • sea–land segmentation (SLS)
