Low-frequency constrained seismic impedance inversion combining large kernel attention and long short-term memory

Zong Wei, Shu Li, Juan Ning, Xiao Chen, Xi Yang

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

In seismic impedance inversion, low-frequency information reflects the overall trend of the impedance curve; without it, inversion results cannot accurately capture stratigraphic changes. Seismic data are also spatially correlated, yet conventional inversion methods ignore the spatial correlation of geological structures, which can lead to poor lateral continuity in the inversion results. To alleviate these problems, we propose a low-frequency constrained seismic impedance inversion method that combines large kernel attention (LKA) and long short-term memory (LSTM). The network is divided into an inversion module and a low-frequency feature extraction module. In the inversion module, we integrate LKA and LSTM, which improves the lateral continuity of the inversion results. The low-frequency feature extraction module constrains the entire network and extracts more refined low-frequency features. To demonstrate the reliability of the proposed method, we applied it to the SEAM model. Experiments show that our method achieves the best lateral continuity and accuracy, with a mean squared error of 0.0485 and a coefficient of determination (R²) of 0.9164, as well as strong noise immunity. The method also achieves favorable inversion results on the Volve field seismic data.
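The abstract only names the building blocks, so the following is a minimal sketch, not the authors' implementation, of how an LKA block and an LSTM could be combined in a trace-wise impedance inversion network. The module names (`LKA1d`, `InversionNet`), layer sizes, and kernel/dilation choices are illustrative assumptions; the low-frequency feature extraction module described in the paper is omitted here.

```python
# Hypothetical sketch of an LKA + LSTM inversion module (not the paper's code).
import torch
import torch.nn as nn

class LKA1d(nn.Module):
    """Large kernel attention over a 1-D seismic trace (channels, samples)."""
    def __init__(self, channels: int):
        super().__init__()
        # Large receptive field built from depthwise, dilated depthwise,
        # and pointwise convolutions, following the usual LKA decomposition.
        self.dw = nn.Conv1d(channels, channels, 5, padding=2, groups=channels)
        self.dw_dilated = nn.Conv1d(channels, channels, 7, padding=9,
                                    dilation=3, groups=channels)
        self.pw = nn.Conv1d(channels, channels, 1)

    def forward(self, x):
        attn = self.pw(self.dw_dilated(self.dw(x)))
        return x * attn  # attention map modulates the input features

class InversionNet(nn.Module):
    """Assumed inversion module: conv encoder -> LKA -> LSTM -> regressor."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.encode = nn.Conv1d(1, hidden, kernel_size=3, padding=1)
        self.lka = LKA1d(hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, trace):                      # trace: (batch, 1, samples)
        feat = torch.relu(self.encode(trace))
        feat = self.lka(feat)                      # long-range attention along the trace
        seq, _ = self.lstm(feat.transpose(1, 2))   # sequential dependence along depth/time
        return self.head(seq).transpose(1, 2)      # predicted impedance per sample

# Example usage with a random batch of eight 256-sample traces:
# impedance = InversionNet()(torch.randn(8, 1, 256))   # shape (8, 1, 256)
```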

Original language: English
Pages (from-to): 4045-4062
Number of pages: 18
Journal: Acta Geophysica
Volume: 72
Issue number: 6
DOIs
State: Published - December 2024
Externally published: Yes

Keywords

  • Large kernel attention (LKA)
  • Lateral continuity
  • Long short-term memory (LSTM)
  • Low-frequency information
  • Seismic impedance inversion

