
Seismic data interpolation through convolutional autoencoder

Mandelli, Sara; Borra, Federico; Lipari, Vincenzo; Bestagini, Paolo; Sarti, Augusto; Tubaro, Stefano
2019-01-01

Abstract

A common issue in seismic data analysis is the lack of regularly and densely sampled seismic traces. This problem is commonly tackled by rank optimization or statistical feature learning algorithms, which allow interpolation and denoising of corrupted data. In this paper, we propose a completely novel approach for reconstructing missing traces of pre-stack seismic data, taking inspiration from the latest developments in computer vision and image processing. More specifically, we exploit a specific kind of convolutional neural network known as a convolutional autoencoder. We illustrate the advantages of deep learning strategies with respect to the state of the art by comparing the results achieved on a well-known seismic dataset.
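
The abstract describes reconstructing missing pre-stack traces with a convolutional autoencoder. As a rough illustration of that idea only (the layer counts, filter sizes, patch shape, and placeholder data below are assumptions, not the architecture or training setup reported in the paper), the following PyTorch sketch trains a small convolutional autoencoder to map patches with corrupted traces back to complete patches.

# Minimal sketch of a convolutional autoencoder for interpolating missing
# seismic traces. All hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn

class ConvAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: compress a 2D patch (time samples x traces) into a latent map.
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
        )
        # Decoder: upsample the latent map back to the original patch size.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=3, stride=2, padding=1, output_padding=1),
            nn.ReLU(),
            nn.ConvTranspose2d(16, 1, kernel_size=3, stride=2, padding=1, output_padding=1),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Training loop sketch: the network sees patches with traces zeroed out
# (corrupted input) and learns to reproduce the corresponding complete patches.
model = ConvAutoencoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

corrupted = torch.randn(8, 1, 64, 64)  # placeholder batch of decimated patches
complete = torch.randn(8, 1, 64, 64)   # placeholder batch of full patches
for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(corrupted), complete)
    loss.backward()
    optimizer.step()
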
2019
2018 SEG International Exposition and Annual Meeting, SEG 2018
Geophysics
Files in this record:
File: Seismic_data_interpolation_through_convolutional_autoencoder.pdf
Access: restricted
Description: Post-Print (DRAFT or Author's Accepted Manuscript, AAM)
Size: 3.8 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11311/1086339
Citations
  • Scopus: 47