Abstract
In conventional Full Waveform Inversion (FWI), the gradient is typically used without further optimization, and extra wavefield extrapolations are required to compute a step length for each model update, which degrades both convergence efficiency and inversion accuracy. This paper introduces Adam, an optimization algorithm from deep learning, into FWI. At very little additional computational cost, Adam optimizes the gradient and computes the model update directly from it, eliminating the step-length calculation and markedly improving convergence efficiency and inversion accuracy. To address the selection and tuning of Adam's default hyperparameters, we systematically analyze the FWI performance of different parameter settings through numerical experiments and provide optimized parameters better suited to FWI. Model tests demonstrate that FWI based on Adam with the optimized parameters converges faster and achieves higher inversion accuracy than either FWI based on Adam with the default parameters or FWI based on the limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) algorithm.
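The Adam update the abstract refers to can be sketched as follows. This is a minimal NumPy sketch using the standard Adam defaults (α = 0.001, β₁ = 0.9, β₂ = 0.999, ε = 1e-8); the FWI-tuned parameter values the paper proposes are given in the full text and are not assumed here, and the function name `adam_update` is illustrative, not the authors' code. The key property is that the model update is computed directly from the gradient, so no extra wavefield extrapolations are needed for a step-length search.

```python
import numpy as np

def adam_update(grad, state, alpha=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step on a (velocity-)model gradient.

    grad  : gradient array (e.g., adjoint-state FWI gradient)
    state : tuple (m, v, t) of first moment, second moment, step count
    Returns (update, new_state); the update is added to the model.
    """
    m, v, t = state
    t += 1
    m = beta1 * m + (1.0 - beta1) * grad            # 1st moment (mean) estimate
    v = beta2 * v + (1.0 - beta2) * grad ** 2       # 2nd moment (uncentered variance)
    m_hat = m / (1.0 - beta1 ** t)                  # bias corrections for zero init
    v_hat = v / (1.0 - beta2 ** t)
    update = -alpha * m_hat / (np.sqrt(v_hat) + eps)
    return update, (m, v, t)

# Demo on a toy constant gradient: after bias correction, the first-step
# magnitude is ~alpha regardless of the gradient's absolute scale, which is
# why no separate step-length estimation is required.
grad = np.ones(3)
update, state = adam_update(grad, (np.zeros(3), np.zeros(3), 0))
print(update)  # ~ [-0.001, -0.001, -0.001]
```

In a full FWI loop, this update would simply be added to the current model each iteration, replacing the line search that conventional gradient-descent FWI performs with additional forward modelings.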
Authors
WANG QianQian
SONG Peng
HUA QingFeng
LIU BaoHua
LI GuanBao
WANG ShaoWen
DU GuoNing
College of Marine Geo-sciences, Ocean University of China, Qingdao 266100, China; Laboratory for Marine Mineral Resource, Laoshan Laboratory, Qingdao 266100, China; Key Laboratory of Submarine Geosciences and Prospecting Techniques, Ministry of Education, Qingdao 266100, China; First Institute of Oceanography, Ministry of Natural Resources, Qingdao 266061, China
Source
Chinese Journal of Geophysics (《地球物理学报》)
Indexed in: SCIE, EI, CAS, CSCD, Peking University Core Journals
2023, Issue 11, pp. 4654-4663 (10 pages)
Funding
National Natural Science Foundation of China (42074138)
Science and Technology Innovation Project of Laoshan Laboratory (2021WHZZB0703)
Open Fund of the No. 1 Institute of Geology and Mineral Resources of Shandong Province (2022DY03)
Major Science and Technology Innovation Project of Shandong Province (2019JZZY010803)
Scientific Research and Technology Development Project of China National Petroleum Corporation (2021ZG02); jointly funded.
Keywords
Full Waveform Inversion (FWI)
Gradient optimization
Adam
Parameter optimization