Chinese Quarterly Journal of Mathematics ›› 2014, Vol. 29 ›› Issue (1): 142-150. DOI: 10.13371/j.cnki.chin.q.j.m.2014.01.017


A Modified LS Conjugate Gradient Method and Its Convergence

  

  1. School of Mathematics and Statistics, Chongqing Three Gorges University
  • Received: 2013-08-07  Online: 2014-03-30  Published: 2023-02-14
  • About author: LIU Jin-kui (1982-), male, native of Anyang, Henan, is a lecturer at Chongqing Three Gorges University with a Master's degree; his research interests are optimization theory and its applications.
  • Supported by:
    The Youth Project Foundation of Chongqing Three Gorges University (13QN17) and the Fund of Scientific Research in Southeast University (the Support Project of Fundamental Research).

A Descent Gradient Method and Its Global Convergence


Abstract: Y. Liu and C. Storey (1992) proposed the well-known LS conjugate gradient method, which gives good numerical results. However, the LS method has only very weak convergence properties under Wolfe-type line searches. In this paper, we present a new descent gradient method based on the LS method; it guarantees the sufficient descent property at each iteration and global convergence under the strong Wolfe line search. Finally, we report preliminary numerical experiments showing the efficiency of the proposed method in comparison with the well-known PRP+ method.

Key words: unconstrained optimization, conjugate gradient method, strong Wolfe line search, sufficient descent property, global convergence

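The abstract does not spell out the paper's specific modification of the LS direction. As a rough illustration of the ingredients it builds on, the sketch below implements the classical Liu-Storey coefficient β_k = g_k^T(g_k − g_{k−1}) / (−d_{k−1}^T g_{k−1}) with a strong Wolfe line search; the bisection-style line search, the nonnegativity restart (in the spirit of PRP+), and the descent safeguard are illustrative simplifications, not the authors' scheme:

```python
import numpy as np

def ls_beta(g_new, g_old, d_old):
    """Classical Liu-Storey coefficient:
    beta = g_k^T (g_k - g_{k-1}) / (-d_{k-1}^T g_{k-1})."""
    y = g_new - g_old
    return float(g_new @ y) / float(-(d_old @ g_old))

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=60):
    """Bisection-style search for a step satisfying the strong Wolfe
    conditions: f(x+a*d) <= f(x) + c1*a*g^T d and |g(x+a*d)^T d| <= c2*|g^T d|."""
    f0, slope0 = f(x), float(grad(x) @ d)
    lo, hi, alpha = 0.0, np.inf, 1.0
    for _ in range(max_iter):
        xa = x + alpha * d
        if f(xa) > f0 + c1 * alpha * slope0:
            hi = alpha                      # step too long: Armijo condition fails
        else:
            ga = float(grad(xa) @ d)
            if ga > c2 * abs(slope0):
                hi = alpha                  # overshot the one-dimensional minimizer
            elif ga < -c2 * abs(slope0):
                lo = alpha                  # step too short: curvature not yet met
            else:
                return alpha                # strong Wolfe conditions satisfied
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * alpha
    return alpha

def cg_ls(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear conjugate gradient with the LS coefficient, a nonnegativity
    restart, and a steepest-descent safeguard to keep d a descent direction."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = strong_wolfe(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        beta = max(ls_beta(g_new, g, d), 0.0)   # truncate negative beta
        d = -g_new + beta * d
        if float(g_new @ d) >= 0.0:             # not a descent direction: restart
            d = -g_new
        x, g = x_new, g_new
    return x
```

For example, on the convex quadratic f(x) = (x₁² + 10·x₂²)/2 with gradient (x₁, 10·x₂), `cg_ls` drives the gradient norm below the tolerance in well under the iteration limit.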
