数学季刊 (Chinese Quarterly Journal of Mathematics), 2004, Vol. 19, Issue (2): 198-204.

Global Convergence of a New Conjugate Gradient Method with a Modification of the Curry-Altman Step-size Rule

  1. Department of Mathematics, Shenzhen University, Shenzhen 518060, China; Department of Applied Mathematics, University of Petroleum, Dongying 257062, China
  • Received: 2003-03-12  Online: 2004-06-30  Published: 2024-03-20
  • About the author: CAO Li-hua (1964-), female, native of Yantai, Shandong; M.S.D.; lecturer at Shenzhen University, engaged in research on asymptotics.

Abstract: Conjugate gradient optimization algorithms are determined by their search directions, which differ in the choice of a parameter appearing in the direction formula. In this note, conditions on this parameter are given that ensure the descent property of the search directions, and the global convergence of the resulting class of methods is discussed. It is shown, by means of a reverse modulus of continuity function and a forcing function, that the new method for unconstrained optimization converges globally for a continuously differentiable objective function with a bounded level set when a modification of the Curry-Altman step-size rule is used. By combining the PR method with the new method, the PR method is modified so as to possess the global convergence property. Numerical experiments show that the new methods are efficient in comparison with the FR conjugate gradient method.
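
For orientation, the generic conjugate gradient search direction that the abstract refers to, together with the usual FR and PR parameter choices, can be written as follows; this is the standard textbook formulation, not the paper's specific modified scheme:

\[
d_0 = -g_0, \qquad d_k = -g_k + \beta_k d_{k-1} \ (k \ge 1), \qquad x_{k+1} = x_k + \alpha_k d_k,
\]
\[
\beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad
\beta_k^{PR} = \frac{g_k^{\top}(g_k - g_{k-1})}{\|g_{k-1}\|^2},
\]

where $g_k = \nabla f(x_k)$ and the step size $\alpha_k$ is fixed by a line-search rule (in this paper, a modification of the Curry-Altman rule).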

Key words: nonlinear programming, forcing function, reverse modulus of continuity function, convergence
