Chinese Quarterly Journal of Mathematics ›› 2003, Vol. 18 ›› Issue (2): 154-162.


Global Convergence of a New Restarting Conjugate Gradient Method for Nonlinear Optimizations

  

  1. Department of Applied Mathematics, Dalian University of Technology, Dalian 116024, China; Department of Applied Mathematics, University of Petroleum, Dongying 257061, China
  • Received:2001-10-30 Online:2003-06-30 Published:2024-04-16
  • About author: SUN Qing-ying (1966-), male, native of Qingzhou, Shandong, an associate professor at the University of Petroleum, M.S.D., engaged in research on nonlinear programming.

Abstract: Conjugate gradient optimization algorithms are determined by their search directions, which differ in the choice of the parameter in the search direction. In this note, by combining the good numerical performance of the PR and HS methods with the global convergence property of the class of conjugate gradient methods presented by HU and STOREY (1991), a class of new restarting conjugate gradient methods is presented. Global convergence of the new method under two kinds of common line searches is proved. First, using the reverse modulus of continuity function and a forcing function, it is shown that the new method for unconstrained optimization works for a continuously differentiable function with Curry-Altman's step size rule and a bounded level set. Second, by a comparison technique, some general convergence properties of the new method with another kind of step size rule are established. Numerical experiments show that the new method is efficient in comparison with the FR conjugate gradient method.
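The abstract does not reproduce the new parameter family itself, but the scheme it builds on is the standard nonlinear conjugate gradient iteration d_{k+1} = -g_{k+1} + beta_k d_k, with the FR, PR and HS methods differing only in the choice of beta_k. The sketch below is a minimal, generic illustration of a restarting conjugate gradient loop of this kind, using the Polak-Ribiere parameter and a plain backtracking Armijo line search; the function names, the restart rule, and the line search are illustrative assumptions and are not the HU-STOREY hybrid parameter or the Curry-Altman step size rule analysed in the paper.

```python
import numpy as np


def restarting_cg(f, grad, x0, tol=1e-6, max_iter=5000, restart_every=None):
    """Generic restarting nonlinear conjugate gradient loop (illustrative sketch).

    Uses the Polak-Ribiere (PR) parameter and restarts with the steepest-descent
    direction whenever the parameter is non-positive or a fixed restart interval
    elapses.  The backtracking Armijo line search is an assumption for this
    sketch; it is not the Curry-Altman step size rule analysed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    if restart_every is None:
        restart_every = x.size  # classical choice: restart every n iterations

    g = grad(x)
    d = -g
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break

        # Fall back to steepest descent if d fails to be a descent direction
        # (possible with the plain PR parameter).
        if g @ d >= 0:
            d = -g

        # Backtracking (Armijo) line search along d.
        alpha, rho, c = 1.0, 0.5, 1e-4
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * slope and alpha > 1e-12:
            alpha *= rho

        x_new = x + alpha * d
        g_new = grad(x_new)

        # Polak-Ribiere parameter for the next search direction.
        beta = g_new @ (g_new - g) / (g @ g)

        # Restart: reset to the steepest-descent direction when beta is not
        # positive or the restart interval has elapsed.
        if beta <= 0 or (k + 1) % restart_every == 0:
            d = -g_new
        else:
            d = -g_new + beta * d

        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    # Minimise the Rosenbrock function; the iterates should approach (1, 1).
    f = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
    grad = lambda x: np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])
    print(restarting_cg(f, grad, np.array([-1.2, 1.0])))
```

The restart step is the design point at issue: resetting to the steepest-descent direction periodically (or when the conjugacy parameter degenerates) is what the global convergence arguments for restarting methods typically rely on, while the PR/HS-style parameter is retained between restarts for its good numerical performance.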

Key words: nonlinear programming; restarting conjugate gradient method; forcing function; reverse modulus of continuity function; convergence
