In this paper, conjugate gradient methods with a nonmonotone line search technique are introduced. A nonmonotone hybrid conjugate gradient method for unconstrained optimization (Wenyu Li and Yueting Yang, Journal of Inequalities and Applications) is proposed, in which a nonmonotone Wolfe line search is used. The method incorporates the modified BFGS secant equation in an effort to include second-order information about the objective function. We also provide and analyze a new scaled conjugate gradient method and its performance, based on the modified secant equation of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method and on a new modified nonmonotone line search technique. Dai's nonmonotone conjugate gradient method is generalized to the Riemannian case, and global convergence of the new algorithm is established under standard assumptions. Related lines of work include nonmonotone spectral conjugate gradient methods, classes of nonmonotone conjugate gradient methods, the subspace minimization conjugate gradient method (Yuhong Dai, AMSS, Chinese Academy of Sciences), Krylov subspaces and conjugate gradients (Gilbert Strang, 2006), a nonmonotone PRP conjugate gradient method for solving square and underdetermined systems of equations, and comparative studies of nonlinear conjugate gradient methods.
We introduce two novel vector transports associated with the retraction constructed by the Cayley transform. The spectral gradient method has been successfully extended to solving square nonlinear systems of equations using Grippo's nonmonotone line search technique. Nonmonotone conjugate gradient methods for optimization: in this paper, we introduce a class of nonmonotone conjugate gradient methods, which includes the well-known Polak-Ribière method and Hestenes-Stiefel method as special cases. This makes them easy to implement, and they do not require much storage.
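As a concrete illustration of the nonmonotone acceptance rule underlying Grippo's technique, the sketch below compares a trial point against the maximum of the most recent function values instead of the current value alone. It is a minimal Python sketch; the name nonmonotone_armijo and the parameters gamma, tau, and f_hist are illustrative choices, not taken from any of the papers summarized here.

import numpy as np

def nonmonotone_armijo(f, x, d, g, f_hist, gamma=1e-4, tau=0.5, alpha0=1.0, max_backtracks=30):
    # Backtracking search with a Grippo-Lampariello-Lucidi style acceptance rule:
    # the trial point is compared against the maximum of the recently stored
    # function values rather than against f(x) alone, so occasional increases
    # of the objective along the iteration are allowed.
    f_ref = max(f_hist)              # reference value over the recent history
    slope = gamma * np.dot(g, d)     # g = grad f(x); d should be a descent direction
    alpha = alpha0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + alpha * slope:
            return alpha
        alpha *= tau                 # shrink the step and try again
    return alpha

In practice f_hist would hold the last M objective values (for example a deque of length 10); with M = 1 the rule collapses to the ordinary monotone Armijo condition.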
The result is the conjugate gradient method on the normal equations (CGNR). A Riemannian conjugate gradient method for optimization on the Stiefel manifold is discussed below. A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. Combining nonmonotone and monotone line searches, spectral conjugate gradient methods are used in this paper. A nonmonotone line search technique for Newton's method has also been proposed. An introduction to the conjugate gradient method without the agonizing pain is given by Shewchuk. Preliminary numerical results show that this method is very efficient. This paper is devoted to a Riemannian conjugate gradient method for solving problem (1). The method of steepest descent constructs its steps so that successive residual vectors are orthogonal. We consider the case where n is very large, say n = 10^6 or n = 10^7.
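To make the comparison with steepest descent concrete, here is a minimal Python sketch of the plain conjugate gradient iteration for a symmetric positive definite system; matvec is assumed to be a user-supplied routine returning the product A @ v, so the matrix never has to be stored densely when n is large.

import numpy as np

def conjugate_gradient(matvec, b, x0=None, tol=1e-10, max_iter=1000):
    # Plain CG for A x = b with A symmetric positive definite.
    x = np.zeros_like(b, dtype=float) if x0 is None else np.asarray(x0, dtype=float).copy()
    r = b - matvec(x)                    # initial residual
    d = r.copy()                         # first search direction
    rs_old = np.dot(r, r)
    for _ in range(max_iter):
        Ad = matvec(d)
        alpha = rs_old / np.dot(d, Ad)   # exact minimizer along d for a quadratic
        x += alpha * d
        r -= alpha * Ad
        rs_new = np.dot(r, r)
        if np.sqrt(rs_new) < tol:
            break
        d = r + (rs_new / rs_old) * d    # new direction, conjugate to the previous ones
        rs_old = rs_new
    return x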
The convergence of the conjugate gradient method with nonmonotone line search: the conjugate gradient method is a useful and powerful approach for solving large-scale minimization problems. It is well known that, in Euclidean space, the conjugate gradient method generally outperforms the steepest descent method because of its faster convergence, and that it is more suitable for large-scale problems than second-order methods such as Newton's method and quasi-Newton methods. Numerical results on a variety of low-rank test problems demonstrate the effectiveness of the new method. Although the steepest descent method converges, it is inefficient.
Image restoration is an ill-posed inverse problem, for which regularization methods have been introduced. This new line search technique is based on a relaxation of the strong Wolfe conditions, and it allows larger steps to be accepted. A scaled conjugate gradient method based on a new BFGS secant equation has been proposed. The spectral projected gradient (SPG) method (Birgin, Martínez, and Raydan, 2000, 2001) is closely related. The CG method has simplicity and a very low memory requirement, and the PRP method is one of the most effective conjugate gradient methods. However, for some ill-conditioned problems, orthogonality is quickly lost due to rounding errors, and convergence is much slower than expected. A modified nonmonotone spectral conjugate gradient method has also been studied. By making use of the Moreau-Yosida regularization, a nonmonotone line search technique of [48], and a new secant equation of [43] derived by the authors earlier, we present a modified PRP conjugate gradient algorithm. Nonmonotone spectral projected gradient methods on convex sets are another related topic. Numerical experiments show that the nonmonotone Polak-Ribière method and the Hestenes-Stiefel method in this nonmonotone conjugate gradient class are efficient. In theory, the successive gradients generated by the conjugate gradient method applied to a quadratic should be orthogonal.
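The Polak-Ribière and Hestenes-Stiefel variants mentioned above differ only in how the conjugacy parameter beta is computed. A hedged Python sketch follows; the max(beta, 0) guard is the usual PRP+/HS+ safeguard, added here for illustration rather than taken from any specific paper above.

import numpy as np

def cg_direction(g_new, g_old, d_old, variant="PRP"):
    # Next nonlinear CG direction d = -g_new + beta * d_old.
    y = g_new - g_old                                  # gradient difference
    if variant == "PRP":
        beta = np.dot(g_new, y) / np.dot(g_old, g_old)
    elif variant == "HS":
        beta = np.dot(g_new, y) / np.dot(d_old, y)
    else:
        raise ValueError("unknown variant")
    beta = max(beta, 0.0)                              # restart with steepest descent if beta < 0
    return -g_new + beta * d_old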
The new residual is orthogonal to the whole space, not only to the single residual vector from the previous step. With exact line search, our method reduces to a nonlinear version of the Hestenes-Stiefel conjugate gradient scheme. A nonmonotone conjugate gradient algorithm for unconstrained optimization was given in the Journal of Systems Science and Complexity, 15(2), 2002. The preconditioner P may come from an incomplete LU factorization or from a few steps of a multigrid iteration. When we write P^{-1}, we never intend that an inverse will be explicitly computed.
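To illustrate the point that P^{-1} is only ever applied, never formed, the following Python sketch runs preconditioned CG with an incomplete LU factorization supplied by SciPy; the helper names preconditioned_cg and apply_Pinv are illustrative, and the tridiagonal test matrix is only a stand-in.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def preconditioned_cg(A, b, apply_Pinv, tol=1e-10, max_iter=1000):
    # Preconditioned CG: apply_Pinv(r) returns P^{-1} r via a solve with the
    # preconditioner, so the inverse of P is never computed explicitly.
    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_Pinv(r)
    d = z.copy()
    rz_old = np.dot(r, z)
    for _ in range(max_iter):
        Ad = A @ d
        alpha = rz_old / np.dot(d, Ad)
        x += alpha * d
        r -= alpha * Ad
        if np.linalg.norm(r) < tol:
            break
        z = apply_Pinv(r)
        rz_new = np.dot(r, z)
        d = z + (rz_new / rz_old) * d
        rz_old = rz_new
    return x

n = 200
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)
ilu = spla.spilu(A)                      # incomplete LU factors serve as P
x = preconditioned_cg(A, b, ilu.solve)   # P^{-1} r is computed by triangular solves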
A hybrid conjugate gradient method for optimization problems. In this paper, we propose a nonmonotone line search technique combined with a suitable search direction. In particular, a scaled version of the conjugate gradient method, suggested by Perry [2, 9], which employs the spectral steplength of Barzilai and Borwein [1, 10], was considered. A modified spectral conjugate gradient method with a new nonmonotone line search has also been studied. A new conjugate gradient method with guaranteed descent has been proposed.
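Since the spectral steplength of Barzilai and Borwein comes up repeatedly here, the two classical BB formulas are sketched below in Python; the safeguards alpha_min and alpha_max are common practical choices, not prescriptions from the papers cited above.

import numpy as np

def bb_steplength(s, y, variant=1, alpha_min=1e-10, alpha_max=1e10):
    # Barzilai-Borwein steplengths from s = x_k - x_{k-1} and y = g_k - g_{k-1}:
    # BB1 = s^T s / s^T y and BB2 = s^T y / y^T y.
    sy = np.dot(s, y)
    if sy <= 0:                          # curvature along s is not positive
        return alpha_max
    alpha = np.dot(s, s) / sy if variant == 1 else sy / np.dot(y, y)
    return min(max(alpha, alpha_min), alpha_max)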
The global convergence of the given method will be established under suitable conditions. A nonmonotone PRP conjugate gradient method for solving square and underdetermined systems of equations has been proposed. A new nonlinear spectral conjugate descent method for solving unconstrained optimization problems is proposed on the basis of the CD method and the spectral conjugate gradient method. An Introduction to the Conjugate Gradient Method Without the Agonizing Pain (Jonathan Richard Shewchuk, March 7, 1994, CMU-CS-94-125, Carnegie Mellon University): the conjugate gradient method is the most prominent iterative method for solving sparse systems of linear equations.
A new conjugate gradient method with guaranteed descent and an efficient line search has been proposed. Keywords: nonlinear conjugate gradient methods, unconstrained optimization. A scaled conjugate gradient method based on a new BFGS secant equation and a limited memory conjugate gradient method are further related contributions. Both of the proposed vector transports satisfy the Ring-Wirth nonexpansive condition, which is fundamental for the convergence analysis of Riemannian conjugate gradient methods, and one of them is also isometric. See also Wei, New line search methods for unconstrained optimization, Journal of the Korean Statistical Society, 38 (2009). In this paper we propose a new Riemannian conjugate gradient method for optimization on the Stiefel manifold. Keywords: nonmonotone conjugate gradient method, nonmonotone line search, global convergence, unconstrained optimization. This lemma shows the advantage of the conjugate gradient method over the gradient method. Under mild assumptions, we prove the global convergence and linear convergence rate of the method.
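For reference, the modified BFGS secant equation referred to several times above can be written, in one widely cited form due to Zhang, Deng and Chen (the exact variant used in the papers summarized here may differ, so the following should be read as an assumed illustration):

B_{k+1} s_k = \bar{y}_k, \qquad
\bar{y}_k = y_k + \frac{\theta_k}{s_k^{T} u_k}\, u_k, \qquad
\theta_k = 6\left(f_k - f_{k+1}\right) + 3\left(g_k + g_{k+1}\right)^{T} s_k,

where s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k, and u_k is any vector with s_k^T u_k != 0 (often u_k = s_k). Compared with the standard secant equation B_{k+1} s_k = y_k, the correction term brings function-value information, and hence additional second-order information, into the update.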
Neither the steepest descent method nor the conjugate gradient method requires estimation of the Hessian. This section establishes the multipreconditioned analogue of CG in a fashion similar to the derivation of standard PCG. A modified Polak-Ribière-Polyak conjugate gradient algorithm has been proposed. A limited memory version of the nonlinear conjugate gradient method is developed. In this paper, we combine a new nonmonotone technique with the spectral conjugate gradient method to obtain a more efficient method. Finally, the conjugate gradient method can be applied to an arbitrary n-by-m matrix by applying it to the normal equations A^T A x = A^T b, since A^T A is a symmetric positive-semidefinite matrix for any A.
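That reduction, usually called CGNR, can be implemented without ever forming A^T A by keeping two residuals, one for the original system and one for the normal equations. A minimal Python sketch (the name cgnr and the stopping rule are illustrative):

import numpy as np

def cgnr(A, b, tol=1e-10, max_iter=1000):
    # CG applied to A^T A x = A^T b; the matrix A^T A itself is never formed.
    n = A.shape[1]
    x = np.zeros(n)
    r = b - A @ x                        # residual of the original system
    z = A.T @ r                          # residual of the normal equations
    d = z.copy()
    zz_old = np.dot(z, z)
    for _ in range(max_iter):
        Ad = A @ d
        alpha = zz_old / np.dot(Ad, Ad)  # since d^T (A^T A) d = ||A d||^2
        x += alpha * d
        r -= alpha * Ad
        z = A.T @ r
        zz_new = np.dot(z, z)
        if np.sqrt(zz_new) < tol:
            break
        d = z + (zz_new / zz_old) * d
        zz_old = zz_new
    return x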
On the subspace minimization conjugate gradient method. A nonmonotone line search method for regression analysis. A nonmonotone hybrid conjugate gradient method for unconstrained optimization. An iterative conjugate gradient regularization method for image restoration. Therefore, there was wide room for variations and extensions of the original Barzilai-Borwein (BB) method. This class of nonmonotone conjugate gradient methods is proved to be globally convergent when it is applied to solve unconstrained optimization problems with convex objective functions. A new class of conjugate gradient methods has also been introduced. Both of these methods have a Q-linear rate of convergence. The additional orthogonality reduces the Gram-Schmidt process to a three-term recursion. The conjugate gradient method employs vectors that are A-orthogonal, or conjugate: successive residuals satisfy r_1^T r_0 = 0, and the directions satisfy d_i^T A d_j = 0 for i != j; further details of the derivation are omitted. The spectral gradient and conjugate gradient methods are a class of methods that can suitably cope with large-scale settings.
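As a small numerical illustration of the A-orthogonality just described, the following self-contained Python snippet runs a few CG steps on a random symmetric positive definite system and checks that the generated directions are mutually conjugate; all of the printed inner products should be near zero, up to rounding.

import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((8, 8))
A = M @ M.T + 8 * np.eye(8)            # symmetric positive definite test matrix
b = rng.standard_normal(8)

x = np.zeros(8)
r = b - A @ x
d = r.copy()
directions = []
for _ in range(5):
    directions.append(d.copy())
    Ad = A @ d
    alpha = (r @ r) / (d @ Ad)
    x = x + alpha * d
    r_new = r - alpha * Ad
    beta = (r_new @ r_new) / (r @ r)
    d = r_new + beta * d
    r = r_new

for i in range(len(directions)):
    for j in range(i):
        print(i, j, directions[i] @ A @ directions[j])   # d_i^T A d_j, near zero for i != j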
A Riemannian conjugate gradient method (Optimization Online). A hybrid method of the Polak-Ribière-Polyak (PRP) method and the Wei-Yao-Liu (WYL) method is proposed for unconstrained optimization problems, and it possesses several desirable properties. On the convergence of a new conjugate gradient algorithm. We apply the spectral steplength to the entire conjugate gradient direction rather than to the negative gradient.
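The idea of applying the spectral steplength to the whole conjugate gradient direction can be sketched as follows; the Perry-type coefficient beta used here is only an illustrative choice in the spirit of the spectral conjugate gradient literature, not the exact rule of any particular paper above.

import numpy as np

def spectral_cg_direction(g_new, g_old, s):
    # Spectral CG direction d = -theta * g_new + beta * s, where s is the previous
    # step, theta = s^T s / s^T y is the Barzilai-Borwein-type spectral scaling of
    # the whole gradient term, and beta is a Perry-type coefficient.
    y = g_new - g_old
    sy = np.dot(s, y)
    if sy <= 0:                                   # safeguard: fall back to steepest descent
        return -g_new
    theta = np.dot(s, s) / sy
    beta = np.dot(theta * y - s, g_new) / sy      # Perry-type coefficient with spectral scaling
    return -theta * g_new + beta * s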