Learning the Conjugate Gradient Method




Concise tutorial and slides (PPT) download, in English: http://download.csdn.net/detail/anshan1984/5083948


In steepest descent for nonlinear optimization, each step is taken along a direction that partially undoes the progress made along earlier ones. The basic idea of the conjugate gradient method is to move along mutually non-interfering directions instead.
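Concretely, "non-interfering" has a precise meaning: for a quadratic with symmetric positive definite Hessian A, two directions d_i and d_j are called A-conjugate when

$$ d_i^{\top} A\, d_j = 0 \quad (i \neq j), $$

so an exact line-search step along one direction never spoils the minimization already achieved along the others.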


For quadratic functions formed from symmetric positive definite matrices, the conjugate gradient method converges to the unique global minimum in at most n steps (in exact arithmetic), by moving along successive non-interfering directions.
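For concreteness, here is a minimal sketch of the linear CG iteration for f(x) = ½xᵀAx − bᵀx, which is equivalent to solving Ax = b. The matrix A and vector b below are illustrative placeholders, not anything from the original post:

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-10):
    """Linear CG for f(x) = 0.5 x^T A x - b^T x, A symmetric positive definite."""
    n = b.shape[0]
    x = np.zeros(n) if x0 is None else x0.copy()
    r = b - A @ x          # residual = negative gradient of f at x
    d = r.copy()           # first direction is the steepest descent direction
    for _ in range(n):     # exact arithmetic converges in at most n steps
        rr = r @ r
        if np.sqrt(rr) < tol:
            break
        Ad = A @ d
        alpha = rr / (d @ Ad)        # exact line search along d
        x = x + alpha * d
        r = r - alpha * Ad
        beta = (r @ r) / rr          # coefficient for the next direction
        d = r + beta * d             # new direction, A-conjugate to the old ones
    return x

# Illustrative 3x3 SPD system.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x = conjugate_gradient(A, b)
print(x, np.allclose(A @ x, b))
```

Each iteration needs only one matrix-vector product and a handful of vector operations, which is what makes the method attractive for large sparse systems.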


The conjugate gradient algorithm really is a generalization of steepest descent.


The conjugate gradient method's way of selecting directions has three primary advantages:

First, unless the solution is attained in fewer than n steps, the gradient at each step is nonzero and linearly independent of all previous direction vectors.

Second, a more important advantage of the conjugate gradient method is the especially simple formula used to determine the new direction vector (see the update formula after this list). This simplicity makes the method only slightly more complicated than steepest descent.

Third, because the directions are based on the gradients, the process makes good uniform progress toward the solution at every step.
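The "especially simple formula" referred to in the second point is, in one standard form (the Fletcher-Reeves variant; the post does not say which variant it has in mind):

$$ \beta_{k+1} = \frac{g_{k+1}^{\top} g_{k+1}}{g_k^{\top} g_k}, \qquad d_{k+1} = -g_{k+1} + \beta_{k+1} d_k, $$

where $g_k = \nabla f(x_k)$. Only the current and previous gradients are needed, so each iteration costs little more than a steepest descent step.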


Conjugate gradient is an intermediate between steepest descent and Newton's method. It tries to achieve the fast convergence of Newton's method without incurring the cost of computing the Hessian Hf. At the same time, conjugate gradient executes at least one steepest descent step in every cycle of n steps. It has proved to be extremely effective on general objective functions and is considered among the best general-purpose methods presently available.
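Below is a minimal sketch of such a nonlinear CG, assuming a Fletcher-Reeves coefficient, a simple backtracking (Armijo) line search, and a restart to steepest descent every n iterations; none of these specific choices come from the post itself:

```python
import numpy as np

def nonlinear_cg(f, grad, x0, max_iter=1000, tol=1e-8):
    """Nonlinear CG with Fletcher-Reeves beta and a restart every n iterations."""
    x = x0.copy()
    g = grad(x)
    d = -g                                   # start with steepest descent
    n = x.size
    for k in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                       # safeguard: ensure a descent direction
            d = -g
        # Backtracking (Armijo) line search along d.
        t, fx, slope = 1.0, f(x), g @ d
        while t > 1e-12 and f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x = x + t * d
        g_new = grad(x)
        if (k + 1) % n == 0:                 # restart: pure steepest descent step
            d = -g_new
        else:
            beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
            d = -g_new + beta * d
        g = g_new
    return x

# Illustrative use on the Rosenbrock function.
f = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
grad = lambda x: np.array([-2*(1 - x[0]) - 400*x[0]*(x[1] - x[0]**2),
                           200*(x[1] - x[0]**2)])
print(nonlinear_cg(f, grad, np.array([-1.2, 1.0])))
```

The periodic restart is what guarantees the steepest descent step every n iterations mentioned above, and it also discards stale conjugacy information when the objective is far from quadratic.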

 

Welcome to my CSDN blog: http://blog.csdn.net/anshan1984/

 

