Volume: 12
Author: M. J. D. Powell
Date: 1977-12
URL: http://link.springer.com/10.1007%2FBF01593790
Abstract: The conjugate gradient method is particularly useful for minimizing functions of very many variables because it does not require the storage of any matrices. However, the rate of convergence of the algorithm is only linear unless the iterative procedure is "restarted" occasionally. At present it is usual to restart every n or (n + 1) iterations, where n is the number of variables, but it is known that the frequency of restarts should depend on the objective function. Therefore the main purpose of this paper is to provide an algorithm with a restart procedure that takes account of the objective function automatically. Another purpose is to study a multiplying factor that occurs in the definition of the search direction of each iteration. Various expressions for this factor have been proposed, and often it does not matter which one is used. However, reasons are now given in favour of one of these expressions. Several numerical examples are reported in support of the conclusions of this paper.
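The restart idea described in the abstract can be sketched in code. The following is a minimal illustration, not Powell's full algorithm: it uses the Polak-Ribière expression for the multiplying factor (one of the expressions the paper studies) and falls back to the steepest-descent direction when successive gradients are far from orthogonal, a gradient-dependent restart test of the kind the paper advocates over fixed every-n restarts. The 0.2 threshold and the simple backtracking line search are assumptions made for this sketch.

```python
import numpy as np

def cg_with_restarts(f, grad, x0, max_iter=200, tol=1e-8):
    """Nonlinear conjugate gradient with a gradient-based restart test.

    A sketch only: beta is the Polak-Ribiere factor, and the method
    restarts to steepest descent whenever consecutive gradients lose
    orthogonality, i.e. |g_new . g| >= 0.2 * ||g_new||^2 (the factor
    0.2 is an assumption for this illustration).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking (Armijo) line search -- a simplification; the
        # paper uses a more careful line search.
        alpha, c = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = grad(x_new)
        if abs(g_new.dot(g)) >= 0.2 * g_new.dot(g_new):
            # Restart: conjugacy has degraded, take a steepest-descent step.
            d = -g_new
        else:
            beta = g_new.dot(g_new - g) / g.dot(g)  # Polak-Ribiere factor
            d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage on a small convex quadratic f(x) = 0.5 x'Ax - b'x,
# whose minimizer is the solution of A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_min = cg_with_restarts(lambda x: 0.5 * x @ A @ x - b @ x,
                         lambda x: A @ x - b,
                         np.zeros(2))
```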
Retrieved: 2019-04-10T17:35
Language: en
Type: articles
Title: Restart procedures for the conjugate gradient method
Pages: 241-254
Article type: research_article
License: https://scigraph.springernature.com/explorer/license/
Published: 1977-12-01
Source: Springer Nature - SN SciGraph project
eISSN: 1436-4646
ISSN: 0025-5610
Journal: Mathematical Programming
readcube_id: 1ea64f5f2f5e310a7434b564c71dc3ef83ef2edec9924675a08da33ebf45409e
dimensions_id: pub.1006434394
Issue: 1
Subjects: Applied Economics; Economics
doi: 10.1007/bf01593790
Affiliation: A.E.R.E., Harwell, England (United Kingdom Atomic Energy Authority)