FINDING THE OPTIMAL VALUES OF FUNCTIONS USING THE CONJUGATE GRADIENT METHOD
DOI: https://doi.org/10.32851/tnv-tech.2021.3.1

Keywords: minimization, fastest descent, conjugate directions, conjugate gradients, iterative algorithm

Abstract
The article considers the conjugate gradient method for solving ill-posed problems by the regularization method. A gradient is a vector whose magnitude gives the rate of change of a function and whose direction coincides with the direction of the function's steepest growth. The vector pointing in the direction of the steepest decrease of the function is called the antigradient. The conjugate gradient method is used to solve unconstrained minimization problems, in particular to find the extremal of the smoothing functional. It is an iterative method. A common property of most iterative algorithms is a sharp drop in the minimization rate as the iterates approach the minimum point of the functional. An important characteristic of iterative algorithms is therefore the actual minimum level of the residual functional to which the real-time minimization process can be driven. The paper describes the steepest descent method as a precursor of the conjugate gradient method and combines two concepts: the gradient of the objective function and the conjugacy of direction vectors. The method of conjugate directions and two ways of computing the weight coefficient are also given. The article analyzes gradient methods for finding the optimal values of quadratic functions and of functions of general form. The conjugate gradient method is a first-order method, yet its rate of convergence is quadratic, which makes it compare favorably with ordinary gradient methods. The disadvantage of gradient search is that it finds only a local extremum of the function; to find other local extrema, the search must be restarted from other starting points. An algorithm for minimizing the functional by the conjugate gradient method is constructed.
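To make the procedure concrete, below is a minimal sketch of the conjugate gradient method for the quadratic case f(x) = (1/2) x^T A x - b^T x, assuming a symmetric positive definite matrix A; in this case the residual r = b - Ax is exactly the antigradient of f. The weight coefficient beta that makes successive search directions A-conjugate is computed here by the Fletcher-Reeves formula beta = ||r_{k+1}||^2 / ||r_k||^2; the Polak-Ribiere formula beta = r_{k+1}^T (r_{k+1} - r_k) / ||r_k||^2 is the other common choice. The function names, parameters, and test matrix are illustrative assumptions, not code from the article.

    import numpy as np

    def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=None):
        """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive definite A."""
        x = x0.astype(float)
        r = b - A @ x            # residual = antigradient of f at x
        d = r.copy()             # first direction: steepest descent
        if max_iter is None:
            max_iter = len(b)    # in exact arithmetic CG converges in at most n steps
        for _ in range(max_iter):
            Ad = A @ d
            alpha = (r @ r) / (d @ Ad)        # exact line search along d
            x = x + alpha * d
            r_new = r - alpha * Ad
            if np.linalg.norm(r_new) < tol:   # residual small enough: stop
                break
            beta = (r_new @ r_new) / (r @ r)  # Fletcher-Reeves weight coefficient
            d = r_new + beta * d              # new direction, A-conjugate to the old one
            r = r_new
        return x

    # Example with a hypothetical 2x2 SPD matrix; converges in at most 2 iterations.
    A = np.array([[4.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([1.0, 2.0])
    x = conjugate_gradient(A, b, np.zeros(2))
    # x is approximately [0.0909, 0.6364], the minimizer of 0.5 x^T A x - b^T x

For a general (non-quadratic) function, the same scheme applies with r replaced by the numerically evaluated antigradient and the exact line search replaced by a one-dimensional minimization along d; this is the setting in which only a local extremum is found and restarts from other starting points are needed.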