[01223] Descent hybrid four-term conjugate gradient methods for unconstrained optimization
Session Time & Room : 3C (Aug.23, 13:20-15:00) @A206
Type : Industrial Contributed Talk
Abstract : The conjugate gradient method (CGM) is widely regarded as efficient for solving large-scale unconstrained optimization problems. This study proposes new modified schemes containing four terms, based on linear combinations of the update parameters of classical methods and building on the popular two- and three-term CGMs. Hybridized methods have been found to outperform the classical methods (Stanimirovic et al., 2018). Several other methods in this category can be found in (Adeleke et al., 2018; Osinuga and Olofin, 2018; Stanimirovic et al., 2018).
Continuing these results, we propose hybrid methods for the solution of large-scale unconstrained optimization problems, motivated by (Alhawarat et al., 2021; Yao et al., 2020; Stanimirovic et al., 2018). The modified methods are defined using appropriate combinations of the search directions and the included parameters. In this case, our methods are hybridizations of the HS and DY methods. In addition, we propose a class of Dai-Liao CGMs with new search directions obtained from different values of the included parameter.
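To illustrate the kind of HS/DY hybridization described above, the following is a minimal sketch in Python. It uses the well-known hybrid update beta = max(0, min(beta_HS, beta_DY)) as a stand-in; the abstract's actual four-term schemes and their parameter combinations are not reproduced here. The simple Armijo backtracking rule is also an assumption for brevity, whereas the paper's analysis relies on the strong Wolfe line search.

```python
import numpy as np

def hybrid_hs_dy_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of a hybrid HS-DY conjugate gradient method.

    Uses beta_k = max(0, min(beta_HS, beta_DY)), a standard HS/DY
    hybridization; the four-term schemes of the abstract are not
    reproduced here (assumption for illustration only).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (the paper uses strong Wolfe;
        # this simpler rule keeps the sketch short).
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx, slope = f(x), g @ d
        while f(x + alpha * d) > fx + c1 * alpha * slope:
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                      # gradient difference y_k
        denom = d @ y                      # shared denominator of HS and DY
        if abs(denom) > 1e-12:
            beta_hs = (g_new @ y) / denom
            beta_dy = (g_new @ g_new) / denom
            beta = max(0.0, min(beta_hs, beta_dy))
        else:
            beta = 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:                 # safeguard: restart on non-descent
            d = -g_new
        x, g = x_new, g_new
    return x, g
```

On a strictly convex quadratic test function, the iteration drives the gradient norm below the tolerance; the descent safeguard mirrors the descent property the abstract establishes under its line search.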
Under certain assumptions, descent and convergence properties were established under the strong Wolfe line search. The new schemes showed superior performance over the existing ones in the sense of the performance profiles of Dolan and Moré (2002).
Keywords: unconstrained optimization, strong Wolfe line search, descent property, global convergence.
AMS subject classification. 49J52, 49J53, 90C30
References
[1] Adeleke, O. J., Osinuga, I. A. and Raji, R. A. 2021 A globally convergent hybrid FR-PRP
conjugate gradient method for unconstrained optimization problems, WSEAS Transactions
on Mathematics, 20, 736–744, DOI: 10.37394/23206.2021.20.78.
[2] Alhawarat, A., Alhamzi, G., Masmali, I. and Salleh, Z. 2021 A descent four-term conjugate
gradient method with global convergence properties for unconstrained optimization
problems, Mathematical Problems in Engineering, Volume 2021, Art. ID. 6219062, 14 pp.
[3] Dai, Y. H. and Liao, L. Z. 2001 New conjugacy conditions and related nonlinear conjugate
gradient methods, Applied Mathematics and Optimization, 43 (1), 87–101.
[4] Dolan, E. D. and Moré, J. J. 2002 Benchmarking optimization software with performance
profiles, Mathematical Programming, 91, 201–213.
[5] Osinuga, I. A. and Olofin, I. O. 2018 Extended hybrid conjugate gradient method for
unconstrained optimization, Journal of Computer Science and its Applications, 25 (2),
166–175.
[6] Stanimirovic, P. S., Ivanov, B., Djordjevic, S. and Brajevic, I. 2018 New hybrid conjugate
gradient and Broyden-Fletcher-Goldfarb-Shanno conjugate gradient methods, Journal of
Optimization Theory and Applications, DOI: 10.1007/s10957-018-1324-3.
[7] Yao, S., Ning, L., Tu, H. and Xu, J. 2020 A one-parameter class of three-term conjugate
gradient methods with an adaptive parameter choice, Optimization Methods and Software,
35 (6), 1051–1064.