Registered Data

[00301] On some modifications in the conjugate gradient method and its application in unconstrained optimization problems

  • Session Time & Room : 3E (Aug.23, 17:40-19:20) @D502
  • Type : Contributed Talk
  • Abstract : Conjugate gradient (CG) methods are widely used to solve large-scale unconstrained optimization problems because of their strong local and global convergence properties and low memory requirements. To enhance convergence, we introduce an improved hybrid form of the conjugate gradient method in this work. We propose a new form of the CG parameter (βk) that combines the Fletcher-Reeves (FR) and three-term search directions. The proposed search-direction formula satisfies the sufficient descent condition independently of any line search, and the resulting directions are bounded. For global convergence, suitable assumptions on the objective function and its gradient are imposed, under which the strong Wolfe-Powell line search conditions are fulfilled. Finally, numerical experiments have been carried out on standard benchmark test functions and compared with other CG methods from the literature to validate the proposed algorithm. The numerical results demonstrate the efficiency and robustness of the proposed CG method.
  • Classification : 90-XX, 90C26, 90C30, 90Bxx, 90-08
  • Format : Online Talk on Zoom
  • Author(s) :
    • Sweta Kumari (Birla Institute of Technology Mesra, Ranchi)
    • Darakhshan Jabeen Syeda (Birla Institute of Technology Mesra, Ranchi)
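The abstract's proposed hybrid βk and three-term direction are not given in this listing, so as background the following is a minimal sketch of the classical Fletcher-Reeves CG scheme it builds on, paired with a simple bisection-style line search for the strong Wolfe conditions. The function names (`fr_cg`, `strong_wolfe`), the parameter choices c1, c2, and the bisection strategy are illustrative assumptions, not the authors' algorithm.

```python
# Sketch of Fletcher-Reeves (FR) conjugate gradient descent with a strong
# Wolfe line search. Illustrative only; the hybrid beta_k of the talk is
# not public, so only the standard FR update is shown:
#     beta_k = ||g_{k+1}||^2 / ||g_k||^2
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.1, max_iter=50):
    """Bisection-style search for a step length satisfying the strong
    Wolfe conditions (Armijo decrease + curvature |phi'(a)| <= c2|phi'(0)|)."""
    phi0 = f(x)
    dphi0 = dot(grad(x), d)          # directional derivative at alpha = 0
    lo, hi, alpha = 0.0, float("inf"), 1.0
    for _ in range(max_iter):
        xa = [xi + alpha * di for xi, di in zip(x, d)]
        if f(xa) > phi0 + c1 * alpha * dphi0:
            hi = alpha               # Armijo fails: step too long
        else:
            dphi = dot(grad(xa), d)
            if abs(dphi) <= c2 * abs(dphi0):
                return alpha         # both strong Wolfe conditions hold
            if dphi < 0:
                lo = alpha           # still descending: step too short
            else:
                hi = alpha           # overshot the minimizer along d
        alpha = (lo + hi) / 2 if hi < float("inf") else 2 * alpha
    return alpha                     # best effort after max_iter trials

def fr_cg(f, grad, x0, tol=1e-8, max_iter=500):
    """Minimize f from x0 (list of floats) with FR conjugate gradients."""
    x = list(x0)
    g = grad(x)
    d = [-gi for gi in g]            # first direction: steepest descent
    for _ in range(max_iter):
        gnorm2 = dot(g, g)
        if math.sqrt(gnorm2) < tol:
            break
        alpha = strong_wolfe(f, grad, x, d)
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = grad(x)
        beta = dot(g_new, g_new) / gnorm2   # FR parameter beta_k
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x
```

For example, minimizing f(x, y) = (x - 1)² + 10(y + 2)² from the origin converges to (1, -2). With c2 < 1/2 the strong Wolfe conditions are known to keep FR directions descent directions, which is why that bound is used here.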