TY - JOUR

T1 - A Family of Hybrid Stochastic Conjugate Gradient Algorithms for Local and Global Minimization Problems

AU - Alnowibet, Khalid Abdulaziz

AU - Mahdi, Salem

AU - Alshamrani, Ahmad M.

AU - Sallam, Karam M.

AU - Mohamed, Ali Wagdy

N1 - Funding Information:
The authors express their appreciation to King Saud University for funding the publication of this research through the Researchers Supporting Program (RSP-2021/305), King Saud University, Riyadh, Saudi Arabia.
Publisher Copyright:
© 2022 by the authors.

PY - 2022/10/1

Y1 - 2022/10/1

N2 - This paper contains two main parts, Part I and Part II, which discuss the local and global minimization problems, respectively. In Part I, a new conjugate gradient (CG) technique is proposed and then combined with a line-search technique to obtain a globally convergent algorithm. A finite-difference approximation approach is used to compute approximate values of the first derivative of the function f. The convergence analysis of the proposed method is established. Comparisons between the performance of the new CG method and that of four other CG methods demonstrate that the proposed method is promising and competitive for finding a local optimum point. In Part II, three formulas are designed by which a group of solutions is generated. This set of random formulas is hybridized with the globally convergent CG algorithm to obtain a hybrid stochastic conjugate gradient algorithm, denoted HSSZH. The HSSZH algorithm finds an approximate value of the global solution of a global optimization problem. Five combined stochastic conjugate gradient algorithms are constructed. Performance profiles are used to assess and compare this family of hybrid stochastic conjugate gradient algorithms. Comparison results between the proposed HSSZH algorithm and four other hybrid stochastic conjugate gradient techniques demonstrate that HSSZH is superior to the four algorithms in efficiency, reliability, and effectiveness in finding an approximate solution of a global optimization problem containing a non-convex function.

AB - This paper contains two main parts, Part I and Part II, which discuss the local and global minimization problems, respectively. In Part I, a new conjugate gradient (CG) technique is proposed and then combined with a line-search technique to obtain a globally convergent algorithm. A finite-difference approximation approach is used to compute approximate values of the first derivative of the function f. The convergence analysis of the proposed method is established. Comparisons between the performance of the new CG method and that of four other CG methods demonstrate that the proposed method is promising and competitive for finding a local optimum point. In Part II, three formulas are designed by which a group of solutions is generated. This set of random formulas is hybridized with the globally convergent CG algorithm to obtain a hybrid stochastic conjugate gradient algorithm, denoted HSSZH. The HSSZH algorithm finds an approximate value of the global solution of a global optimization problem. Five combined stochastic conjugate gradient algorithms are constructed. Performance profiles are used to assess and compare this family of hybrid stochastic conjugate gradient algorithms. Comparison results between the proposed HSSZH algorithm and four other hybrid stochastic conjugate gradient techniques demonstrate that HSSZH is superior to the four algorithms in efficiency, reliability, and effectiveness in finding an approximate solution of a global optimization problem containing a non-convex function.

KW - comparisons

KW - conjugate gradient methods

KW - efficient algorithm

KW - global optimization

KW - meta-heuristics

KW - numerical approximations of gradients

KW - performance profiles

KW - stochastic parameters

KW - testing

KW - unconstrained minimization

UR - http://www.scopus.com/inward/record.url?scp=85139754522&partnerID=8YFLogxK

U2 - 10.3390/math10193595

DO - 10.3390/math10193595

M3 - Article

AN - SCOPUS:85139754522

VL - 10

SP - 1

EP - 37

JO - Mathematics

JF - Mathematics

SN - 2227-7390

IS - 19

M1 - 3595

ER -