Abstract
The spectral conjugate gradient method is an efficient method for solving large-scale unconstrained optimization problems. In this paper, we propose a new spectral conjugate gradient method and analyze its performance numerically. We establish the descent condition and the global convergence property under some assumptions and the strong Wolfe line search. Numerical experiments evaluating the method's efficiency are conducted on 98 problems with various dimensions and initial points. The numerical results, based on the number of iterations and central processing unit (CPU) time, show that the new method has high computational performance.
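The general iteration the abstract refers to can be sketched as follows. A spectral conjugate gradient method generates iterates x_{k+1} = x_k + α_k d_k with search direction d_k = -θ_k g_k + β_k d_{k-1}, where α_k satisfies the strong Wolfe conditions. The sketch below is a generic illustration, not the paper's new method: the spectral parameter θ_k (a Barzilai-Borwein-type choice) and the conjugacy parameter β_k (a Fletcher-Reeves-type choice) are assumed placeholders, since the paper's specific formulas are not given in the abstract.

```python
import numpy as np
from scipy.optimize import line_search

def spectral_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic spectral CG sketch: d_k = -theta_k * g_k + beta_k * d_{k-1}.

    theta_k and beta_k below are assumed standard choices, not the
    formulas proposed in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # scipy's line_search enforces the (strong) Wolfe conditions;
        # c2 = 0.1 is a typical curvature parameter for CG methods
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:
            alpha = 1e-4  # conservative fallback if the search fails
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # Spectral parameter: Barzilai-Borwein-type choice (assumed)
        theta = float(s @ s) / float(s @ y) if float(s @ y) > 1e-12 else 1.0
        # Conjugacy parameter: Fletcher-Reeves-type choice (assumed)
        beta = float(g_new @ g_new) / float(g @ g)
        d = -theta * g_new + beta * d
        # Restart with steepest descent if d fails the descent condition
        if float(g_new @ d) >= 0.0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize the Rosenbrock function from a standard starting point
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_grad = lambda x: np.array([
    -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
    200 * (x[1] - x[0]**2),
])
x_star = spectral_cg(rosen, rosen_grad, np.array([-1.2, 1.0]))
```

The restart step guards the descent condition d_k^T g_k < 0 that the convergence analysis relies on; without it, a poorly scaled β_k can produce an ascent direction.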
Original language | English |
---|---|
Pages (from-to) | 2053-2069 |
Number of pages | 17 |
Journal | Journal of Mathematical and Computational Science |
Volume | 10 |
Issue number | 5 |
DOIs | |
Publication status | Published - 2020 |
Keywords
- Descent condition
- Global convergence property
- Spectral conjugate gradient method
- Strong Wolfe line search
- Unconstrained optimization