New modification of the Hestenes-Stiefel with strong Wolfe line search
| Main Authors: | , , |
|---|---|
| Format: | Conference or Workshop Item |
| Language: | English |
| Published: | 2021 |
| Subjects: | |
| Online Access: | http://eprints.uthm.edu.my/2643/1/P12682_fab91575b27daa5c82a8d41786ab381e.pdf · http://eprints.uthm.edu.my/2643/ · https://doi.org/10.1063/5.0053211 |
| Summary: | The nonlinear conjugate gradient method is widely used for solving large-scale unconstrained optimization problems, since it has been proven to solve such problems without requiring large memory storage. In this paper, we propose a new modification of the Hestenes-Stiefel conjugate gradient parameter that satisfies the sufficient descent condition under a strong Wolfe-Powell line search. Moreover, the conjugate gradient method with the proposed parameter requires fewer iterations and less CPU time than the method with other classical conjugate gradient parameters. Numerical results show that the conjugate gradient method with the proposed parameter performs better than the conjugate gradient method with the other classical parameters. |
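For context, the sketch below illustrates the general framework the abstract refers to: a nonlinear conjugate gradient iteration whose search direction uses a Hestenes-Stiefel-type parameter beta_k = g_{k+1}^T (g_{k+1} - g_k) / (d_k^T (g_{k+1} - g_k)) and whose step length is chosen by a strong Wolfe line search. The record does not give the authors' modified parameter, so the code below uses the classical Hestenes-Stiefel formula purely as a placeholder; the function names, tolerances, and restart rule are illustrative assumptions, not the paper's method.

```python
# A minimal sketch (not the authors' method): nonlinear conjugate gradient with
# the *classical* Hestenes-Stiefel parameter and a strong Wolfe line search.
# The record does not state the proposed modification of beta, so beta below
# is only a placeholder illustrating the framework the abstract describes.
import numpy as np
from scipy.optimize import line_search  # step lengths satisfying strong Wolfe conditions


def cg_hestenes_stiefel(f, grad, x0, tol=1e-6, max_iter=1000, c1=1e-4, c2=0.1):
    """Minimize f from x0 with HS-type nonlinear CG; tolerances are illustrative."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) <= tol:          # stop when the gradient is small
            break
        # Strong Wolfe line search along d (c2 = 0.1 is a common choice for CG).
        alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)
        if alpha is None:                     # line search failed: restart with -g
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g, c1=c1, c2=c2)
            if alpha is None:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                         # gradient difference y_k
        dy = float(d @ y)
        # Classical Hestenes-Stiefel parameter (placeholder for the paper's beta).
        beta = float(g_new @ y) / dy if abs(dy) > 1e-12 else 0.0
        d = -g_new + beta * d                 # new conjugate direction
        x, g = x_new, g_new
    return x


if __name__ == "__main__":
    from scipy.optimize import rosen, rosen_der
    # Example: the 2-D Rosenbrock function, whose minimizer is (1, 1).
    print(cg_hestenes_stiefel(rosen, rosen_der, np.array([-1.2, 1.0])))
```

The looser curvature constant c2 = 0.1 (rather than the quasi-Newton default 0.9) is a standard choice for conjugate gradient methods, since a tighter curvature condition helps keep successive directions descent directions; the comparison of iteration counts and CPU time reported in the abstract is not reproduced here.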