Modification of Steepest Descent Method for Solving Unconstrained Optimization
The classical steepest descent (SD) method is one of the earliest and best-known methods for minimizing a function. Although its convergence rate is quite slow, its simplicity has made it one of the easiest methods to use and implement, especially in the form of computer code.
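For reference, the sketch below shows the classical SD iteration with a simple backtracking (Armijo) line search; the test function, starting point, and line-search parameters are illustrative assumptions, and this is the textbook method, not the modification proposed in the thesis.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical steepest descent with a backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:        # stop when the gradient is small
            break
        d = -g                             # search direction: negative gradient
        alpha, rho, c = 1.0, 0.5, 1e-4     # backtracking parameters (assumed values)
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho                   # shrink the step until the Armijo condition holds
        x = x + alpha * d
    return x

# Example: minimize a simple quadratic f(x) = x1^2 + 10*x2^2
f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
print(steepest_descent(f, grad, x0=[3.0, 1.0]))
```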
Main Author: Zubai'dah Binti Zainal Abidin
Format: Thesis
Language: English
Published: Universiti Malaysia Terengganu, 2023
Online Access: http://umt-ir.umt.edu.my:8080/handle/123456789/17622
Similar Items
- A new multi-step gradient method for optimization problem
  by: Mahboubeh, Farid, et al.
  Published: (2010)
- The investigation of gradient method namely Steepest Descent and extending of Barzilai Borwein for solving unconstrained optimization problem / Nur Intan Syahirah Ismail & Nur Atikah Aziz
  by: Ismail, Nur Intan Syahirah, et al.
  Published: (2019)
- Scaled memoryless BFGS preconditioned steepest descent method for very large-scale unconstrained optimization
  by: Leong, Wah June, et al.
  Published: (2009)
- Preconditioning Subspace Quasi-Newton Method for Large Scale Unconstrained Optimization
  by: Sim, Hong Sen
  Published: (2011)
- Energy Management in Integrated Microgrids: An Optimal Schedule Controller Utilizing Gradient Descent Algorithm
  by: Abdolrasol M.G.M., et al.
  Published: (2024)