Improved stochastic gradient descent algorithm with mean-gradient adaptive stepsize for solving large-scale optimization problems
Stochastic gradient descent (SGD) is one of the most common algorithms for solving large unconstrained optimization problems. It builds on the classical gradient descent method with a modification to how the gradient is selected: SGD uses random samples or batches of the data set to compute the gradient when solving...
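As a rough illustration of the batch-gradient idea described in the abstract, the sketch below shows plain mini-batch SGD with a fixed stepsize. It is not the paper's mean-gradient adaptive stepsize rule; the function and parameter names (`minibatch_sgd`, `grad_f`, `lr`, `batch_size`) are illustrative assumptions.

```python
import numpy as np

def minibatch_sgd(grad_f, x0, data, lr=0.01, batch_size=32, epochs=10, seed=0):
    """Plain mini-batch SGD: each update uses a gradient estimated on a
    random subset (batch) of the data instead of the full data set."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    n = len(data)
    for _ in range(epochs):
        order = rng.permutation(n)             # reshuffle the data each epoch
        for start in range(0, n, batch_size):
            batch = data[order[start:start + batch_size]]
            g = grad_f(x, batch)               # stochastic gradient estimate
            x = x - lr * g                     # fixed-stepsize update (the paper adapts this)
    return x

# Usage sketch: least-squares fit, grad_f is the batch gradient of ||A x - b||^2
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.normal(size=(1000, 5))
    x_true = rng.normal(size=5)
    b = A @ x_true + 0.01 * rng.normal(size=1000)
    rows = np.arange(1000)                     # pass row indices as the "data"

    def grad_f(x, idx):
        Ab, bb = A[idx], b[idx]
        return 2.0 * Ab.T @ (Ab @ x - bb) / len(idx)

    x_hat = minibatch_sgd(grad_f, np.zeros(5), rows, lr=0.05, epochs=50)
    print("estimation error:", np.linalg.norm(x_hat - x_true))
```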
Main Authors:
Format: Article
Language: English
Published: Persatuan Sains Matematik Malaysia, 2023
Online Access:
http://psasir.upm.edu.my/id/eprint/110372/1/document%20%284%29.pdf
http://psasir.upm.edu.my/id/eprint/110372/
https://myjms.mohe.gov.my/index.php/dismath/article/view/24687