Bounded activation functions for enhanced training stability of deep neural networks on visual pattern recognition problems

This paper focuses on the enhancement of the generalization ability and training stability of deep neural networks (DNNs). New activation functions that we call bounded rectified linear unit (ReLU), bounded leaky ReLU, and bounded bi-firing are proposed. These activation functions are defined based...
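The record's abstract is truncated before the functions' definitions, but the general idea of a "bounded" rectifier is to clip the activation to a finite range so gradients and outputs cannot grow without limit. The sketch below is a generic illustration of that idea, not the paper's exact formulation; the upper bound `upper` and leak slope `alpha` are assumed parameters chosen for the example.

```python
import numpy as np

def bounded_relu(x, upper=6.0):
    """Generic bounded ReLU: clamp the ReLU output to [0, upper].
    The bound value here is illustrative, not taken from the paper."""
    return np.minimum(np.maximum(x, 0.0), upper)

def bounded_leaky_relu(x, alpha=0.01, lower=-1.0, upper=6.0):
    """Generic bounded leaky ReLU: leaky rectifier clipped to [lower, upper].
    alpha, lower, and upper are assumed example parameters."""
    return np.clip(np.where(x > 0.0, x, alpha * x), lower, upper)

# Example: large positive and negative inputs saturate at the bounds.
print(bounded_relu(np.array([-1.0, 3.0, 10.0])))        # [0. 3. 6.]
print(bounded_leaky_relu(np.array([-200.0, 0.5, 50.0])))  # [-1.   0.5  6. ]
```

Because the outputs saturate, very large pre-activations cannot dominate later layers, which is one plausible mechanism for the training-stability benefit the abstract claims; the paper's precise definitions are available via the links below.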

Full description

Bibliographic Details
Main Authors: Liew, S. S., Khalil-Hani, M., Bakhteri, R.
Format: Article
Published: Elsevier B.V., 2016
Online Access: http://eprints.utm.my/id/eprint/71140/
https://www.scopus.com/inward/record.uri?eid=2-s2.0-84994477344&doi=10.1016%2fj.neucom.2016.08.037&partnerID=40&md5=5b940413f14332dd63cda37f4ebfbe4b