Flatten-T Swish: a thresholded ReLU-Swish-like activation function for deep learning
Activation functions are essential for deep learning methods to learn and perform complex tasks such as image classification. The Rectified Linear Unit (ReLU) has been widely used and has become the default activation function across the deep learning community since 2012. Although ReLU has been popular, ho...
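The abstract is truncated before it defines the proposed function, but the title describes a thresholded, ReLU/Swish-like activation. A minimal sketch of that idea follows, assuming the form FTS(x) = x·sigmoid(x) + T for x ≥ 0 and FTS(x) = T for x < 0, with T = -0.20 used here as an illustrative default threshold; this NumPy implementation is a sketch under those assumptions, not the authors' code.

```python
import numpy as np

def fts(x, T=-0.20):
    """Sketch of a Flatten-T Swish-style activation (assumed form).

    Swish-like on the positive side, flattened to the threshold T on the
    negative side: FTS(x) = x * sigmoid(x) + T if x >= 0, else T.
    T = -0.20 is an illustrative default, not a prescribed value.
    """
    x = np.asarray(x, dtype=float)
    swish = x / (1.0 + np.exp(-x))           # x * sigmoid(x)
    return np.where(x >= 0.0, swish + T, T)  # flatten negatives to T

# With T = 0 the negative side is zeroed out, mirroring ReLU's behavior.
print(fts([-2.0, 0.0, 2.0]))  # ~ [-0.2, -0.2, 1.5616]
```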
Main Authors: Hock, Hung Chieng; Wahid, Noorhaniza; Ong, Pauline; Perla, Sai Raj Kishore
Format: Article
Language: English
Published: Program Studi Teknik Informatika, 2018
Online Access:
http://eprints.uthm.edu.my/5227/1/AJ%202020%20%28102%29.pdf
http://eprints.uthm.edu.my/5227/
http://dx.doi.org/10.26555/ijain.v4i2.249
Similar Items
- Parametric flatten-t swish: an adaptive nonlinear activation function for deep learning
  by: Hock, Hung Chieng, et al.
  Published: (2021)
- Improved thresholding and quantization techniques for image compression
  by: Md Taujuddin, Nik Shahidah Afifi
  Published: (2017)
- Content Based Retrieval Using Colour And Texture Of Wavelet Based Compressed Images [TA1637. I67 2008 f rb].
  by: Abdul Fatah, Irfan Afif
  Published: (2008)
- EEE 381 - FIBRE OPTIC COMMUNICATIONS MARCH 05.
  by: PPKEE, Pusat Pengajian Kejuruteraan Elektrik & Elektronik
  Published: (2005)
- EEE 502 - ADVANCED DIGITAL SIGNAL AND IMAGE PROCESSING MARCH 05.
  by: PPKEE, Pusat Pengajian Kejuruteraan Elektrik & Elektronik
  Published: (2005)