Transfer learning based performance comparison of the pre-trained deep neural networks

Deep learning has grown tremendously in recent years, having a substantial impact on practically every discipline. Transfer learning allows us to transfer the knowledge of a model that has been previously trained for a particular task to a new model that is attempting to solve a related but not identical problem. To adapt a pre-trained model to a new task effectively, specific layers must be retrained while the others remain frozen. Selecting which layers to enable for training and which to freeze, and setting the hyperparameter values, are typical issues, and all of these choices have a substantial effect on training capability as well as classification performance. The principal aim of this study is to compare the network performance of selected pre-trained models based on transfer learning, to help in choosing a suitable model for image classification. To accomplish this goal, we examined the performance of five pre-trained networks, namely SqueezeNet, GoogleNet, ShuffleNet, Darknet-53, and Inception-V3, with different epochs, learning rates, and mini-batch sizes, and compared and evaluated each network's performance using a confusion matrix. Based on the experimental findings, Inception-V3 achieved the highest accuracy of 96.98% and the best values on the other evaluation metrics, with precision, sensitivity, specificity, and F1-score of 92.63%, 92.46%, 98.12%, and 92.49%, respectively.
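
To illustrate the transfer-learning workflow the abstract describes (freezing the pre-trained layers, retraining a replacement classification layer, and setting the epoch, learning-rate, and mini-batch-size hyperparameters), the following is a minimal sketch using torchvision's Inception-V3. The hyperparameter values, class count, and dataset path are illustrative assumptions, not the configuration reported in the paper.

```python
# Minimal transfer-learning sketch with torchvision's Inception-V3 (torchvision >= 0.13).
# Hyperparameter values, NUM_CLASSES, and the dataset path are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 5          # assumed number of target classes
EPOCHS = 10              # "Epochs" hyperparameter varied in the study
LEARNING_RATE = 1e-4     # "Learning Rate" hyperparameter varied in the study
MINI_BATCH_SIZE = 32     # "Mini-Batch Size" hyperparameter varied in the study

# Inception-V3 expects 299x299 RGB inputs normalised with ImageNet statistics.
preprocess = transforms.Compose([
    transforms.Resize((299, 299)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Assumption: training images arranged in one sub-folder per class under ./data/train.
train_set = datasets.ImageFolder("./data/train", transform=preprocess)
train_loader = DataLoader(train_set, batch_size=MINI_BATCH_SIZE, shuffle=True)

# Load ImageNet-pre-trained weights, freeze every layer, then replace the final
# fully connected layer so that only this new head is retrained for the new task.
model = models.inception_v3(weights=models.Inception_V3_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=LEARNING_RATE)

model.train()
for epoch in range(EPOCHS):
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        outputs = model(images)
        # In training mode Inception-V3 returns (logits, aux_logits); keep the main logits.
        logits = outputs.logits if hasattr(outputs, "logits") else outputs
        loss = criterion(logits, labels)
        loss.backward()
        optimizer.step()
```

The evaluation metrics quoted in the abstract (accuracy, precision, sensitivity, specificity, F1-score) can all be derived from a confusion matrix. The helper below shows one common way to compute macro-averaged versions of these metrics for a multi-class problem; it is a generic illustration, not the paper's evaluation code, and the example counts are made up.

```python
import numpy as np

def metrics_from_confusion_matrix(cm: np.ndarray) -> dict:
    """Macro-averaged metrics from a multi-class confusion matrix.

    cm[i, j] counts samples whose true class is i and whose predicted class is j.
    Assumes every class has at least one true and one predicted sample.
    """
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=0) - tp            # predicted as class i, actually another class
    fn = cm.sum(axis=1) - tp            # actually class i, predicted as another class
    tn = cm.sum() - (tp + fp + fn)

    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)        # recall / true-positive rate
    specificity = tn / (tn + fp)        # true-negative rate
    f1 = 2 * precision * sensitivity / (precision + sensitivity)

    return {
        "accuracy": tp.sum() / cm.sum(),
        "precision": precision.mean(),
        "sensitivity": sensitivity.mean(),
        "specificity": specificity.mean(),
        "f1_score": f1.mean(),
    }

# Illustrative 3-class confusion matrix (made-up counts, not results from the paper).
cm = np.array([[50, 2, 3],
               [4, 45, 1],
               [2, 3, 40]])
print(metrics_from_confusion_matrix(cm))
```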

Bibliographic Details
Main Authors: Kumar, Jayapalan Senthil; Anuar, Syahid; Hassan, Noor Hafizah
Format: Article (peer reviewed)
Language: English
Published: Science and Information Organization, 2022
Published in: International Journal of Advanced Computer Science and Applications, 13 (1), pp. 797-805. ISSN 2158-107X
Subjects: T Technology (General)
Record Source: UTM Institutional Repository, Universiti Teknologi Malaysia (http://eprints.utm.my/)
Online Access:http://eprints.utm.my/id/eprint/100857/1/JayapalanSenthilKumar2022_TransferLearningbasedPerformanceComparison.pdf
http://eprints.utm.my/id/eprint/100857/
http://dx.doi.org/10.14569/IJACSA.2022.0130193