Hyperparameter tuning of deep neural network in time series forecasting

Bibliographic Details
Main Authors: Xiang, Kelly Pang Li; Syafrina, Abdul Halim; Nur Haizum, Abd Rahman
Format: Article
Language: English
Published: UPM 2024
Subjects: QA Mathematics
Online Access:http://umpir.ump.edu.my/id/eprint/42505/1/2024%20Hyperparameter%20Tuning%20of%20Deep%20Neural%20Network%20in%20Time%20Series%20Forecasting.pdf
http://umpir.ump.edu.my/id/eprint/42505/
https://myjms.mohe.gov.my/index.php/dismath/article/view/27587
id my.ump.umpir.42505
record_format eprints
spelling my.ump.umpir.42505 2024-09-05T04:08:32Z http://umpir.ump.edu.my/id/eprint/42505/ Hyperparameter tuning of deep neural network in time series forecasting Xiang, Kelly Pang Li; Syafrina, Abdul Halim; Nur Haizum, Abd Rahman QA Mathematics UPM 2024 Article PeerReviewed pdf en cc_by_nc_4 http://umpir.ump.edu.my/id/eprint/42505/1/2024%20Hyperparameter%20Tuning%20of%20Deep%20Neural%20Network%20in%20Time%20Series%20Forecasting.pdf Xiang, Kelly Pang Li and Syafrina, Abdul Halim and Nur Haizum, Abd Rahman (2024) Hyperparameter tuning of deep neural network in time series forecasting. Menemui Matematik (Discovering Mathematics), 46 (1). pp. 47-73. ISSN 0126-9003. (Published) https://myjms.mohe.gov.my/index.php/dismath/article/view/27587
institution Universiti Malaysia Pahang Al-Sultan Abdullah
building UMPSA Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Malaysia Pahang Al-Sultan Abdullah
content_source UMPSA Institutional Repository
url_provider http://umpir.ump.edu.my/
language English
topic QA Mathematics
description A Deep Artificial Neural Network (DANN) is an Artificial Neural Network (ANN) with multiple hidden layers, making it a 'deep' form of ANN. Since a Deep Neural Network (DNN) is an ANN with multiple hidden layers, DANNs fall under the broader DNN category and are widely used in time series forecasting. The performance of a DANN is highly dependent on the choice of hyperparameters, and random selection of hyperparameters may increase the DANN's forecasting error. Hence, this study aims to optimize the performance of the DANN in time series forecasting by tuning two important hyperparameters: the number of epochs and the batch size. DANNs trained with 1, 10, 20, 50 and 100 epochs and batch sizes of 32 and 64 are compared via a grid search over all combinations of these hyperparameters. The performance of each model is evaluated and compared based on the mean square error (MSE) and mean absolute error (MAE). In addition, the mean absolute percentage error (MAPE) is used to compare the performance of the DANN model on high-frequency and low-frequency time series data. The study uses both simulated and real-life data to assess the performance of the DANN model. The results show that more than one epoch is needed for good performance. Specifically, analysis of the simulated data consistently suggests that 10 epochs give optimal results. Similarly, 10 epochs yield optimal results for low-frequency real-life data, while 100 epochs perform best for high-frequency real-life data. Additionally, the findings indicate that batch sizes of 32 and 64 are each optimal in different combinations. Hence, this study suggests that hyperparameter tuning should be performed before starting the learning process: it ensures the selection of appropriate hyperparameter values, which significantly affect the learning outcome of a DNN model and lead to improved forecast accuracy.
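The description above outlines a grid search over the epoch values {1, 10, 20, 50, 100} and batch sizes {32, 64}, scored with MSE, MAE and MAPE. Below is a minimal sketch of that procedure, assuming a small Keras feed-forward network on a lag-window representation of a simulated series; the paper's actual architecture, layer sizes, lag length and datasets are not given in this record, so those choices here are illustrative assumptions rather than the authors' configuration.

```python
import numpy as np
from tensorflow import keras

def make_windows(series, lag):
    """Turn a 1-D series into (lag-window, next-value) pairs."""
    X, y = [], []
    for i in range(len(series) - lag):
        X.append(series[i:i + lag])
        y.append(series[i + lag])
    return np.array(X), np.array(y)

def build_dann(lag):
    """Assumed feed-forward DANN: two hidden layers, one output neuron."""
    model = keras.Sequential([
        keras.layers.Input(shape=(lag,)),
        keras.layers.Dense(32, activation="relu"),
        keras.layers.Dense(16, activation="relu"),
        keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

# Error metrics named in the abstract.
def mse(actual, forecast):
    return float(np.mean((actual - forecast) ** 2))

def mae(actual, forecast):
    return float(np.mean(np.abs(actual - forecast)))

def mape(actual, forecast):
    return float(np.mean(np.abs((actual - forecast) / actual)) * 100)

# Simulated series as a stand-in for the paper's data (not in this record).
rng = np.random.default_rng(42)
series = np.sin(np.linspace(0, 20 * np.pi, 600)) + 0.1 * rng.standard_normal(600)

LAG = 12  # assumed lag-window length
X, y = make_windows(series, LAG)
split = int(0.8 * len(X))
X_tr, X_te, y_tr, y_te = X[:split], X[split:], y[:split], y[split:]

# Grid search over the hyperparameter values listed in the abstract.
results = []
for epochs in [1, 10, 20, 50, 100]:
    for batch_size in [32, 64]:
        model = build_dann(LAG)
        model.fit(X_tr, y_tr, epochs=epochs, batch_size=batch_size, verbose=0)
        pred = model.predict(X_te, verbose=0).ravel()
        results.append((epochs, batch_size,
                        mse(y_te, pred), mae(y_te, pred), mape(y_te, pred)))

# Report combinations from best to worst test MSE.
for epochs, batch, m1, m2, m3 in sorted(results, key=lambda r: r[2]):
    print(f"epochs={epochs:3d} batch={batch:2d}  "
          f"MSE={m1:.4f}  MAE={m2:.4f}  MAPE={m3:.2f}%")
```

The ranking printed at the end mirrors the comparison described in the abstract: each (epochs, batch size) pair is trained from scratch and judged on held-out forecasts, so the "optimal" setting is simply the combination with the lowest test error under the chosen metric.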
format Article
author Xiang, Kelly Pang Li
Syafrina, Abdul Halim
Nur Haizum, Abd Rahman
title Hyperparameter tuning of deep neural network in time series forecasting
publisher UPM
publishDate 2024
url http://umpir.ump.edu.my/id/eprint/42505/1/2024%20Hyperparameter%20Tuning%20of%20Deep%20Neural%20Network%20in%20Time%20Series%20Forecasting.pdf
http://umpir.ump.edu.my/id/eprint/42505/
https://myjms.mohe.gov.my/index.php/dismath/article/view/27587