Evaluating JA-ABC5 hyperparameter optimisation with classifiers
Saved in:
| Main Authors: | , , |
|---|---|
| Format: | Conference or Workshop Item |
| Language: | en |
| Published: | Springer International Publishing, 2024 |
| Subjects: | |
| Online Access: | https://umpir.ump.edu.my/id/eprint/39241/13/Evaluating%20JA-ABC5%20hyperparameter%20optimisation%20with%20classifiers.pdf https://umpir.ump.edu.my/id/eprint/39241/ https://doi.org/10.1007/978-981-97-3851-9_36 |
| Summary: | Because of its simplicity, flexibility, and robustness, the Artificial Bee Colony (ABC) algorithm, a swarm intelligence-based optimisation method, has been widely applied across many fields. However, its use in hyperparameter optimisation for machine learning classifiers remains underexplored. This research studies the effectiveness of ABC and its modified variant, JA-ABC5, for hyperparameter optimisation across several classifiers, including Support Vector Machine (SVM) and K-Nearest Neighbour (KNN). The classifiers are evaluated on the Wisconsin dataset, with their hyperparameters optimised by JA-ABC5, and JA-ABC5 is compared against grid search, standard ABC, Bayesian optimisation, and random search. The findings show that JA-ABC5 performs strongly with SVM, achieving accuracy, specificity, and sensitivity of 98.59%, 99.51%, and 98.34%, respectively; its performance with KNN is comparable. This study extends our understanding of machine learning model optimisation, with the potential to improve the effectiveness of these models across a range of applications. |
|---|---|
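The ABC-style hyperparameter search the summary describes can be sketched as follows. This is a minimal, generic ABC loop (employed, onlooker, and scout bee phases) under stated assumptions: JA-ABC5's specific modifications are not reproduced here, and `validation_error` is a hypothetical smooth stand-in for a classifier's cross-validated error surface (the study itself scores SVM/KNN on the Wisconsin dataset), used only to keep the sketch self-contained.

```python
import random

# Hypothetical stand-in objective: in the study, fitness would come from
# cross-validating a classifier (e.g. SVM) on the Wisconsin dataset for a
# candidate hyperparameter vector such as (log10 C, log10 gamma).
def validation_error(params):
    c_log, gamma_log = params
    return (c_log - 1.0) ** 2 + (gamma_log + 2.0) ** 2

def abc_search(bounds, n_sources=10, limit=20, max_iters=100, seed=0):
    rng = random.Random(seed)
    dim = len(bounds)

    def random_source():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    def clip(v):
        return [min(max(x, lo), hi) for x, (lo, hi) in zip(v, bounds)]

    sources = [random_source() for _ in range(n_sources)]  # food sources
    errors = [validation_error(s) for s in sources]
    trials = [0] * n_sources  # stagnation counters for the scout phase

    def neighbour(i):
        # Perturb one dimension of source i toward/away from a random peer k.
        k = rng.choice([j for j in range(n_sources) if j != i])
        d = rng.randrange(dim)
        v = sources[i][:]
        v[d] += rng.uniform(-1.0, 1.0) * (sources[i][d] - sources[k][d])
        return clip(v)

    def try_improve(i):
        # Greedy selection: keep the neighbour only if it lowers the error.
        v = neighbour(i)
        e = validation_error(v)
        if e < errors[i]:
            sources[i], errors[i], trials[i] = v, e, 0
        else:
            trials[i] += 1

    for _ in range(max_iters):
        for i in range(n_sources):          # employed bee phase
            try_improve(i)
        fitness = [1.0 / (1.0 + e) for e in errors]
        total = sum(fitness)
        for _ in range(n_sources):          # onlooker bee phase
            r, acc, pick = rng.uniform(0.0, total), 0.0, n_sources - 1
            for j, f in enumerate(fitness):  # roulette-wheel selection
                acc += f
                if acc >= r:
                    pick = j
                    break
            try_improve(pick)
        for i in range(n_sources):          # scout bee phase
            if trials[i] > limit:           # abandon stagnant sources
                sources[i] = random_source()
                errors[i] = validation_error(sources[i])
                trials[i] = 0

    best = min(range(n_sources), key=lambda i: errors[i])
    return sources[best], errors[best]

# Search log-scaled bounds for two hyperparameters.
best_params, best_err = abc_search([(-3.0, 3.0), (-5.0, 1.0)])
```

In a real run, `validation_error` would train and score the classifier for each candidate, so the number of evaluations (sources × iterations × 2 phases) dominates the cost, just as it does for grid or random search.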
