Studying the Impact of Initialization for Population-Based Algorithms with Low-Discrepancy Sequences


Full Description

Bibliographic Details
Main Authors: Adnan Ashraf, Sobia Pervaiz, Waqas Haider Bangyal, Kashif Nisar, Ag. Asri Ag. Ibrahim, Joel J. P. C. Rodrigues, Danda B. Rawat
Format: Article
Language: English
Published: MDPI AG, Basel, Switzerland, 2021
Subjects:
Online Access: https://eprints.ums.edu.my/id/eprint/31833/1/Studying%20the%20Impact%20of%20Initialization%20for%20Population-Based%20Algorithms%20with%20Low-Discrepancy%20Sequences.pdf
https://eprints.ums.edu.my/id/eprint/31833/2/Studying%20the%20Impact%20of%20Initialization%20for%20Population-Based%20Algorithms%20with%20Low-Discrepancy%20Sequences1.pdf
https://eprints.ums.edu.my/id/eprint/31833/
https://www.mdpi.com/2076-3417/11/17/8190
https://www.mdpi.com/journal/applsci
Description
Summary: Meta-heuristic algorithms have been used extensively to solve many kinds of optimization challenges. Population initialization plays a prominent role in these algorithms and can affect convergence toward a robust optimum solution. To investigate the effect of diversity, many scholars have focused on improving the reliability and quality of meta-heuristic algorithms. To initialize the population in the search space, this paper proposes three low-discrepancy sequences, the WELL sequence, the Knuth sequence, and the Torus sequence, as alternatives to uniform random distribution. The paper also presents a detailed survey of initialization methods for PSO and DE based on quasi-random sequence families such as the Sobol sequence, the Halton sequence, and uniform random distribution. The proposed methods for PSO (TO-PSO, KN-PSO, and WE-PSO), BA (BA-TO, BA-WE, and BA-KN), and DE (DE-TO, DE-WE, and DE-KN) have been evaluated on well-known benchmark test problems and on the training of artificial neural networks. The experimental findings indicate that initialization based on low-discrepancy sequences is markedly stronger than initialization with uniform random numbers. Furthermore, the work outlines the pronounced effects of the proposed methodology on convergence and population diversity. A comparative simulation survey of low-discrepancy sequences is expected to help investigators analyze meta-heuristic algorithms in detail.
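As a rough illustration of the initialization idea the abstract describes, the sketch below draws an initial population from a Halton or Sobol low-discrepancy sequence instead of a uniform distribution, using SciPy's `scipy.stats.qmc` module. This is a minimal sketch under stated assumptions: the function name `init_population` and its parameters are illustrative, not from the paper, and the WELL, Knuth, and Torus generators proposed in the paper are not implemented here.

```python
import numpy as np
from scipy.stats import qmc


def init_population(n, dim, lo, hi, method="halton", seed=0):
    """Return an (n, dim) population spread over the box [lo, hi]^dim.

    Low-discrepancy samplers (Halton/Sobol) cover the search space more
    evenly than plain uniform random numbers, which is the motivation the
    paper gives for sequence-based initialization of PSO/BA/DE swarms.
    """
    if method == "halton":
        sample = qmc.Halton(d=dim, seed=seed).random(n)
    elif method == "sobol":
        # Sobol samples are best drawn in powers of two.
        sample = qmc.Sobol(d=dim, seed=seed).random(n)
    else:
        # Uniform random baseline, as in the paper's comparisons.
        sample = np.random.default_rng(seed).random((n, dim))
    # Rescale unit-cube points to the problem's bounds.
    return qmc.scale(sample, [lo] * dim, [hi] * dim)


pop = init_population(16, 2, -5.0, 5.0, method="halton")
```

The resulting array can then serve as the initial particle positions for PSO (or initial vectors for DE); only the sampler changes between the compared initialization schemes.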