A flexible enhanced fuzzy min-max neural network for pattern classification
Main Authors:
Format: Article
Language: English
Published: Elsevier, 2024
Online Access:
http://umpir.ump.edu.my/id/eprint/41060/1/A%20flexible%20enhanced%20fuzzy%20min-max%20neural%20network_ABST.pdf
http://umpir.ump.edu.my/id/eprint/41060/2/A%20flexible%20enhanced%20fuzzy%20min-max%20neural%20network%20for%20pattern%20classification.pdf
http://umpir.ump.edu.my/id/eprint/41060/
https://doi.org/10.1016/j.eswa.2024.124030
Summary: In this paper, the existing enhanced fuzzy min–max (EFMM) neural network is improved with a flexible learning procedure for undertaking pattern classification tasks. Four new contributions are introduced. Firstly, a new training strategy is proposed for avoiding the generation of unnecessary overlapped regions between hyperboxes of different classes. The learning phase is simplified by eliminating the contraction procedure. Secondly, a new flexible expansion procedure is introduced. It eliminates the use of a user-defined parameter (expansion coefficient) to determine the hyperbox sizes. Thirdly, a new overlap test rule is applied during the test phase to identify the containment cases and activate the contraction procedure (if necessary). Fourthly, a new contraction procedure is formulated to overcome the containment cases and avoid the data distortion problem. Both the third and fourth contributions are important for preventing the catastrophic forgetting issue and supporting the stability-plasticity principle pertaining to online learning. The performance of the proposed model is evaluated with benchmark data sets. The results demonstrate its efficiency in handling pattern classification tasks, outperforming other related models in online learning environments.
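For context on the hyperbox machinery the summary refers to, the sketch below shows the classical fuzzy min–max formulation (Simpson, 1992) that EFMM builds on: the hyperbox membership function and the expansion test governed by a user-defined expansion coefficient θ. These are the standard baseline, not the paper's new procedures — per the abstract, the proposed model eliminates θ and reformulates the overlap test and contraction steps.

```python
import numpy as np

def fmm_membership(x, v, w, gamma=1.0):
    """Classical fuzzy min-max membership of point x in a hyperbox.

    v, w are the hyperbox min and max points; gamma controls how fast
    membership decays outside the box. Returns 1.0 for points inside.
    """
    x, v, w = map(np.asarray, (x, v, w))
    n = x.size
    # Penalty for exceeding the max point and for falling below the min point.
    upper = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, x - w)))
    lower = np.maximum(0.0, 1.0 - np.maximum(0.0, gamma * np.minimum(1.0, v - x)))
    return float((upper + lower).sum() / (2 * n))

def classical_expansion_allowed(x, v, w, theta):
    """Classical theta-based expansion test: may the hyperbox (v, w) grow
    to absorb x? This is the user-defined-parameter test that the paper's
    flexible expansion procedure removes."""
    x, v, w = map(np.asarray, (x, v, w))
    return float((np.maximum(w, x) - np.minimum(v, x)).sum()) <= x.size * theta
```

A point inside the box, e.g. `fmm_membership([0.5, 0.5], [0.2, 0.2], [0.6, 0.6])`, yields full membership 1.0, and membership decreases as the point moves away from the box; the expansion test simply bounds the average edge length of the grown box by θ.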