Switched neural networks for simultaneous learning of multiple functions.
Main Authors: , , , ,
Format: Article
Published: Institute of Electrical and Electronics Engineers Inc., 2024
Subjects:
Online Access: http://eprints.utm.my/108868/
http://dx.doi.org/10.1109/TETCI.2024.3369981
Summary: This paper introduces the notion of switched neural networks for learning multiple functions under different switching configurations. The neural network structure has adjustable parameters, and for each function the state of the parameter vector is determined by a mask vector: 1/0 for active/inactive or +1/-1 for plain/inverted. The optimization problem is to schedule the switching strategy (mask vector) required for each function together with the best parameter vector (weights/biases) minimizing the loss function. This requires a procedure that simultaneously optimizes a vector containing real and binary values in order to discover commonalities among the various functions. Our studies show that a small-sized neural network structure with an appropriate switching regime is able to learn multiple functions successfully. The classification tests consider 2-variable binary functions, with all 16 possible combinations chosen as target functions. The regression tests consider four functions of two variables. Our studies showed that simple NN structures are capable of storing multiple pieces of information via appropriate switching.
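To illustrate the idea described in the summary, the following is a minimal, hypothetical sketch (not the authors' code) of a "switched" network in PyTorch: a shared weight matrix is modulated per task by a fixed 1/0 or +1/-1 mask, so that several functions share one parameter set. The class name, layer sizes, and the use of fixed random masks are assumptions for illustration; the paper instead optimizes the switching schedule jointly with the weights.

```python
# Hypothetical sketch of a switched MLP: one shared weight set, one mask per task.
import torch
import torch.nn as nn

class SwitchedMLP(nn.Module):
    def __init__(self, in_dim=2, hidden=8, out_dim=1, n_tasks=4, mode="binary"):
        super().__init__()
        # Shared, trainable weights/biases (the real-valued part of the search space).
        self.w1 = nn.Parameter(torch.randn(hidden, in_dim) * 0.5)
        self.b1 = nn.Parameter(torch.zeros(hidden))
        self.w2 = nn.Parameter(torch.randn(out_dim, hidden) * 0.5)
        self.b2 = nn.Parameter(torch.zeros(out_dim))
        # One mask per task over the first-layer weights (the binary part of the
        # search space). Fixed at random here purely for illustration.
        if mode == "binary":   # 1/0: active/inactive connections
            masks = torch.randint(0, 2, (n_tasks, hidden, in_dim)).float()
        else:                  # +1/-1: plain/inverted connections
            masks = torch.randint(0, 2, (n_tasks, hidden, in_dim)).float() * 2 - 1
        self.register_buffer("masks", masks)

    def forward(self, x, task_id):
        # Apply the selected task's switching mask to the shared weights.
        w1 = self.w1 * self.masks[task_id]
        h = torch.relu(x @ w1.t() + self.b1)
        return h @ self.w2.t() + self.b2

# Usage sketch: train the shared weights on several tasks, switching masks per task.
model = SwitchedMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
x = torch.rand(64, 2)
targets = [torch.rand(64, 1) for _ in range(4)]  # placeholder regression targets
for _ in range(100):
    for t, y in enumerate(targets):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x, t), y)
        loss.backward()
        opt.step()
```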