Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement
Format: Conference paper
Published: Springer Science and Business Media Deutschland GmbH, 2025
Summary: The task of continual learning is to design algorithms that address the problem of catastrophic forgetting. However, real-world data contain noisy labels due to inaccurate human annotation and other factors, which can exacerbate catastrophic forgetting. To tackle both catastrophic forgetting and label noise, we propose an innovative framework. Our framework leverages sample uncertainty to purify the data stream and selects representative samples for replay, effectively alleviating catastrophic forgetting. Additionally, we adopt a semi-supervised approach for fine-tuning to ensure that all available samples are used. Simultaneously, we incorporate contrastive learning and entropy minimization to mitigate noise memorization in the model. We validate the effectiveness of the proposed method through extensive experiments on two benchmark datasets, CIFAR-10 and CIFAR-100. On CIFAR-10, we achieve a performance gain of 2% under 20% label noise. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd 2024.
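The record gives only the abstract, so the details of the method are not available here. As a rough illustration of two ideas the summary names, the sketch below (a minimal PyTorch sketch, not the authors' code; the function names, the `model`, and the selection rule are all assumptions) scores samples by predictive entropy, keeps the lowest-entropy samples as likely-clean replay candidates, and exposes an entropy-minimization term of the kind used in semi-supervised fine-tuning.

```python
# Minimal sketch of uncertainty-based sample selection and entropy
# minimization; all names here are hypothetical, not the paper's API.
import torch
import torch.nn.functional as F

def predictive_entropy(logits: torch.Tensor) -> torch.Tensor:
    """Per-sample entropy of the softmax distribution; higher = more uncertain."""
    probs = F.softmax(logits, dim=1)
    log_probs = F.log_softmax(logits, dim=1)
    return -(probs * log_probs).sum(dim=1)

@torch.no_grad()
def select_replay_samples(model, x: torch.Tensor, k: int) -> torch.Tensor:
    """Return indices of the k most confident (lowest-entropy) samples,
    on the assumption that low-uncertainty samples are more likely clean
    and therefore better candidates for the replay buffer."""
    model.eval()
    entropy = predictive_entropy(model(x))
    return torch.topk(-entropy, k).indices  # k smallest entropies

def entropy_minimization_loss(logits: torch.Tensor) -> torch.Tensor:
    """Mean predictive entropy; minimizing it sharpens predictions on
    unlabeled or relabeled samples during semi-supervised fine-tuning."""
    return predictive_entropy(logits).mean()
```

In a training loop, such a term would typically be added to the supervised loss with a small weight, and the selected indices used to populate a fixed-size replay buffer; the paper's actual selection criterion and loss weighting may differ.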