Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement

The task of continual learning is to design algorithms that can address the problem of catastrophic forgetting. However, in the real world, noisy labels arise from inaccurate human annotations and other factors, which can exacerbate catastrophic forgetting. To tackle both catastrophic forgetting and noise issues, we propose an innovative framework. Our framework leverages sample uncertainty to purify the data stream and selects representative samples for replay, effectively alleviating catastrophic forgetting. Additionally, we adopt a semi-supervised approach for fine-tuning to ensure the involvement of all available samples. Simultaneously, we incorporate contrastive learning and entropy minimization to mitigate noise memorization in the model. We validate the effectiveness of our proposed method through extensive experiments on two benchmark datasets, CIFAR-10 and CIFAR-100. For CIFAR-10, we achieve a performance gain of 2% under 20% noise conditions.
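The abstract sketches two concrete mechanisms: scoring samples by predictive uncertainty to purify the incoming stream and pick replay candidates, and an entropy-minimization term that discourages the network from memorizing noisy labels. As a minimal illustration only (this is not the authors' released code; the function names, the PyTorch framing, and the keep_ratio parameter are assumptions), entropy-based uncertainty scoring and the entropy loss might look like this:

import torch
import torch.nn.functional as F

def prediction_entropy(logits: torch.Tensor) -> torch.Tensor:
    # Shannon entropy of the softmax predictive distribution, one value per sample.
    log_probs = F.log_softmax(logits, dim=1)
    return -(log_probs.exp() * log_probs).sum(dim=1)

@torch.no_grad()
def select_low_uncertainty(model, x: torch.Tensor, keep_ratio: float = 0.5):
    # Rank a batch by predictive entropy and keep the most confident fraction
    # as clean/replay candidates. keep_ratio is a hypothetical knob; the
    # paper's actual selection rule (thresholds, class balancing) may differ.
    entropy = prediction_entropy(model(x))
    k = max(1, int(keep_ratio * x.size(0)))
    idx = torch.argsort(entropy)[:k]  # smallest entropy = most confident
    return x[idx], idx

def entropy_minimization_loss(logits: torch.Tensor) -> torch.Tensor:
    # Mean predictive entropy; added to the training loss, it sharpens
    # predictions on uncertain samples and curbs noisy-label memorization.
    return prediction_entropy(logits).mean()

In a replay-based learner, select_low_uncertainty would filter each incoming batch before samples enter the memory buffer; the remaining, more uncertain samples could then feed the semi-supervised fine-tuning stage the abstract describes.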


Bibliographic Details
Main Authors: Guo G., Wei Z., Cheng J.
Other Authors: 58805753200
Format: Conference paper
Published: Springer Science and Business Media Deutschland GmbH 2025
Subjects: Catastrophic forgetting; Continual learning; Feature enhancement; Noisy data; Noisy labels; Real-world; Replay; Sample features; Samples selection; Uncertainty; Entropy
id my.uniten.dspace-37178
record_format dspace
spelling my.uniten.dspace-37178 2025-03-03T15:48:19Z Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement Guo G. Wei Z. Cheng J. 58805753200 58805777500 22833734200 Catastrophic forgetting Continual learning Feature enhancement Noisy data Noisy labels Real-world Replay Sample features Samples selection Uncertainty Entropy The task of continual learning is to design algorithms that can address the problem of catastrophic forgetting. However, in the real world, noisy labels arise from inaccurate human annotations and other factors, which can exacerbate catastrophic forgetting. To tackle both catastrophic forgetting and noise issues, we propose an innovative framework. Our framework leverages sample uncertainty to purify the data stream and selects representative samples for replay, effectively alleviating catastrophic forgetting. Additionally, we adopt a semi-supervised approach for fine-tuning to ensure the involvement of all available samples. Simultaneously, we incorporate contrastive learning and entropy minimization to mitigate noise memorization in the model. We validate the effectiveness of our proposed method through extensive experiments on two benchmark datasets, CIFAR-10 and CIFAR-100. For CIFAR-10, we achieve a performance gain of 2% under 20% noise conditions. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd 2024. Final 2025-03-03T07:48:18Z 2025-03-03T07:48:18Z 2024 Conference paper 10.1007/978-981-99-8543-2_40 2-s2.0-85181983496 https://www.scopus.com/inward/record.uri?eid=2-s2.0-85181983496&doi=10.1007%2f978-981-99-8543-2_40&partnerID=40&md5=f79b7a4392845c0d29b3d4190ac18737 https://irepository.uniten.edu.my/handle/123456789/37178 14432 LNCS 498 510 Springer Science and Business Media Deutschland GmbH Scopus
institution Universiti Tenaga Nasional
building UNITEN Library
collection Institutional Repository
continent Asia
country Malaysia
content_provider Universiti Tenaga Nasional
content_source UNITEN Institutional Repository
url_provider http://dspace.uniten.edu.my/
topic Catastrophic forgetting
Continual learning
Feature enhancement
Noisy data
Noisy labels
Real-world
Replay
Sample features
Samples selection
Uncertainty
Entropy
spellingShingle Catastrophic forgetting
Continual learning
Feature enhancement
Noisy data
Noisy labels
Real-world
Replay
Sample features
Samples selection
Uncertainty
Entropy
Guo G.
Wei Z.
Cheng J.
Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement
description The task of continual learning is to design algorithms that can address the problem of catastrophic forgetting. However, in the real world, noisy labels arise from inaccurate human annotations and other factors, which can exacerbate catastrophic forgetting. To tackle both catastrophic forgetting and noise issues, we propose an innovative framework. Our framework leverages sample uncertainty to purify the data stream and selects representative samples for replay, effectively alleviating catastrophic forgetting. Additionally, we adopt a semi-supervised approach for fine-tuning to ensure the involvement of all available samples. Simultaneously, we incorporate contrastive learning and entropy minimization to mitigate noise memorization in the model. We validate the effectiveness of our proposed method through extensive experiments on two benchmark datasets, CIFAR-10 and CIFAR-100. For CIFAR-10, we achieve a performance gain of 2% under 20% noise conditions. © The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd 2024.
author2 58805753200
author_facet 58805753200
Guo G.
Wei Z.
Cheng J.
format Conference paper
author Guo G.
Wei Z.
Cheng J.
author_sort Guo G.
title Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement
title_short Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement
title_full Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement
title_fullStr Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement
title_full_unstemmed Enhancing Continual Noisy Label Learning with Uncertainty-Based Sample Selection and Feature Enhancement
title_sort enhancing continual noisy label learning with uncertainty-based sample selection and feature enhancement
publisher Springer Science and Business Media Deutschland GmbH
publishDate 2025
_version_ 1826077772317982720