Investigating group distributionally robust optimization for deep imbalanced learning: a case study of binary tabular data classification.
Main Authors:
Format: Article
Language: English
Published: Science and Information Organization, 2023
Subjects:
Online Access: http://eprints.utm.my/105382/1/IsmailMustapha2023_InvestigatingGroupDistributionallyRobustOptimization.pdf
http://eprints.utm.my/105382/
http://dx.doi.org/10.14569/IJACSA.2023.0140286
Summary: The class imbalance problem is one of the most studied machine learning challenges, and recent studies have shown that deep neural networks are susceptible to it. Despite concerted research efforts in this direction in recent years, findings have shown that the canonical learning objective, empirical risk minimization (ERM), cannot achieve optimal imbalanced learning in deep neural networks because of its bias toward the majority class. This study investigates an alternative learning objective, group distributionally robust optimization (gDRO), for imbalanced learning, focusing on imbalanced tabular data rather than the image data that has dominated deep imbalanced learning research. Rather than minimizing the average per-instance loss as ERM does, gDRO seeks to minimize the worst group loss over the training data. Experimental comparisons with ERM and classical imbalance methods, using four widely used imbalance-learning evaluation metrics on several benchmark imbalanced binary tabular datasets of varying imbalance ratios, show impressive performance for gDRO, which outperforms the other compared methods in terms of G-mean and ROC-AUC.
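As a rough illustration of the two objectives contrasted in the summary (a minimal sketch using standard gDRO notation, not taken verbatim from the article), ERM minimizes the average per-instance loss over the training set, while gDRO minimizes the worst expected loss over predefined groups, which in a binary imbalanced task would typically be the two classes:

\hat{\theta}_{\mathrm{ERM}} = \arg\min_{\theta} \; \frac{1}{n} \sum_{i=1}^{n} \ell\big(f_{\theta}(x_i), y_i\big),
\qquad
\hat{\theta}_{\mathrm{gDRO}} = \arg\min_{\theta} \; \max_{g \in \mathcal{G}} \; \mathbb{E}_{(x,y) \sim P_g}\big[\ell\big(f_{\theta}(x), y\big)\big],

where \mathcal{G} is the set of groups and P_g is the data distribution of group g. Because the maximum over groups dominates the objective, the minority class cannot be traded away to reduce the average loss, which is the bias attributed to ERM above.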