An Asymmetric Contrastive Loss for Handling Imbalanced Datasets
Contrastive learning is a representation learning method that contrasts a sample with similar samples so that they are pulled close together, forming clusters in the feature space. The learning process is typically conducted with a two-stage training architecture and uses the contrastive loss (CL) for feature learning. Contrastive learning has proven quite successful on imbalanced datasets, in which some classes are overrepresented while others are underrepresented. However, previous studies have not specifically modified CL for imbalanced datasets. In this work, we introduce an asymmetric version of CL, referred to as ACL, to directly address the problem of class imbalance. In addition, we propose the asymmetric focal contrastive loss (AFCL) as a further generalization of both ACL and the focal contrastive loss (FCL). Results on the imbalanced FMNIST and ISIC 2018 datasets show that AFCL outperforms both CL and FCL in terms of weighted and unweighted classification accuracy.
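The abstract does not give the AFCL formula, but the idea of combining a supervised contrastive loss with focal modulation and class-dependent (asymmetric) weighting can be sketched as follows. This is an illustrative sketch only, not the paper's exact formulation: the focal term (1 - p)^gamma, the per-class weights, and the function name `afcl_sketch` are assumptions for demonstration.

```python
import numpy as np

def afcl_sketch(z, labels, class_weights, tau=0.1, gamma=2.0):
    """Sketch of an asymmetric focal contrastive loss (not the paper's exact form).

    z             : (N, D) L2-normalized embeddings
    labels        : (N,) integer class labels
    class_weights : dict class -> weight (larger for minority classes)
    tau           : softmax temperature
    gamma         : focal focusing parameter (gamma=0 recovers plain weighted CL)
    """
    N = z.shape[0]
    sim = z @ z.T / tau                       # pairwise cosine similarities / tau
    np.fill_diagonal(sim, -np.inf)            # exclude self-contrast
    # Row-wise softmax: prob[i, a] = likelihood that a is the match for anchor i
    exp_sim = np.exp(sim - sim.max(axis=1, keepdims=True))
    prob = exp_sim / exp_sim.sum(axis=1, keepdims=True)

    loss, count = 0.0, 0
    for i in range(N):
        pos = np.where((labels == labels[i]) & (np.arange(N) != i))[0]
        if len(pos) == 0:
            continue                          # anchor has no positives in the batch
        p = prob[i, pos]
        # Focal modulation (1-p)^gamma down-weights easy positives; the
        # asymmetric class weight up-weights anchors from rare classes.
        w = class_weights[labels[i]]
        loss += -w * np.mean((1.0 - p) ** gamma * np.log(p + 1e-12))
        count += 1
    return loss / max(count, 1)
```

In this sketch the asymmetry is realized as a per-class anchor weight; the paper's ACL/AFCL may instead apply different treatment to positive and negative pairs. Because (1 - p)^gamma <= 1, the focal variant always yields a loss no larger than the gamma=0 case on the same batch.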
Related Results
Advanced Re-Sampling Techniques for Multi-Class Imbalanced Classification
Imbalanced classification is a common problem in machine learning, where one class significantly outnumbers the others. This imbalance leads to biased model performance, where the ...
Realizing the Asymmetric Index of a Graph
A graph G is asymmetric if its automorphism group is trivial. Asymmetric graphs were introduced by Erdős and Rényi [1]. They suggested the problem of starting with ...
A contrastive adversarial encoder for multi-omics data integration
Early and accurate cancer detection is crucial for effective treatment, prognosis, and the advancement of precision medicine. Analyzing omics data is vital in cancer research. Whil...
Handling the Imbalanced Problem in Agri-Food Data Analysis
Imbalanced data situations exist in most fields of endeavor. The problem has been identified as a major bottleneck in machine learning/data mining and is becoming a serious issue o...
Application of Machine Learning Techniques for Customer Churn Prediction in the Banking Sector
Aim/Purpose: Previous studies have primarily focused on comparing predictive models without considering the impact of data preprocessing on model performance. Therefore, this study...
Analyzing Data Augmentation Techniques for Contrastive Learning in Recommender Models
This paper investigates the application of contrastive learning-based user and item representation learning in recommendation systems. A recommendation model combining contrastive ...
Improving Neural Retrieval with Contrastive Learning
In recent years, neural retrieval models have shown remarkable progress in improving the efficiency and accuracy of information retrieval systems. However, challenges remain in eff...
Contrastive Distillation Learning with Sparse Spatial Aggregation
Contrastive learning has advanced significantly and demonstrates excellent transfer learning capabilities. Knowledge distillation is one of the most effective meth...

