SECL: Sampling enhanced contrastive learning
Instance-level contrastive learning methods such as SimCLR have proven to be powerful approaches to representation learning. However, SimCLR suffers from sampling bias, feature bias, and model collapse. This paper proposes Sampling Enhanced Contrastive Learning (SECL), a set-level method built on SimCLR. The proposed super-sampling method expands the augmented samples into a contrastive-positive set, which allows the model to learn class-level features of the target sample and thereby reduce bias. The contrastive-positive set contains Augmentations (the original augmented samples) and Neighbors (the super-sampled samples). We also introduce a samples-correlation strategy to prevent model collapse, in which a positive or negative correlation loss is computed to balance the model's Alignment and Uniformity. SECL reaches 94.14% classification precision on the SST-2 dataset and 89.25% on the ARSC dataset; for multi-class classification, it achieves 90.99% on the AGNews dataset. All of these results are about 1% higher than those of SimCLR. Experiments also show that SECL converges faster during training and reduces the risk of bias and model collapse.
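
To make the set-level idea concrete, the sketch below illustrates, under stated assumptions rather than the authors' published implementation, how a contrastive-positive set of Augmentations plus super-sampled Neighbors could enter a SimCLR-style loss, together with the standard alignment/uniformity diagnostics that the samples-correlation strategy aims to balance. The helper names supersample_neighbors, set_contrastive_loss, and alignment_uniformity are hypothetical; the paper's neighbor selection and correlation loss may differ in detail.

# Minimal PyTorch sketch of a set-level contrastive loss in the spirit of SECL.
# Illustration only: neighbor selection, weighting, and the correlation loss
# are assumptions, not the published SECL code.
import torch
import torch.nn.functional as F

def supersample_neighbors(z, k=2):
    # Hypothetical super-sampling: take each sample's k nearest in-batch
    # neighbors (by cosine similarity, excluding itself) as extra positives.
    z = F.normalize(z, dim=1)
    sim = z @ z.T
    sim.fill_diagonal_(float("-inf"))
    return sim.topk(k, dim=1).indices                  # (N, k) neighbor indices

def set_contrastive_loss(z_anchor, z_aug, neighbor_idx, temperature=0.5):
    # Contrast each anchor against a positive *set* (its own augmentation plus
    # the augmentations of its super-sampled neighbors) instead of a single
    # positive instance, averaging the log-likelihood over the set.
    z_anchor = F.normalize(z_anchor, dim=1)
    z_aug = F.normalize(z_aug, dim=1)
    logits = z_anchor @ z_aug.T / temperature          # (N, N) similarity logits
    log_prob = F.log_softmax(logits, dim=1)
    n = z_anchor.size(0)
    pos_mask = torch.eye(n, dtype=torch.bool, device=z_anchor.device)
    rows = torch.arange(n, device=z_anchor.device).unsqueeze(1)
    pos_mask[rows, neighbor_idx] = True                # add Neighbors as positives
    return -(log_prob * pos_mask).sum(1).div(pos_mask.sum(1)).mean()

def alignment_uniformity(z1, z2, t=2.0):
    # Alignment and uniformity diagnostics (Wang & Isola, 2020); the
    # samples-correlation strategy adjusts the balance between the two.
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    align = (z1 - z2).norm(dim=1).pow(2).mean()
    uniform = torch.pdist(z1, p=2).pow(2).mul(-t).exp().mean().log()
    return align, uniform

In use, z_anchor and z_aug would be the projector outputs of two augmented views of the same batch, with neighbor_idx = supersample_neighbors(z_anchor.detach()); this is only one plausible way to realize the contrastive-positive set described in the abstract.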