An Asymmetric Contrastive Loss for Handling Imbalanced Datasets

Valentino Vito, Lim Yohanes Stefanus

Research output: Contribution to journal › Article › peer-review

Abstract

Contrastive learning is a representation learning method in which a sample is contrasted with similar samples so that they are drawn close together, forming clusters in the feature space. The learning process is typically conducted using a two-stage training architecture and utilizes the contrastive loss (CL) for feature learning. Contrastive learning has proven quite successful on imbalanced datasets, in which some classes are overrepresented while others are underrepresented. However, previous studies have not specifically modified CL for imbalanced datasets. In this work, we introduce an asymmetric version of CL, referred to as ACL, to directly address the problem of class imbalance. In addition, we propose the asymmetric focal contrastive loss (AFCL) as a further generalization of both ACL and the focal contrastive loss (FCL). Results on the imbalanced FMNIST and ISIC 2018 datasets show that AFCL outperforms both CL and FCL in terms of weighted and unweighted classification accuracy.
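
As a concrete illustration, below is a minimal PyTorch sketch of a focal, class-weighted supervised contrastive loss in the spirit of AFCL. The focal factor (1 - p)^gamma and the per-class alpha weighting used here to model the asymmetry are illustrative assumptions; the abstract does not reproduce the paper's exact ACL/AFCL formulas, which may differ.

    # Sketch of a focal, asymmetrically weighted supervised contrastive loss.
    # Assumptions (not from the paper): focal modulation (1 - p)^gamma on each
    # positive pair, and an optional per-class weight `alpha` that can up-weight
    # minority-class anchors.
    import torch
    import torch.nn.functional as F

    def afcl_sketch(features, labels, gamma=2.0, alpha=None, temperature=0.1):
        """features: (N, D) L2-normalized embeddings; labels: (N,) class ids.
        alpha: optional (num_classes,) tensor of per-class anchor weights."""
        n = features.size(0)
        sim = features @ features.t() / temperature           # pairwise similarities
        self_mask = torch.eye(n, dtype=torch.bool, device=features.device)
        sim = sim.masked_fill(self_mask, float('-inf'))       # exclude self-contrast
        log_p = F.log_softmax(sim, dim=1)                     # log-probability of each pair
        log_p = log_p.masked_fill(self_mask, 0.0)             # keep the diagonal finite
        pos = ((labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask).float()
        # Focal factor down-weights easy (high-probability) positive pairs.
        focal = (1.0 - log_p.exp()).clamp_min(1e-6).pow(gamma)
        per_anchor = -(focal * log_p * pos).sum(1) / pos.sum(1).clamp_min(1.0)
        if alpha is not None:                                 # asymmetric class weighting
            per_anchor = per_anchor * alpha[labels]
        return per_anchor.mean()

A typical call would normalize the encoder output first, e.g. loss = afcl_sketch(F.normalize(encoder(x), dim=1), y, gamma=2.0); gamma = 0 and alpha = None reduce the sketch to a plain supervised contrastive loss.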

Original language: English
Article number: 1303
Journal: Entropy
Volume: 24
Issue number: 9
DOIs
Publication status: Published - Sept 2022

Keywords

  • asymmetric loss
  • class imbalance
  • contrastive loss
  • entropy
  • focal loss
