LibAUC

An end-to-end machine learning library for AUC optimization (AUROC, AUPRC).

Why LibAUC?

Deep AUC Maximization (DAM) is a paradigm for learning a deep neural network by maximizing the AUC score of the model on a dataset. Maximizing the AUC score has several benefits over minimizing standard losses such as cross-entropy:

  • In many domains, AUC is the default metric for evaluating and comparing methods. Directly maximizing the AUC score can therefore lead to the largest improvement in the model's measured performance.
  • Many real-world datasets are imbalanced. AUC is better suited to imbalanced data distributions, since maximizing AUC aims to rank the prediction score of any positive example higher than that of any negative example.
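The ranking view mentioned above can be made concrete: AUROC equals the probability that a randomly chosen positive example is scored higher than a randomly chosen negative one. A minimal illustration in plain Python (independent of LibAUC, for intuition only):

```python
# Pairwise-ranking view of AUROC: the fraction of (positive, negative)
# pairs in which the positive example receives the higher score.
# Ties count as half a correctly ranked pair.
def pairwise_auroc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    correct = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return correct / (len(pos) * len(neg))

# One mis-ranked pair (positive 0.3 vs negative 0.8) out of four
print(pairwise_auroc([0.9, 0.8, 0.3, 0.2], [1, 0, 1, 0]))  # -> 0.75
```

Because the metric depends only on the relative order of scores, it is unaffected by class imbalance in the way plain accuracy is.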

Installation

$ pip install libauc

Usage

Official Tutorials:

  • Creating Imbalanced Benchmark Datasets [Notebook][Script]
  • Optimizing AUROC loss with ResNet20 on Imbalanced CIFAR10 [Notebook][Script]
  • Optimizing AUPRC loss with ResNet18 on Imbalanced CIFAR10 [Notebook][Script]
  • Training with Pytorch Learning Rate Scheduling [Notebook][Script]
  • Optimizing AUROC loss with DenseNet121 on CheXpert [Notebook][Script]
  • Training with Imbalanced Datasets on Distributed Setting [Coming soon]

Quickstart for Beginners:

Optimizing AUROC (Area Under the Receiver Operating Characteristic)

>>> #import library
>>> from libauc.losses import AUCMLoss
>>> from libauc.optimizers import PESG
...
>>> #define loss and optimizer
>>> Loss = AUCMLoss()
>>> optimizer = PESG()
...
>>> #training
>>> model.train()
>>> for data, targets in trainloader:
...     data, targets = data.cuda(), targets.cuda()
...     preds = model(data)
...     loss = Loss(preds, targets)
...     optimizer.zero_grad()
...     loss.backward()
...     optimizer.step()
...
>>> #restart stage
>>> optimizer.update_regularizer()


Optimizing AUPRC (Area Under the Precision-Recall Curve)

>>> #import library
>>> from libauc.losses import APLoss_SH
>>> from libauc.optimizers import SOAP_SGD, SOAP_ADAM
...
>>> #define loss and optimizer
>>> Loss = APLoss_SH()
>>> optimizer = SOAP_ADAM()
...
>>> #training
>>> model.train()
>>> for index, data, targets in trainloader:
...     data, targets = data.cuda(), targets.cuda()
...     preds = model(data)
...     loss = Loss(preds, targets, index)
...     optimizer.zero_grad()
...     loss.backward()
...     optimizer.step()

Please visit our website or GitHub repository for more examples.

Citation

If you find LibAUC useful in your work, please cite the following paper:

@inproceedings{yuan2021robust,
  title={Robust Deep AUC Maximization: A New Surrogate Loss and Empirical Studies on Medical Image Classification},
  author={Yuan, Zhuoning and Yan, Yan and Sonka, Milan and Yang, Tianbao},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  year={2021}
}

GitHub

https://github.com/yzhuoning/LibAUC