This is the source code for the paper ReAct: Out-of-distribution Detection With Rectified Activations by Yiyou Sun, Chuan Guo, and Yixuan Li (NeurIPS 2021).
In this work, we propose ReAct, a simple technique for reducing model overconfidence on OOD data. Our method is motivated by a novel analysis of the internal activations of neural networks, which exhibit highly distinctive signature patterns for most OOD distributions.
Please download ImageNet-1k and place the training data in ./datasets/id_data/ILSVRC-2012/train and the validation data in ./datasets/id_data/ILSVRC-2012/val, respectively.
We have curated 4 OOD datasets from iNaturalist, SUN, Places, and Textures, and removed concepts that overlap with ImageNet-1k.
For iNaturalist, SUN, and Places, we sampled 10,000 images from the selected concepts of each dataset, which can be downloaded via the following links:
wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/iNaturalist.tar.gz
wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/SUN.tar.gz
wget http://pages.cs.wisc.edu/~huangrui/imagenet_ood_dataset/Places.tar.gz
For Textures, we use the entire dataset, which can be downloaded from their original website.
Please put all downloaded OOD datasets into ./datasets/ood_data/.
The models we used in the paper are the pre-trained ResNet-50 and MobileNet-v2 provided by PyTorch. The weights are downloaded automatically on first run.
To reproduce our results on ResNet-50, please run:
python eval.py --threshold 1.0
To reproduce baseline approaches (Energy Score), please run:
python eval.py --threshold 1e6  # setting the threshold close to infinity disables clipping, recovering the original energy score
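The role of the threshold in the two commands above can be sketched as follows: ReAct clamps (rectifies) the penultimate-layer activations at the threshold before the final linear layer, then scores with the energy score; a very large threshold leaves the activations untouched and reduces to the plain energy-score baseline. This is a minimal NumPy sketch with toy, randomly generated features and weights (function and variable names are hypothetical, not from eval.py):

```python
import numpy as np

def react_energy_score(features, W, b, threshold):
    # ReAct truncation: clamp penultimate activations at `threshold`.
    clipped = np.minimum(features, threshold)
    # Final linear classification layer.
    logits = clipped @ W + b
    # Energy score = logsumexp over class logits (higher => more in-distribution).
    m = logits.max()
    return m + np.log(np.exp(logits - m).sum())

rng = np.random.default_rng(0)
feats = rng.random(2048) * 3.0                      # toy penultimate activations
W = rng.standard_normal((2048, 10))                 # toy classifier weights
b = np.zeros(10)

s_react  = react_energy_score(feats, W, b, threshold=1.0)   # ReAct score
s_energy = react_energy_score(feats, W, b, threshold=1e6)   # no clipping: plain energy score
```

With threshold 1e6 no activation is clipped, so the score is exactly the vanilla energy score, mirroring the baseline command above.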
ReAct achieves state-of-the-art performance averaged over the 4 OOD datasets.
If you use our codebase, please cite our work:
@inproceedings{sun2021react,
title={ReAct: Out-of-distribution Detection With Rectified Activations},
author={Sun, Yiyou and Guo, Chuan and Li, Yixuan},
booktitle={Advances in Neural Information Processing Systems},
year={2021}
}