ENCORE: A Neural Collapse Perspective on Out-of-Distribution Detection in Deep Neural Networks
Abstract
Out-of-Distribution (OOD) detection is of paramount importance in guaranteeing the safe and reliable deployment of Deep Neural Network (DNN) models in real-world settings. However, most OOD detection approaches lack motivation rooted in established properties of DNNs. This disconnect between the proposed approaches and a theoretical underpinning tied to measurable DNN properties makes these approaches unreliable. To bridge this gap, we take a different perspective on using energy scoring for OOD detection. Specifically, we examine the energy score through the lens of the properties of neural collapse and observe that simple feature scaling can improve the separation between In-Distribution (ID) and OOD inputs. Based on this observation, we propose ENCORE, which adaptively scales the features of each input and uses them to obtain modified logits, guided by insights from the theory of neural collapse. We show that ENCORE outperforms state-of-the-art approaches across a variety of benchmarks; for example, by 1.37% on the CIFAR-10 benchmark and by 1.07% on the ImageNet benchmark.
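To make the energy-scoring pipeline the abstract refers to concrete, the sketch below computes a standard energy-based OOD score from logits, with an illustrative per-input feature scaling applied before the linear classifier. The norm-based scaling rule and all variable names here are assumptions for illustration only; they are not the actual ENCORE method, whose adaptive scaling is derived from neural-collapse properties.

```python
import numpy as np

def energy_score(logits, T=1.0):
    """Negative free energy, T * logsumexp(logits / T).

    Computed in a numerically stable way; higher values are
    conventionally interpreted as more in-distribution.
    """
    z = logits / T
    m = z.max(axis=-1, keepdims=True)
    return T * (m.squeeze(-1) + np.log(np.exp(z - m).sum(axis=-1)))

def scaled_logits(features, weights, bias, scale):
    """Apply a per-input scale to penultimate-layer features,
    then pass them through the linear classification head."""
    return (scale * features) @ weights.T + bias

# Toy setup: a batch of 4 feature vectors, a 10-class linear head.
rng = np.random.default_rng(0)
features = rng.normal(size=(4, 8))
weights = rng.normal(size=(10, 8))
bias = np.zeros(10)

# Illustrative adaptive scale: normalize each input's feature norm
# to a reference value (a stand-in rule, not the paper's).
ref_norm = 1.0
scale = ref_norm / (np.linalg.norm(features, axis=-1, keepdims=True) + 1e-8)

scores = energy_score(scaled_logits(features, weights, bias, scale))
print(scores)  # one OOD score per input; threshold to flag OOD
```

In practice, one would pick a threshold on these scores using held-out ID data (e.g., at 95% true-positive rate) and flag inputs scoring below it as OOD.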