Learning Deep Classifiers Consistent with Fine-Grained Novelty Detection


University of California, San Diego
Published in IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2021

Abstract


The problem of novelty detection in fine-grained visual classification (FGVC) is considered. An integrated understanding of the probabilistic and distance-based approaches to novelty detection is developed within the framework of convolutional neural networks (CNNs). It is shown that softmax CNN classifiers are inconsistent with novelty detection, because their learned class-conditional distributions and associated distance metrics are unidentifiable. A new regularization constraint, the class-conditional Gaussianity loss, is then proposed to eliminate this unidentifiability, and enforce Gaussian class-conditional distributions. This enables training Novelty Detection Consistent Classifiers (NDCCs) that are jointly optimal for classification and novelty detection. Empirical evaluations show that NDCCs achieve significant improvements over the state-of-the-art on both small- and large-scale FGVC datasets.
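The page does not reproduce the paper's exact loss, but the idea of a class-conditional Gaussianity regularizer can be illustrated with a minimal sketch. The function below (`ccg_loss`, a name chosen here for illustration) computes the negative log-likelihood of deep features under per-class Gaussians with a shared isotropic covariance `sigma**2 * I`, constants dropped; this is one plausible instantiation of a Gaussianity constraint, not necessarily the formulation used in the paper.

```python
import numpy as np

def ccg_loss(features, labels, class_means, sigma):
    """Negative log-likelihood (up to constants) of features under
    per-class Gaussians with shared isotropic covariance sigma^2 * I.
    NOTE: an illustrative sketch, not the paper's exact CCG loss."""
    diffs = features - class_means[labels]      # (N, D) deviation from own class mean
    sq_dist = (diffs ** 2).sum(axis=1)          # (N,) squared Euclidean distances
    d = features.shape[1]                       # feature dimension
    return (sq_dist / (2.0 * sigma ** 2) + d * np.log(sigma)).mean()
```

In training, a term like this would be added to the cross-entropy objective with a weighting hyperparameter, pulling features of each class toward a Gaussian cluster around its mean.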

Methodology


Left: A CNN trained for classification with the cross-entropy loss LCE is inconsistent with novelty detection (ND). Because the class-conditional distributions learned by the CNN are unidentifiable, multiple sets of distributions (visualized using contour plots) are compatible with the CNN parameters. Right: Regularization with the proposed CCG loss LCCG makes the distributions identifiable, in fact Gaussian, without sacrificing classification performance.
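Once the class-conditional distributions are identifiable Gaussians, a distance-based novelty score follows naturally: a test feature far (in Mahalanobis distance) from every known-class mean is flagged as novel. The sketch below assumes a shared covariance across classes; the function name and interface are illustrative, not taken from the paper's code release.

```python
import numpy as np

def novelty_score(feature, class_means, cov):
    """Minimum squared Mahalanobis distance from a feature to any
    known-class mean under a shared covariance; large values suggest
    novelty. Illustrative sketch of distance-based ND, not the
    authors' released implementation."""
    inv_cov = np.linalg.inv(cov)
    diffs = class_means - feature               # (K, D), one row per known class
    # quadratic form diffs[k] @ inv_cov @ diffs[k] for every class k
    return np.einsum('kd,de,ke->k', diffs, inv_cov, diffs).min()
```

Thresholding this score then separates known-class samples (small minimum distance) from novel ones (large minimum distance).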

Results



Multi-class novelty detection performance (AUROC) of different methods. The best results are highlighted in bold, and the second best underlined.

Poster



Paper


PDF

Supplement

Code

Bibtex

Acknowledgements

This work was partially funded by NSF awards IIS-1924937 and IIS-2041009, a gift from Amazon, a gift from Qualcomm, and NVIDIA GPU donations. We also gratefully acknowledge the Nautilus platform, which was used for some of the experiments discussed above.