Strengthening the Normalizing Flows by Contrastive Learning for Better Anomaly Detection
Abstract

Shows the contrastive term is effective: the features of normal and abnormal samples are pushed apart.

Shows the gain is not trivial: the construction of the abnormal (negative) samples is important.
Introduction

- The 2D normalizing flow (2D-Flow) maps positive and negative samples to latent codes Z and Neg_Z, which are then fed into contrastive learning.
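As a sketch of the step above, the toy layer below is a RealNVP-style affine coupling (an assumption for illustration; the paper's actual 2D-Flow architecture may differ) that maps positive features and synthetic negative features to latent codes Z and Neg_Z:

```python
import numpy as np

rng = np.random.default_rng(0)

class AffineCoupling:
    """One RealNVP-style affine coupling layer.

    The first half of the channels passes through unchanged and
    conditions the scale/translation of the second half, so the
    layer is invertible (a normalizing-flow building block).
    """
    def __init__(self, dim):
        half = dim // 2
        # toy "networks": fixed random linear maps for scale and translation
        self.Ws = 0.1 * rng.standard_normal((half, half))
        self.Wt = 0.1 * rng.standard_normal((half, half))

    def forward(self, x):
        x1, x2 = np.split(x, 2, axis=-1)
        s = np.tanh(x1 @ self.Ws)            # log-scale, kept bounded
        t = x1 @ self.Wt                     # translation
        z2 = x2 * np.exp(s) + t
        log_det = s.sum(axis=-1)             # log |det J| of the layer
        return np.concatenate([x1, z2], axis=-1), log_det

flow = AffineCoupling(dim=4)

pos = rng.standard_normal((8, 4))                 # features of normal samples
neg = pos + rng.normal(2.0, 0.5, pos.shape)       # synthetic anomalous features (assumption)

Z, _ = flow.forward(pos)        # latent codes of positive samples
Neg_Z, _ = flow.forward(neg)    # latent codes of negative samples
print(Z.shape, Neg_Z.shape)
```

Both latent codes come from the same shared flow; only the inputs differ, which is what lets a contrastive loss on Z vs. Neg_Z shape the latent space.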
- Label-smoothed classification: the latent codes are classified with a smoothed cross-entropy loss, where CELoss represents CrossEntropyLoss and τ is the label smoothing coefficient.
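The notes do not spell the loss out; a common form of label-smoothed cross-entropy with coefficient τ (the function name and the exact smoothing scheme here are assumptions, not taken from the paper) is:

```python
import numpy as np

def smoothed_ce(logits, labels, tau=0.1, num_classes=2):
    """Cross-entropy against label-smoothed targets.

    The hard target y becomes (1 - tau) on the true class and
    tau / (num_classes - 1) spread over the remaining classes.
    """
    # numerically stable log-softmax
    z = logits - logits.max(axis=1, keepdims=True)
    log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))

    targets = np.full((len(labels), num_classes), tau / (num_classes - 1))
    targets[np.arange(len(labels)), labels] = 1.0 - tau
    return -(targets * log_p).sum(axis=1).mean()

logits = np.array([[4.0, -4.0], [-4.0, 4.0]])
labels = np.array([0, 1])
plain = smoothed_ce(logits, labels, tau=0.0)    # reduces to plain cross-entropy
smooth = smoothed_ce(logits, labels, tau=0.1)   # smoothing raises the loss floor
print(plain, smooth)
```

With τ = 0 the loss reduces to ordinary cross-entropy; a positive τ keeps the classifier from becoming overconfident on the normal/abnormal decision.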
- Contrastive learning: only the global features of positive samples are pulled together, while the features inside the anomaly boxes of positive and negative samples are pushed apart.
- Z ⇒ S: the latent code Z is mapped to the feature S used in the contrastive loss.
- Neg_Box_s represents the features of anomalous regions in negative samples.
- Box_s represents the features of the corresponding regions in positive samples.
- F denotes the cosine similarity function.
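Putting the definitions above together, here is a minimal sketch of the contrastive term: cosine similarity F pulls global positive features together and pushes Box_s away from Neg_Box_s. The function names, the pairing of two positive views, and the clipping of the push term are assumptions for illustration, not the paper's exact loss.

```python
import numpy as np

rng = np.random.default_rng(0)

def cos_sim(a, b):
    """F: row-wise cosine similarity between a and b."""
    a = a / np.linalg.norm(a, axis=-1, keepdims=True)
    b = b / np.linalg.norm(b, axis=-1, keepdims=True)
    return (a * b).sum(axis=-1)

def contrastive_loss(glob_a, glob_b, box_s, neg_box_s):
    """Pull global features of positive views together;
    push box-region features of positives vs. negatives apart."""
    pull = (1.0 - cos_sim(glob_a, glob_b)).mean()
    # penalize only residual similarity between Box_s and Neg_Box_s
    push = np.clip(cos_sim(box_s, neg_box_s), 0.0, None).mean()
    return pull + push

glob_a = rng.standard_normal((4, 8))
glob_b = glob_a + 0.05 * rng.standard_normal((4, 8))   # nearby positive view
box = rng.standard_normal((4, 8))
neg_box = -box                                         # well-separated anomaly features

good = contrastive_loss(glob_a, glob_b, box, neg_box)  # low: positives close, boxes repelled
bad = contrastive_loss(glob_a, -glob_a, box, box)      # high: positives far, boxes identical
print(good, bad)
```

The loss is small when positive globals align and the box features are already dissimilar, and grows when either condition is violated.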

Retaining the lightweight characteristics of the 2D-Flow. (Open question: why is the 2D flow lighter?)

Related work
Normalizing Flow
- BGAD-FAS [37] initially used self-generated negative samples to optimize the classification boundary during training, but its two-stage process and manual boundary delineation are inconvenient.
References
- CL-Flow: Strengthening the Normalizing Flows by Contrastive Learning for Better Anomaly Detection, 2023
- [37] BGAD-FAS: Explicit Boundary Guided Semi-Push-Pull Contrastive Learning for Supervised Anomaly Detection (CVPR 2023)