Strengthening the Normalizing Flows by Contrastive Learning for Better Anomaly Detection

Abstract


This shows the method works: the negative samples are pushed away.


This shows it is not trivial: how the abnormal samples are constructed matters.

Introduction


  1. The 2D normalizing flow maps positive and negative samples to Z and Neg_Z respectively, which then feed into contrastive learning.

  2. Label-smoothed classification loss:


    1. where CELoss denotes the cross-entropy loss and τ is the label-smoothing coefficient.
  3. Contrastive learning: only two terms are considered, pulling the global features of positive samples together, and pushing apart the features inside the anomalous boxes of positive and negative samples.

    1. Z ⇒ S (the flow outputs Z are mapped to the features S used for the contrastive terms)

    2. Neg_Box_s represents the features of anomalous regions in negative samples

    3. Box_s represents the features of corresponding regions in positive samples.

    4. F denotes the cosine similarity function

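A minimal sketch of the label-smoothed classification loss from point 2: the one-hot target is mixed with a uniform distribution weighted by τ before the cross-entropy is computed. The exact mixing form and the value of τ are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def smoothed_cross_entropy(logits, target, tau=0.1):
    """Cross-entropy with label smoothing (illustrative sketch).
    tau is the label-smoothing coefficient: the one-hot target is
    replaced by (1 - tau) * one_hot + tau * uniform."""
    num_classes = logits.shape[-1]
    # numerically stable log-softmax
    z = logits - logits.max(axis=-1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=-1, keepdims=True))
    # smoothed target distribution
    smooth = np.full(num_classes, tau / num_classes)
    smooth[target] += 1.0 - tau
    return float(-(smooth * log_probs).sum())
```

With tau=0 this reduces to the plain cross-entropy; a positive tau softly penalizes over-confident predictions on the hard label.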
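The contrastive objective in point 3 can be sketched as two cosine-similarity terms: an attraction term between global features of positive samples, and a repulsion term between Box_s (the box region in a positive sample) and Neg_Box_s (the anomalous region in a negative sample), with F as cosine similarity. The additive weighting below is an assumption; the paper's exact formulation may differ.

```python
import numpy as np

def cosine(a, b):
    """F: cosine similarity between two feature vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def contrastive_loss(s_global_i, s_global_j, box_s, neg_box_s):
    """Sketch of the box-level contrastive objective:
    - pull the global features of two positive samples together,
    - push the positive box features away from the negative
      (anomalous-region) box features.
    Equal weighting of the two terms is an assumption."""
    attract = 1.0 - cosine(s_global_i, s_global_j)  # positives close
    repel = max(0.0, cosine(box_s, neg_box_s))      # box features apart
    return attract + repel
```

The loss is near zero when the positive globals align and the box features are orthogonal or opposed, and grows as either condition is violated.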

retaining the lightweight characteristics of the 2D-Flow. Why is the 2D flow more lightweight?


Related work

Normalizing Flow

  1. BGAD-FAS [37]
    1. It initially used self-generated negative samples to optimize classification boundaries during training, but its two-stage process and manual boundary delineation were inconvenient.

References

  1. CL-Flow: Strengthening the Normalizing Flows by Contrastive Learning for Better Anomaly Detection, 2023
  2. BGAD-FAS [37]: Explicit Boundary Guided Semi-Push-Pull Contrastive Learning for Supervised Anomaly Detection (CVPR2023)