How

  1. MFF: multi-scale feature fusion;
  2. OCE: one-class embedding, taken from the 4th residual block of ResNet;
    1. Not a strong strategy on its own, but RD++ (CVPR 2023) still retains it.
    2. It projects the teacher model’s high-dimensional representations into a low-dimensional space ⇒ prevents the student from learning too much from the teacher beyond the ability to handle normal images.
  3. M: the 2D anomaly map, i.e., cosine similarity between teacher and student features at each spatial location, Eq. (1);
  4. L: the loss, the per-location average of M, Eq. (2);
  5. KD: knowledge distillation.
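A minimal numpy sketch of items 3 and 4 above, assuming features are stored channel-first as `(C, H, W)` arrays (function names and the `eps` stabilizer are my own, not from the paper): the anomaly map M is the per-location cosine distance between teacher and student features, and the loss L averages M over all spatial locations.

```python
import numpy as np

def anomaly_map(f_teacher, f_student, eps=1e-8):
    """Eq. (1) sketch: M[h, w] = 1 - cos(f_T[:, h, w], f_S[:, h, w])."""
    # Per-location dot product over the channel axis.
    dot = np.sum(f_teacher * f_student, axis=0)
    # Product of the per-location channel norms.
    norm = np.linalg.norm(f_teacher, axis=0) * np.linalg.norm(f_student, axis=0)
    return 1.0 - dot / (norm + eps)  # shape (H, W)

def kd_loss(f_teacher, f_student):
    """Eq. (2) sketch: average of the anomaly map over all H*W locations."""
    return anomaly_map(f_teacher, f_student).mean()
```

On normal inputs the student mimics the teacher, so M is near 0 everywhere; on anomalies the features disagree and M rises toward its maximum of 2 (cosine of −1).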

Why

The activation discrepancy on anomalies sometimes vanishes, leading to anomaly detection failure. We argue that this issue is attributable to

  1. the similar architectures of the teacher and student nets and
  2. the same data flow during T-S knowledge transfer.

Experiments

Limitations

References

  1. Anomaly Detection via Reverse Distillation from One-Class Embedding, CVPR 2022, ranked 33
    1. Ranked #2 on Anomaly Detection on AeBAD-V