How

- MFF: multi-scale feature fusion;
- one-class embedding (OCE): the 4th residual block of ResNet;
- not a strong strategy, but RD++ (CVPR 2023) still keeps it.
- project the teacher model’s high-dimensional representations into a low-dimensional space ⇒ avoids letting the student learn too much from the teacher beyond the ability to handle normal images.
- M: 2D anomaly map, i.e., one minus the cosine similarity between teacher and student features at each spatial location, eq. (1);
- L: the loss, the average of M over all locations, eq. (2);
- Knowledge distillation (KD)
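A minimal numpy sketch of eq. (1)/(2) as described above: the per-location cosine-discrepancy map M and its spatial average L. Feature shapes, names, and the epsilon are illustrative assumptions, not taken from the paper's code.

```python
import numpy as np

def anomaly_map(f_t: np.ndarray, f_s: np.ndarray) -> np.ndarray:
    """Eq. (1): M(h, w) = 1 - cosine similarity between teacher and
    student feature vectors at each location. f_t, f_s: (C, H, W)."""
    # L2-normalize the channel vector at every spatial location.
    t = f_t / (np.linalg.norm(f_t, axis=0, keepdims=True) + 1e-8)
    s = f_s / (np.linalg.norm(f_s, axis=0, keepdims=True) + 1e-8)
    return 1.0 - np.sum(t * s, axis=0)  # (H, W)

def distillation_loss(f_t: np.ndarray, f_s: np.ndarray) -> float:
    """Eq. (2): L = average of M over all H*W locations."""
    return float(anomaly_map(f_t, f_s).mean())

# Identical features (a well-distilled normal image) give ~zero loss;
# the discrepancy map lights up where features disagree.
f = np.random.rand(64, 8, 8)
print(round(distillation_loss(f, f), 6))  # -> 0.0
```

At test time the same map M, upsampled to image resolution, serves as the pixel-level anomaly score.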
Why
The activation discrepancy on anomalies sometimes vanishes, leading to anomaly detection failure. We argue that this issue is attributable to
- the similar architectures of the teacher and student nets and
- the same data flow during T-S knowledge transfer.
Experiments

Limitations

References
- Anomaly Detection via Reverse Distillation from One-Class Embedding, CVPR 2022, ranked 33
- Ranked #2 on Anomaly Detection on AeBAD-V