1. 103 Normalizing Flow

  2. They can be seen as distribution-based methods: the extracted feature y is fitted with a Gaussian distribution by mapping it through a normalizing flow, $z=f_{NF}(y)$.

    1. $y=f_{ex}(x)$
    2. x is the input image; $f_{ex}$ is a pre-trained feature extraction network.
  3. DifferNet, WACV21: image-level 1D normalizing flow; localization is inaccurate.

  4. CFLOW-AD, WACV22: patch-level 1D normalizing flow; better localization.

    1. introduces positional encoding into the conditional normalizing flow framework.
  5. CL-Flow, 23: uses multi-scale feature maps and improves on DifferNet.

  6. FastFlow, 21: image-level 2D NF; end-to-end, faster, and more economical.

    1. introduces alternate stacking of large and small convolution kernels in the NF module to model both global and local distributions.
    2. feature extractor ranking: ViT (CaiT) > Wide ResNet-50 > ViT (DeiT) > ResNet-18
    3. TODO: still need to read the code to understand how it is used.
    4. FastFlow+AltUB
  7. MSFlow, 23

  8. PyramidFlow, CVPR23

  9. AltUB: Alternating Training Method to Update Base Distribution of Normalizing Flow for Anomaly Detection, 22
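The common pipeline behind the methods above (extract features with a frozen network, map them to a Gaussian base distribution with an invertible flow, score by likelihood) can be illustrated with a toy sketch. Everything below is illustrative, not from any of the papers: `f_ex` is a stand-in random projection for a pre-trained extractor, `AffineCoupling` is a RealNVP-style coupling layer with untrained toy weights, and `nf_score` is an assumed name for the negative-log-likelihood anomaly score.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4                                   # toy feature dimension
W_ex = rng.standard_normal((8, D))      # frozen "pre-trained" extractor weights

def f_ex(x):
    # stand-in for the pre-trained feature extractor: y = f_ex(x)
    return x @ W_ex

class AffineCoupling:
    """One RealNVP-style coupling layer: splits y into (y1, y2) and
    transforms y2 with a scale/shift predicted from y1."""
    def __init__(self, dim):
        self.d = dim // 2
        # toy subnet weights; a real flow learns these by maximizing
        # the log-likelihood of normal training features
        self.Ws = 0.1 * rng.standard_normal((self.d, dim - self.d))
        self.Wt = 0.1 * rng.standard_normal((self.d, dim - self.d))

    def forward(self, y):
        y1, y2 = y[:, :self.d], y[:, self.d:]
        s = np.tanh(y1 @ self.Ws)           # log-scales, kept bounded
        t = y1 @ self.Wt                    # shifts
        z2 = y2 * np.exp(s) + t
        log_det = s.sum(axis=1)             # log|det J| of this coupling
        return np.concatenate([y1, z2], axis=1), log_det

def nf_score(x, layers):
    """Anomaly score = negative log-likelihood of z = f_NF(f_ex(x))
    under the standard-normal base distribution N(0, I)."""
    z = f_ex(x)
    log_det_total = np.zeros(z.shape[0])
    for layer in layers:
        z, log_det = layer.forward(z)
        log_det_total += log_det
    log_pz = -0.5 * (z ** 2).sum(axis=1) - 0.5 * z.shape[1] * np.log(2 * np.pi)
    return -(log_pz + log_det_total)        # higher = more anomalous

layers = [AffineCoupling(D), AffineCoupling(D)]
x = rng.standard_normal((3, 8))             # three toy "images"
scores = nf_score(x, layers)
print(scores.shape)  # (3,)
```

AltUB's point fits this sketch naturally: instead of fixing the base distribution to N(0, I) as in `log_pz` above, it makes the base mean and variance trainable and alternates their updates with the flow's updates.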

1st Problem of NF

Is it enough to just learn from normal samples?

  1. Do Deep Generative Models Know What They Don’t Know? ICLR19
  2. Why Normalizing Flows Fail to Detect Out-of-Distribution Data, NeurIPS20


  1. Improve by Contrastive learning
    1. Explicit Boundary Guided Semi-Push-Pull Contrastive Learning for Supervised Anomaly Detection (CVPR2023)

    2. CL-Flow, 23


2nd Problem of NF

Multi-scale features for better localization.

  1. MSFlow: Multi-Scale Flow-based Framework for Unsupervised Anomaly Detection, 23
    1. Ranked #4 on MVTec AD (detection AUROC metric)
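A minimal sketch of the multi-scale idea, assuming the simplest possible fusion: compute a per-scale anomaly (negative log-likelihood) map with one flow per feature level, upsample each map to a common resolution, and sum them. This is an illustrative simplification, not MSFlow's actual fusion, which operates inside the flow itself; `upsample_nearest` and `fuse_scores` are assumed names.

```python
import numpy as np

def upsample_nearest(m, size):
    """Nearest-neighbor upsample a 2D score map to (size, size)."""
    h, w = m.shape
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    return m[np.ix_(rows, cols)]

def fuse_scores(maps, out_size):
    """Fuse per-scale anomaly maps by upsampling each to a common
    resolution and summing; finer scales localize small defects,
    coarser scales capture larger structures."""
    return sum(upsample_nearest(m, out_size) for m in maps)

rng = np.random.default_rng(0)
maps = [rng.random((4, 4)), rng.random((8, 8))]   # toy per-scale score maps
fused = fuse_scores(maps, out_size=16)
print(fused.shape)  # (16, 16)
```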