distribution / distribution-map based methods


— from Deep Industrial Image Anomaly Detection: A Survey. MIR24

  1. They construct a distribution map of normal samples in feature space, and measure how far a sample’s features deviate from that normal distribution.

  2. Gaussian & Mahalanobis distance

    1. use a parametric distribution (e.g. a multivariate Gaussian) to model the distribution of normal samples in the feature space
    2. the AD task can then be reformulated as out-of-distribution (OOD) detection
  3. Modeling the Distribution of Normal Data in Pre-Trained Deep Features for Anomaly Detection (2020): the simplest variant; a single image-level distribution, scored with the Mahalanobis distance

    1. It establishes a model of normality by fitting a multivariate Gaussian to feature representations from a pre-trained network.
  4. PaDiM (2021): patch-level distributions (one Gaussian per spatial location), scored with the Mahalanobis distance
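The per-location Gaussian + Mahalanobis idea above can be sketched as follows. This is a minimal illustration, not PaDiM's actual pipeline: the features are random stand-ins for what a pre-trained CNN would produce at one spatial location, and the `1e-3` regularizer is an assumed value.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: features of normal training patches at ONE spatial
# location, shape (n_samples, feature_dim). In PaDiM these come from a
# pre-trained CNN; random data here just illustrates the math.
train_feats = rng.normal(size=(500, 8))

# Fit a multivariate Gaussian: sample mean and covariance.
mu = train_feats.mean(axis=0)
cov = np.cov(train_feats, rowvar=False)
# Small regularization keeps the covariance invertible (PaDiM adds eps*I too).
cov_inv = np.linalg.inv(cov + 1e-3 * np.eye(cov.shape[0]))

def mahalanobis(x):
    """Anomaly score: Mahalanobis distance from x to the fitted Gaussian."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

normal_score = mahalanobis(rng.normal(size=8))          # in-distribution
anomal_score = mahalanobis(rng.normal(size=8) + 10.0)   # shifted far away

print(normal_score < anomal_score)  # anomalies score higher
```

PaDiM repeats this fit independently for every spatial location of the feature map, which is why the covariance storage grows with map size.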

Intro: Normalizing-Flow-based AD

Why are the previous methods not good enough?

PaDiM assumes that normal data can be fit with a unimodal, general multivariate Gaussian (MVG)

  1. a general MVG, not an independent (diagonal-covariance) MVG, i.e. the full covariance matrix must be stored per location ⇒ large storage cost
  2. a unimodal MVG, not a multimodal MVG
    1. For example, a single location may show three different normal textures; its feature distribution may then look like “three peaks” (multimodal) or take a complex crescent-like shape. Forcing a single ellipsoid (one Gaussian) over such a distribution introduces a large error.
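The multimodality problem above can be shown with a tiny 1-D sketch. Assumed setup: two normal "texture" modes at -5 and +5; a single Gaussian fit puts its mean in the empty middle, so a point in the gap (genuinely anomalous, never seen in training) scores as *more* normal than a real normal sample.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 1-D feature with TWO normal modes (two valid textures).
mode_a = rng.normal(-5.0, 0.5, size=500)
mode_b = rng.normal(+5.0, 0.5, size=500)
train = np.concatenate([mode_a, mode_b])

mu, sigma = train.mean(), train.std()  # single-Gaussian fit: mu ~ 0, sigma ~ 5

def score(x):
    """1-D Mahalanobis distance: |x - mu| / sigma."""
    return abs(x - mu) / sigma

gap_score = score(0.0)      # anomalous point between the modes
normal_score = score(-5.0)  # genuinely normal point from mode A

print(gap_score < normal_score)  # True: the unimodal fit ranks them wrongly
```

Normalizing flows address exactly this: they learn an invertible map to a simple base distribution, so multimodal or crescent-shaped densities can be modeled without the single-ellipsoid assumption.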

Recap on distribution-based methods

  1. One distribution per spatial location, so positional information is used; but this makes the method heavily dependent on objects being aligned across images (matched locations).
  2. Only a pre-trained network is used; no training or fine-tuning is needed.
  3. Scoring with the Mahalanobis distance is faster and uses less memory than kNN-based scoring.
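The memory claim in point 3 is easy to quantify: kNN scoring must retain every training feature vector, while the Gaussian only keeps a mean and a covariance matrix. The sizes below (10,000 samples, 64-dim features) are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 10_000, 64
train = rng.normal(size=(n, d))

# kNN scoring keeps ALL n training features: n * d floats.
knn_floats = train.size

# Gaussian/Mahalanobis scoring keeps only mean + covariance: d + d*d floats.
mu = train.mean(axis=0)
cov = np.cov(train, rowvar=False)
gauss_floats = mu.size + cov.size

print(knn_floats, gauss_floats)  # 640000 vs 4160
```

At query time the gap is similar: kNN computes n distances per query, while Mahalanobis is a single quadratic form in d dimensions.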