distribution / distribution-map based methods

— from Deep Industrial Image Anomaly Detection: A Survey. MIR24
They construct a distribution map of normal samples in the feature space, and measure how much a sample’s features deviate from the normal distribution.
Gaussian & Mahalanobis distance
- uses a parametric model, e.g. a multivariate Gaussian, to model the distribution of normal samples in the feature space
- the AD task can then be reformulated as out-of-distribution (OOD) detection
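A minimal sketch of this idea: fit a single multivariate Gaussian to (stand-in) normal features and use the Mahalanobis distance as the anomaly score. The random features here are placeholders for pooled activations of a pre-trained network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for features of normal training images, e.g. pooled
# activations of a pre-trained CNN: (n_samples, feat_dim).
normal_feats = rng.normal(size=(500, 64))

# Fit a multivariate Gaussian: mean vector + full covariance matrix.
mu = normal_feats.mean(axis=0)
cov = np.cov(normal_feats, rowvar=False)
cov += 1e-6 * np.eye(cov.shape[0])      # regularize before inversion
cov_inv = np.linalg.inv(cov)

def mahalanobis(x):
    """Anomaly score: Mahalanobis distance of a feature vector to the fit."""
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

# An in-distribution sample scores low; a shifted (anomalous) one scores high.
score_normal = mahalanobis(rng.normal(size=64))
score_anom = mahalanobis(rng.normal(size=64) + 3.0)
```

Scoring is a single matrix-vector product, which is why this family is cheap at test time.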
Modeling the Distribution of Normal Data in Pre-Trained Deep Features for Anomaly Detection (2020) — the simplest variant: image-level distribution, Mahalanobis distance
- it establishes a model of normality by fitting a multivariate Gaussian to the feature representations of a pre-trained network
PaDiM (2021) — patch-level distribution, Mahalanobis distance
Why are previous methods not good enough?
PaDiM assumes that normal data can be fit with a unimodal, general multivariate Gaussian (MVG)
- general MVG, not an independent (diagonal) MVG, i.e. the full covariance matrix must be stored ⇒ large storage
- unimodal MVG, not a multimodal MVG
- for example, one location may show three different normal textures; its feature distribution can then look like "three peaks" (multimodal) or have a complex "crescent" shape. Forcing a single ellipsoid (one Gaussian) over it introduces large errors.
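A PaDiM-style sketch of the patch-level idea (toy data, not the paper's implementation): one Gaussian is fit per spatial location, so the full covariance matrix at every position must be stored — which is exactly the storage cost criticized above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in patch features: N normal images, an HxW grid of
# D-dimensional patch embeddings (placeholder for CNN feature maps).
N, H, W, D = 200, 8, 8, 16
train = rng.normal(size=(N, H, W, D))

# One multivariate Gaussian PER location -> O(H*W*D*D) storage.
mu = train.mean(axis=0)                          # (H, W, D)
cov_inv = np.empty((H, W, D, D))
for i in range(H):
    for j in range(W):
        c = np.cov(train[:, i, j, :], rowvar=False)
        cov_inv[i, j] = np.linalg.inv(c + 1e-3 * np.eye(D))

def anomaly_map(feat):
    """Per-location Mahalanobis distance -> an HxW anomaly heat map."""
    d = feat - mu                                # (H, W, D)
    # einsum evaluates d^T Sigma^{-1} d at every location at once
    return np.sqrt(np.einsum("ijd,ijde,ije->ij", d, cov_inv, d))

test_img = rng.normal(size=(H, W, D))
test_img[3, 4] += 4.0                            # inject a local defect
amap = anomaly_map(test_img)
```

Because every (i, j) has its own Gaussian, the method is tied to spatially aligned inputs: a shifted defect is scored against the wrong location's distribution.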
Recap on distribution-based methods
- one distribution per location, i.e. position information is used, but this makes the methods heavily dependent on spatially aligned inputs
- only a pre-trained network is needed; the backbone is not fine-tuned
- anomaly scoring by Mahalanobis distance is faster and uses less memory than kNN
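A back-of-the-envelope illustration of the last point (the sizes are made-up examples): a Gaussian model stores only a mean and covariance, while a kNN memory bank stores every training feature, so its size grows with the training set.

```python
# Rough memory comparison: Gaussian model vs kNN memory bank.
n_train, d = 10_000, 512    # hypothetical training-set size and feature dim

gaussian_floats = d + d * d     # mean (d) + covariance (d*d); independent of n_train
knn_floats = n_train * d        # the full feature bank

print(f"Gaussian model : {gaussian_floats:,} floats")
print(f"kNN memory bank: {knn_floats:,} floats")
# Scoring cost differs the same way: one matrix-vector product
# versus a nearest-neighbor search over all n_train stored features.
```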