
Draft:Out-of-Distribution Detection

From Wikipedia, the free encyclopedia

Out-of-distribution (OOD) detection is a research area within machine learning and artificial intelligence focused on identifying inputs that differ significantly from the data distribution on which a model was trained. The task is crucial for ensuring the reliability and safety of machine learning models, particularly in safety-critical applications such as autonomous driving and medical diagnostics, where unexpected inputs could lead to catastrophic consequences.
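The general idea can be illustrated with a minimal sketch of one widely used score-based baseline, maximum softmax probability (MSP): a classifier's confidence on its predicted class is used as an in-distribution score, and inputs scoring below a threshold are flagged as OOD. The threshold value and the example logits below are arbitrary illustrations, not part of any specific published method.

```python
import numpy as np

def softmax(logits: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over the last axis."""
    shifted = logits - logits.max(axis=-1, keepdims=True)
    exp = np.exp(shifted)
    return exp / exp.sum(axis=-1, keepdims=True)

def msp_score(logits: np.ndarray) -> np.ndarray:
    """Maximum softmax probability: higher suggests in-distribution."""
    return softmax(logits).max(axis=-1)

def is_ood(logits: np.ndarray, threshold: float = 0.5) -> np.ndarray:
    """Flag an input as OOD when its confidence falls below the threshold."""
    return msp_score(logits) < threshold

# A peaked logit vector (confident prediction) vs. a nearly flat one
# (ambiguous prediction, as often produced on unfamiliar inputs).
confident = np.array([5.0, 0.1, 0.2])   # MSP ≈ 0.98 → not flagged
flat = np.array([0.9, 1.0, 1.1])        # MSP ≈ 0.37 → flagged as OOD
```

In practice the threshold is chosen on held-out data, and the logits come from a trained network rather than being hand-written as here.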

Motivation


Methods


Posterior-based


Logit-based


Feature-based


Activation Shaping


Available Software

  • OpenOOD
  • Pytorch-OOD
