AI GLOSSARY
Out-of-Distribution
OOD · Research & Advanced Concepts
Inputs that differ significantly from the distribution of data a model was trained on. Models often perform poorly or unpredictably on OOD inputs because they learn patterns specific to their training distribution. Detecting OOD inputs and handling them gracefully is a key challenge for deploying reliable AI systems in open-ended, real-world environments.
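One common detection heuristic is to treat a model's own confidence as a signal: inputs where the softmax output is flat are flagged as potentially OOD. The sketch below illustrates this maximum-softmax-probability approach; the logit values and the 0.7 threshold are illustrative assumptions, not values from any particular model.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def msp_score(logits):
    # Maximum softmax probability: higher means the model is more confident.
    return softmax(logits).max(axis=-1)

def flag_ood(logits, threshold=0.7):
    # Flag inputs whose top-class probability falls below the threshold.
    # The threshold is an assumption; in practice it is tuned on held-out data.
    return msp_score(logits) < threshold

# A peaked (confident) prediction vs. a flat (uncertain, OOD-like) one.
logits = np.array([
    [8.0, 0.5, 0.2],   # peaked: top-class probability near 1.0
    [1.1, 1.0, 0.9],   # flat: top-class probability around 0.37
])
print(flag_ood(logits))  # → [False  True]
```

This baseline is simple but imperfect: neural networks can be highly confident on OOD inputs, which is why more robust detection methods remain an active research area.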
See also: distribution shift.