AI GLOSSARY
Distillation
AI & Machine Learning
A technique in which a smaller, simpler model (the student) is trained to mimic the behavior of a larger, more powerful model (the teacher), typically by matching the teacher's output distributions rather than only the hard labels. The result is a compact model that retains much of the capability of the original but is cheaper and faster to run. Distillation is widely used to make large foundation models practical to deploy on consumer hardware or at scale.
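The core idea above can be sketched as a loss function: the teacher's logits are softened with a temperature and the student is penalized for diverging from that soft distribution. This is a minimal pure-Python illustration (the function names and the temperature value of 2.0 are illustrative choices, not a specific library's API):

```python
import math

def softmax(logits, temperature=1.0):
    # Softmax with temperature scaling: higher T yields a softer,
    # more informative distribution over classes.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL divergence from the student's softened distribution to the
    # teacher's; the T^2 factor keeps gradient magnitudes comparable
    # across temperatures.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (temperature ** 2) * kl

# A student that reproduces the teacher's logits incurs zero loss;
# any disagreement produces a positive penalty.
print(distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]))
print(distillation_loss([2.0, 0.5, -1.0], [0.1, 0.2, 0.3]) > 0)
```

In practice this soft-target loss is usually combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient.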
See also: quantization, model compression, fine-tuning.