
AI GLOSSARY

Knowledge Distillation

AI & Machine Learning

A specific form of distillation that transfers knowledge from a large teacher model into a smaller student model, typically by training the student on the teacher's soft probability outputs rather than hard labels alone. The soft outputs carry richer information about the teacher's relative confidence across classes, allowing the student to learn more efficiently.
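The idea of training on soft targets can be sketched as a small loss function. This is a minimal illustration assuming a classification task; the function names, logit values, and temperature choice are illustrative, not taken from any specific library or paper.

```python
import math

def softmax(logits, temperature=1.0):
    """Softmax with a temperature; higher temperatures soften the distribution."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence from the teacher's softened outputs to the student's,
    the core training signal in knowledge distillation."""
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student soft predictions
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Unlike a one-hot hard label, the teacher's soft output distinguishes
# "nearly right" classes from "completely wrong" ones.
teacher_logits = [4.0, 1.5, -2.0]  # teacher: confident in class 0, hints at class 1
student_logits = [2.0, 2.0, 0.0]   # student: has not yet learned that structure
loss = distillation_loss(teacher_logits, student_logits)
```

The loss is zero when the student's softened distribution matches the teacher's exactly, and positive otherwise, so minimizing it pulls the student toward the teacher's full output distribution rather than just its top prediction.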
See also: distillation, fine-tuning, foundation model.
