AI GLOSSARY
Batch Learning
Learning Paradigms
A training approach in which the model learns from the entire dataset in a single training run, rather than updating incrementally as new data arrives. Batch learning produces stable, reproducible models, but incorporating new data requires retraining, either from scratch or from a checkpoint.
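The contrast can be sketched in a few lines. Below is a minimal, hypothetical illustration (the function names and data are invented for this example): a batch fit computes a model from the full dataset in one shot, while an online fit, shown for comparison, updates its weights one example at a time.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + rng.normal(scale=0.1, size=200)

def batch_fit(X, y):
    """Batch learning: one training run over the ENTIRE dataset
    (here, the closed-form least-squares solution)."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def online_fit(X, y, lr=0.05):
    """Online learning, for contrast: weights are updated one
    example at a time as data 'arrives'."""
    w = np.zeros(X.shape[1])
    for x_i, y_i in zip(X, y):
        w += lr * (y_i - x_i @ w) * x_i  # SGD step on squared error
    return w

w_batch = batch_fit(X, y)
w_online = online_fit(X, y)
# batch_fit is deterministic and reproducible for a fixed dataset,
# but adding new examples means running it again on the full data.
```

Note that to incorporate a new example, `batch_fit` must be rerun on the whole dataset, whereas `online_fit` could simply take one more gradient step.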
See also: online learning, checkpoint.