AI GLOSSARY

Batch Normalization

AI & Machine Learning

A technique that stabilizes and speeds up neural network training by normalizing the inputs to each layer so they have a consistent scale and distribution. Introduced by Ioffe and Szegedy in 2015, it allows higher learning rates, reduces sensitivity to weight initialization, and has a regularizing effect that can reduce overfitting. Exactly why it works remains debated; the original explanation, that it reduces internal covariate shift, is now considered incomplete.
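The training-time computation can be sketched in a few lines of NumPy: normalize each feature over the batch, then apply a learned scale (gamma) and shift (beta). This is a minimal illustration, not a full implementation; a real layer also tracks running statistics for use at inference time, and the epsilon value shown is a common but arbitrary choice.

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature across the batch dimension.
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # Learned scale and shift restore the layer's representational power.
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 8))  # batch of 64, 8 features
y = batch_norm(x, gamma=np.ones(8), beta=np.zeros(8))
```

After normalization each feature has approximately zero mean and unit variance over the batch, regardless of the input's original scale.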
See also: backpropagation, overfitting.