{"version":"1.0","type":"rich","provider_name":"gaks.ai AI Glossary","provider_url":"https://gaks.ai/glossary","title":"Batch Normalization — AI Glossary","author_name":"Glenn Katrud Solheim","author_url":"https://gaks.ai","width":600,"height":200,"html":"<div style=\"font-family:sans-serif;border:1px solid #e0e0e0;border-radius:8px;padding:16px;max-width:600px;background:#ffffff;color:#111111;\"><p style=\"margin:0 0 4px;font-size:11px;color:#666;\">AI Glossary — gaks.ai</p><h3 style=\"margin:0 0 8px;font-size:16px;\">Batch Normalization</h3><p style=\"margin:0 0 12px;font-size:14px;line-height:1.6;\">A technique that stabilizes and speeds up neural network training by normalizing the inputs to each layer so they have a consistent scale and distribution. Introduced by Ioffe and Szegedy in 2015, it allows higher learning rates, reduces sensitivity to weight initialization, and has a regularizing effect that can reduce overfitting. Why exactly it works remains debated; the original explanation, reducing internal covariate shift, is now considered incomplete. See also: backpropagation, overfitting.</p><a href=\"https://gaks.ai/glossary/batch-normalization\" style=\"font-size:12px;color:#0077aa;\">Source: gaks.ai/glossary/batch-normalization →</a></div>"}