AI GLOSSARY

Stochastic Gradient Descent

SGD

AI & Machine Learning

A variant of gradient descent that updates model weights using the gradient computed from a single data point or a small batch, rather than the full dataset. The randomness this introduces is not just a computational shortcut: it can help models escape flat or suboptimal regions of the loss landscape where full-batch gradient descent might get stuck, and it scales far more efficiently to large datasets.
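The update rule above can be sketched in a few lines. The following is a minimal mini-batch SGD loop on a synthetic linear-regression problem; the data, learning rate, batch size, and epoch count are illustrative assumptions, not part of the glossary entry.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = X @ true_w + small noise (illustrative setup)
n_samples, n_features = 1000, 3
true_w = np.array([2.0, -1.0, 0.5])
X = rng.normal(size=(n_samples, n_features))
y = X @ true_w + 0.01 * rng.normal(size=n_samples)

def sgd(X, y, lr=0.1, batch_size=32, epochs=20):
    """Minimize mean squared error with mini-batch SGD."""
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        idx = rng.permutation(n)              # shuffle each epoch
        for start in range(0, n, batch_size):
            batch = idx[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            # Gradient of 0.5 * mean((Xb @ w - yb)**2) w.r.t. w,
            # estimated from the mini-batch only
            grad = Xb.T @ (Xb @ w - yb) / len(batch)
            w -= lr * grad                    # step against the gradient
    return w

w_hat = sgd(X, y)
print(w_hat)  # close to true_w = [2.0, -1.0, 0.5]
```

Each step uses only a small random subset of the data, so the per-step cost is independent of the dataset size; the shuffle at the start of every epoch is what supplies the stochasticity the entry describes.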