AI GLOSSARY

Gradient

AI & Machine Learning

A measure of how much a model's loss changes with respect to each of its parameters. Gradients point in the direction of steepest increase in error, so training algorithms move in the opposite direction to reduce the loss. Computing gradients efficiently via backpropagation is what makes training deep neural networks tractable.
See also: gradient descent, backpropagation, loss function.
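The idea above can be sketched in a few lines of plain Python. This is an illustrative example, not library code: the model `y = w * x`, the squared-error loss, and the learning rate `lr` are all assumptions chosen for simplicity. It shows the analytic gradient of the loss with respect to the single parameter `w`, and one gradient-descent step in the opposite direction.

```python
def loss(w, x, y):
    # Squared error of the prediction w * x against the target y.
    return (w * x - y) ** 2

def grad(w, x, y):
    # Analytic derivative of the loss with respect to w:
    # d/dw (w*x - y)^2 = 2 * (w*x - y) * x
    return 2 * (w * x - y) * x

w, x, y, lr = 0.0, 2.0, 6.0, 0.1
g = grad(w, x, y)      # gradient points toward increasing loss (here -24.0)
w_new = w - lr * g     # step opposite the gradient to reduce the loss

print(loss(w, x, y), loss(w_new, x, y))  # → 36.0 1.44
```

One step moves `w` from 0.0 to 2.4 and the loss drops from 36.0 to 1.44. Backpropagation generalizes the `grad` computation here, applying the chain rule to obtain this kind of derivative for every parameter of a deep network in a single backward pass.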