AI GLOSSARY
Dropout
AI & Machine Learning
A regularization technique used during neural network training in which each neuron is temporarily disabled with some probability p (the dropout rate, commonly between 0.2 and 0.5) on every training pass. This prevents the network from relying too heavily on any single neuron or pathway and helps it generalize to unseen data. At inference time all neurons stay active, and activations are scaled so expected outputs match those seen during training; the common "inverted dropout" variant applies this scaling during training instead. Dropout was proposed by Hinton et al. in 2012 and described in full by Srivastava et al. in 2014, and it remains one of the most widely used regularization methods.
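The mechanism can be sketched in a few lines. This is a minimal NumPy illustration of inverted dropout (not any particular library's implementation): each unit is zeroed with probability p, and the survivors are scaled by 1/(1-p) so the expected activation is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p=0.5, training=True):
    """Inverted dropout: zero each unit with probability p during training,
    scaling survivors by 1/(1-p) so the expected value of x is preserved.
    At inference (training=False) the input passes through unchanged."""
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)
```

Because the scaling happens at training time, no extra work is needed at inference, which is why frameworks such as PyTorch and Keras use this inverted form in their dropout layers.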
See also: overfitting, regularization, batch normalization.