AI GLOSSARY
Catastrophic Forgetting
Research & Advanced Concepts
The tendency of a neural network to abruptly lose previously learned knowledge when trained on new data or tasks, because gradient updates for the new task overwrite the weights that encoded earlier ones. It is a fundamental obstacle to continual and lifelong learning, and mitigating it is an active research area with implications for building AI systems that accumulate knowledge over time without degrading earlier capabilities.
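The effect can be reproduced in a few lines. The sketch below is purely illustrative (the tasks, model, and hyperparameters are arbitrary choices, not from any particular paper): a plain-NumPy logistic model is trained on task A, then sequentially on a conflicting task B with no task-A data replayed, and its task-A accuracy collapses toward chance.

```python
import numpy as np

# Toy demonstration of catastrophic forgetting with a logistic model.
# Task A: classify by the sign of feature 0; task B: by the sign of feature 1.
# (Illustrative setup; sizes and hyperparameters are arbitrary.)
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 2))
y_a = (X[:, 0] > 0).astype(float)  # task A labels
y_b = (X[:, 1] > 0).astype(float)  # task B labels

w, b = np.zeros(2), 0.0

def train(X, y, w, b, epochs=300, lr=0.5):
    """Full-batch gradient descent on the logistic loss."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)     # gradient step on weights
        b = b - lr * np.mean(p - y)             # gradient step on bias
    return w, b

def accuracy(X, y, w, b):
    return float(np.mean(((X @ w + b) > 0) == (y > 0.5)))

w, b = train(X, y_a, w, b)
acc_a_before = accuracy(X, y_a, w, b)  # high: task A has been learned

w, b = train(X, y_b, w, b)             # train on task B only, no replay
acc_a_after = accuracy(X, y_a, w, b)   # task A performance degrades sharply

print(f"task A accuracy before: {acc_a_before:.2f}, after: {acc_a_after:.2f}")
```

Because the gradients for task B have no term preserving task A's solution, the weight on feature 0 is steadily driven down while the weight on feature 1 grows; techniques studied under continual learning (e.g. replay or regularizing changes to important weights) aim to prevent exactly this.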
See also: continual learning, transfer learning.