AI GLOSSARY
Scaling Law
Ecosystem & Industry
An empirical relationship showing that AI model performance improves predictably as model size, training data, and compute increase, following mathematical power laws. Established by Kaplan et al. (2020), scaling laws were a major driver of investment in ever-larger models through the early 2020s, since they suggested that simply training bigger models on more data would reliably yield better performance. Whether that relationship continues to hold, and under what conditions it breaks, is now one of the more actively debated questions in the field.
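The power-law form can be sketched concretely. A minimal illustration of a Kaplan-style law relating loss to parameter count is L(N) = (N_c / N)^alpha; the constants below are in the spirit of the values reported by Kaplan et al. (2020) but should be treated as illustrative placeholders, not authoritative fitted coefficients:

```python
def predicted_loss(n_params: float,
                   n_c: float = 8.8e13,
                   alpha: float = 0.076) -> float:
    """Kaplan-style scaling law: L(N) = (N_c / N) ** alpha.

    n_c and alpha are illustrative placeholder constants,
    not exact fitted values from the paper.
    """
    return (n_c / n_params) ** alpha

# Each 10x increase in parameters lowers predicted loss by the
# constant factor 10 ** -alpha, which is why scaling curves appear
# as straight lines on log-log plots.
for n in (1e8, 1e9, 1e10):
    print(f"N={n:.0e}  predicted loss={predicted_loss(n):.3f}")
```

The key property is the straight-line behavior in log-log space: performance keeps improving smoothly with scale rather than plateauing abruptly, which is what made extrapolation to larger models attractive.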