AI GLOSSARY
Floating Point Operations Per Second (FLOPS)
Ecosystem & Industry
A measure of computational performance indicating how many floating point calculations a processor can perform per second. In AI, the per-second rate (FLOPS) is used to compare hardware capabilities, while the total count of floating point operations (FLOPs) quantifies the computational cost of training or running a model. Larger models require vastly more compute to train, making these figures key metrics for estimating resource requirements and understanding the economics of AI development.
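As a minimal sketch of how such estimates are made in practice: a widely used rule of thumb from the scaling-laws literature puts training cost at roughly 6 FLOPs per parameter per training token. The function names, the example model size, and the hardware throughput and utilization figures below are illustrative assumptions, not values from this glossary.

```python
def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total training compute via the common ~6*N*D rule of thumb."""
    return 6 * n_params * n_tokens

def training_days(total_flops: float, hw_flops_per_sec: float,
                  utilization: float = 0.4) -> float:
    """Wall-clock days on a single accelerator, given a peak FLOPS rating
    and an assumed fraction of peak actually achieved."""
    seconds = total_flops / (hw_flops_per_sec * utilization)
    return seconds / 86_400  # seconds per day

# Illustrative: a 7B-parameter model trained on 1T tokens,
# on hardware rated at 3e14 FLOPS (300 TFLOPS) at 40% utilization.
total = training_flops(7e9, 1e12)   # ~4.2e22 FLOPs
days = training_days(total, 3e14)   # thousands of days on one accelerator
```

The gap between total FLOPs required and the FLOPS a single device delivers is what drives large-scale distributed training across many accelerators.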
See also: GPU, training, scaling law.