AI GLOSSARY

FLOPs

AI & Machine Learning

Floating Point Operations — a measure of total computational workload, used to quantify the cost of training or running an AI model. A single FLOP is one arithmetic operation (an addition, a multiplication, etc.) on floating-point numbers. Not to be confused with FLOPS (floating point operations per second), which measures hardware throughput rather than total work. Model scale and training cost are often compared in FLOPs, as in "this model required 10^23 FLOPs to train."
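As a rough illustration, a widely cited rule of thumb estimates training compute as about 6 × (parameters) × (training tokens) for transformer language models — roughly 2ND for the forward pass and 4ND for the backward pass. The sketch below applies that heuristic; the model sizes used are illustrative examples, not figures from this glossary.

```python
def approx_training_flops(n_params: float, n_tokens: float) -> float:
    """Estimate total training compute with the common ~6 * N * D
    rule of thumb (forward pass ~2ND, backward pass ~4ND)."""
    return 6.0 * n_params * n_tokens


# Illustrative example: a 70-billion-parameter model
# trained on 1.4 trillion tokens.
flops = approx_training_flops(70e9, 1.4e12)
print(f"{flops:.2e}")  # 5.88e+23 — on the order of 10^23 FLOPs
```

This is only an order-of-magnitude estimate; actual compute depends on architecture, sequence length, and training details.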
See also: parameter, inference cost.