AI GLOSSARY
Pipeline Parallelism
Deployment & Infrastructure
A distributed training strategy that splits the layers of a model across multiple devices, with each device handling one stage of the forward and backward pass. Each batch is divided into micro-batches that flow through the stages like an assembly line, so devices work on different micro-batches simultaneously rather than waiting on one another. As a form of model parallelism, it is often combined with data parallelism and tensor parallelism to train very large models efficiently.
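The assembly-line idea can be illustrated with a minimal, framework-free sketch (an assumption for illustration only, not a real distributed implementation): each "device" holds one stage of the model, and micro-batches flow through the stages so that, in the steady state, every stage is busy at the same time.

```python
# Minimal sketch of a pipeline-parallel forward pass (illustrative only;
# the stage functions and micro-batches below are made up for the example).

def stage0(x):          # pretend: first half of the model on device 0
    return [v * 2 for v in x]

def stage1(x):          # pretend: second half of the model on device 1
    return [v + 1 for v in x]

stages = [stage0, stage1]
micro_batches = [[1, 2], [3, 4], [5, 6]]   # one batch split into 3 micro-batches

num_steps = len(micro_batches) + len(stages) - 1   # pipeline fill + drain
in_flight = {}                                     # micro-batch index -> activation
outputs = []

for t in range(num_steps):
    # At step t, stage s works on micro-batch t - s; whenever several of
    # these indices are valid at once, the stages overlap like an assembly line.
    for s in reversed(range(len(stages))):
        m = t - s
        if 0 <= m < len(micro_batches):
            x = micro_batches[m] if s == 0 else in_flight[m]
            in_flight[m] = stages[s](x)
            if s == len(stages) - 1:               # last stage: emit the result
                outputs.append(in_flight.pop(m))

print(outputs)   # [[3, 5], [7, 9], [11, 13]]
```

The extra steps at the start and end, where only some stages have work, are the "pipeline bubble"; using more micro-batches per batch shrinks its relative cost.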