AI GLOSSARY

Model Parallelism

Deployment & Infrastructure

A distributed training strategy in which different parts of a model, such as individual layers or slices of layers, are placed on different processors or machines, rather than replicating the whole model on every device (as in data parallelism). Model parallelism is necessary when a model is too large to fit in the memory of a single device; during training, activations rather than model weights are passed between devices.
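As a minimal sketch of the idea, the following plain-Python example splits a four-layer "model" across two simulated devices, so that each device holds only its own portion of the layers and activations cross the device boundary. All names here (`Device`, `model_parallel_forward`) are hypothetical; real systems use frameworks such as PyTorch, which place modules on accelerators explicitly.

```python
class Device:
    """Simulated accelerator that stores only a slice of the model's layers."""
    def __init__(self, name, layers):
        self.name = name
        self.layers = layers  # only this device's portion of the model

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

# A 4-layer "model": each layer is a simple function on a scalar.
layers = [
    lambda x: x * 2,   # layer 0
    lambda x: x + 1,   # layer 1
    lambda x: x * 3,   # layer 2
    lambda x: x - 4,   # layer 3
]

# Model parallelism: split the layers across two devices
# instead of replicating all four layers on each device.
dev0 = Device("gpu:0", layers[:2])
dev1 = Device("gpu:1", layers[2:])

def model_parallel_forward(x):
    # Only the activation (not the model) crosses the device boundary.
    return dev1.forward(dev0.forward(x))

print(model_parallel_forward(5))  # (5*2 + 1)*3 - 4 = 29
```

This particular layer-wise split is the simplest form of model parallelism; in practice it is often refined into pipeline parallelism (overlapping devices' work on different micro-batches) or tensor parallelism (splitting individual layers' weight matrices).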
See also: data parallelism, distributed training.