AI GLOSSARY
Autoregressive Model
Neural Network Architectures
A model that generates output one element at a time, with each new token, pixel, or data point predicted based on everything that came before it. Most large language models are autoregressive, generating text left to right with each token conditioned on the full preceding context. This sequential nature makes generation straightforward, but because each step depends on the previous one, it limits parallelism during inference.
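The generation loop can be sketched with a toy example. This sketch uses a hypothetical hand-written bigram table, so each step conditions only on the single previous character; a real autoregressive LLM conditions on the entire preceding sequence via a neural network, but the step-by-step sampling loop has the same shape.

```python
import random

# Toy conditional distribution: probability of the next character
# given the previous one. (Hypothetical values, for illustration only;
# a real model computes these probabilities with a neural network.)
BIGRAMS = {
    "a": {"b": 0.9, "a": 0.1},
    "b": {"a": 0.9, "b": 0.1},
}

def generate(start: str, length: int, seed: int = 0) -> str:
    """Autoregressively sample `length` characters, one at a time."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        # Condition on what came before (here, just the last character).
        dist = BIGRAMS[out[-1]]
        chars, probs = zip(*dist.items())
        out.append(rng.choices(chars, weights=probs)[0])
    return "".join(out)

print(generate("a", 8))
```

Note that the loop cannot be parallelized: step *t* needs the output of step *t − 1*, which is exactly the inference bottleneck described above.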
See also: large language model, transformer, token.