AI GLOSSARY

Token Limit

Large Language Model (LLM) Terms

The maximum number of tokens a language model can process in a single interaction, covering both the input (the prompt and conversation history) and the generated output. When a conversation exceeds this limit, the earliest tokens are typically truncated or the request is rejected, so the model can lose track of earlier context. Managing token limits, for example by trimming or summarizing history and reserving room for the response, is a key practical constraint in building language model applications.
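One common way to manage this constraint is to trim the oldest messages until the conversation fits within a token budget. The sketch below illustrates the idea; it uses a naive whitespace "tokenizer" as a stand-in for a real one (such as a model's actual tokenizer), so the counts are only approximate, and the function names and message strings are hypothetical.

```python
# Illustrative sketch: keep a conversation within a token limit by
# dropping the oldest messages first, reserving space for the reply.

def count_tokens(text: str) -> int:
    # Assumption: one whitespace-separated word ~ one token.
    # A real application would use the model's own tokenizer.
    return len(text.split())

def trim_history(messages: list[str], token_limit: int, reserved_output: int) -> list[str]:
    """Drop the oldest messages until the input fits the budget."""
    budget = token_limit - reserved_output  # leave room for the model's output
    kept = list(messages)
    while kept and sum(count_tokens(m) for m in kept) > budget:
        kept.pop(0)  # discard the oldest message first
    return kept

history = [
    "You are a helpful assistant.",
    "User: summarize our earlier discussion",
    "Assistant: we covered token limits and context windows",
    "User: what happens when the limit is exceeded?",
]
trimmed = trim_history(history, token_limit=20, reserved_output=5)
```

Dropping from the front preserves the most recent turns, which usually matter most; alternatives include summarizing older turns instead of discarding them.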