AI GLOSSARY

Temperature

Large Language Model (LLM) Terms

A parameter that controls the randomness of a language model's output. Before sampling, each logit is divided by the temperature, which reshapes the probability distribution over possible next tokens. A low temperature (below 1) sharpens the distribution, making the model more deterministic and focused on high-probability choices. A high temperature (above 1) flattens it, making output more creative and varied, but also more likely to be unexpected or incoherent.
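
The effect of dividing logits by the temperature can be sketched in a few lines (a minimal illustration; the function name and example logits are hypothetical, not from any particular library):

```python
import math

def softmax_with_temperature(logits, temperature):
    # Divide each logit by the temperature before applying softmax.
    # temperature < 1 sharpens the distribution; temperature > 1 flattens it.
    scaled = [logit / temperature for logit in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]  # hypothetical scores for three candidate tokens
print(softmax_with_temperature(logits, 0.5))  # sharper: top token dominates
print(softmax_with_temperature(logits, 2.0))  # flatter: closer to uniform
```

At temperature 1 the distribution is the plain softmax of the logits; as the temperature approaches 0, sampling approaches always picking the highest-probability token.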
See also: sampling.