
Token

Reviewed 20 March 2026 · Canonical definition

A token is the basic unit of text that a language model processes, typically a word, subword, or character. Token counts determine a model's input limits, maximum output length, and usage cost, so governance includes monitoring and budgeting token consumption.
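A minimal sketch of the budgeting idea, using a hypothetical regex-based tokenizer (real models use learned subword vocabularies such as BPE, so actual counts will differ; the names `toy_tokenize` and `within_budget` are illustrative, not from any library):

```python
import re

def toy_tokenize(text: str) -> list[str]:
    # Illustrative only: splits on word characters and punctuation.
    # Real tokenizers use learned subword vocabularies (e.g. BPE).
    return re.findall(r"\w+|[^\w\s]", text)

def within_budget(text: str, max_tokens: int) -> bool:
    # Governance check: is the token count within the allowed budget?
    return len(toy_tokenize(text)) <= max_tokens

tokens = toy_tokenize("Tokens determine cost.")
print(tokens)  # ['Tokens', 'determine', 'cost', '.']
print(within_budget("Tokens determine cost.", 10))  # True
```

In practice, budgeting is done with the model's own tokenizer so the count matches what the provider bills for.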