Token


A token is a unit of text that an AI model processes. A token can be as short as a single character or as long as a whole word, depending on the tokenizer and the language. For example, a common word like “education” is typically a single token, while a hyphenated term like “AI-powered” might be split into several. Tools like Copilot work by analyzing and predicting sequences of tokens to generate responses. Token limits cap how much text a model can read or write at once, so understanding tokens helps users manage input length and get better results.
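As a rough illustration, the sketch below uses OpenAI's tiktoken library to count and inspect tokens. This is one tokenizer among many; other models use different encodings, so the exact splits it produces are illustrative, not universal.

    import tiktoken

    # Load a byte-pair-encoding tokenizer; "cl100k_base" is an encoding
    # used by several OpenAI models. Other tokenizers split text
    # differently, so these results vary by model.
    enc = tiktoken.get_encoding("cl100k_base")

    for text in ["education", "AI-powered"]:
        token_ids = enc.encode(text)                   # text -> integer token IDs
        pieces = [enc.decode([t]) for t in token_ids]  # each ID back to its text piece
        print(f"{text!r}: {len(token_ids)} token(s) -> {pieces}")

Counting tokens this way is how applications check whether a prompt will fit within a model's token limit before sending it.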
