Token

A token is the basic unit of text that AI models use to process and generate language. Tokens can be whole words, parts of words, or individual characters — for example, the word "prompting" might be split into "prompt" and "ing." Token counts determine context window limits, API costs, and processing speed.

Example

The sentence "I love prompt engineering" is approximately 4 tokens. The longer phrase "Retrieval-Augmented Generation is fascinating" might be 6-7 tokens because hyphenated and longer words get split into subword pieces.
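Subword splitting like the examples above can be sketched with a greedy longest-match tokenizer. This is a minimal illustration, not a real model's algorithm: production tokenizers (e.g. BPE) learn their vocabularies from data, and the tiny `VOCAB` below is invented purely for demonstration.

```python
# Toy greedy longest-match subword tokenizer.
# VOCAB is an invented, illustrative vocabulary -- real tokenizers
# learn tens of thousands of subword pieces from training data.
VOCAB = {"I", "love", "prompt", "engineer", "ing"}

def tokenize(text, vocab):
    tokens = []
    for word in text.split():
        i = 0
        while i < len(word):
            # Try the longest possible piece first, shrinking until a
            # vocabulary match is found; fall back to a single character.
            for j in range(len(word), i, -1):
                piece = word[i:j]
                if piece in vocab or j == i + 1:
                    tokens.append(piece)
                    i = j
                    break
    return tokens

print(tokenize("I love prompt engineering", VOCAB))
# -> ['I', 'love', 'prompt', 'engineer', 'ing']
```

Note how "engineering" is not in the toy vocabulary, so it splits into "engineer" + "ing" — the same subword behavior described above, and the reason longer or rarer words consume more tokens.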
