
Token

A small chunk of text. It's how the AI counts how much it's reading and writing.

Explained simply.

The AI doesn't see whole words the way you and I do. It breaks everything into pieces called tokens. A short word like 'the' is one token. A longer word like 'autonomous' might be two or three ('aut', 'ono', 'mous'). Punctuation counts. Spaces count. Emojis count. Roughly, 1 token ≈ 4 characters of English ≈ 3/4 of a word, so 1,000 tokens ≈ 750 words, about a page and a half of text.
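The rule of thumb above can be turned into a quick back-of-the-envelope calculator. This is a sketch of the heuristic only, not a real tokenizer; actual counts vary by model, and the function names here are made up for illustration.

```python
# Rough token math using the ~4-characters-per-token rule of thumb.
# A heuristic, not a real tokenizer -- actual counts vary by model.

def estimate_tokens(text: str) -> int:
    """Estimate token count as characters / 4, rounded up."""
    return max(1, -(-len(text) // 4))  # ceiling division

def estimate_words(tokens: int) -> int:
    """Estimate word count as 3/4 of the token count."""
    return int(tokens * 0.75)
```

For the 25-character sentence 'Claude is an AI assistant', `estimate_tokens` gives 7, close to the real count of about 6. For precise numbers you would use the model provider's own tokenizer.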

An example.

The sentence 'Claude is an AI assistant' is about 6 tokens. A 10-page document is about 3,000-5,000 tokens. This whole glossary entry is about 400 tokens.

Why it matters.

You pay per token. Context windows are measured in tokens. Every API bill is a token bill. If you know your input is 10,000 tokens and your output is 2,000 tokens, you can estimate cost and speed before you hit Run.
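That pre-run estimate is simple arithmetic: tokens times price per token. A minimal sketch, with made-up placeholder prices (check your provider's pricing page for real ones):

```python
# Estimate an API call's cost before running it. These per-million-token
# prices are hypothetical placeholders, not any provider's real pricing.

INPUT_PRICE_PER_MILLION = 3.00    # hypothetical $ per 1M input tokens
OUTPUT_PRICE_PER_MILLION = 15.00  # hypothetical $ per 1M output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost in dollars for one request."""
    return (input_tokens * INPUT_PRICE_PER_MILLION
            + output_tokens * OUTPUT_PRICE_PER_MILLION) / 1_000_000

# 10,000 input tokens and 2,000 output tokens, as in the example above:
# estimate_cost(10_000, 2_000) -> 0.06 dollars
```

Output tokens usually cost several times more than input tokens, which is why long generated answers dominate the bill even when the prompt is large.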