
Context Window

Also known as: Context Length
The maximum amount of text (measured in tokens) a model can process in a single conversation, counting both your input and the model's output. A 200K-token context window corresponds to roughly 150,000 words of English text, about the length of two novels.

Why it matters

Context window size determines what tasks are feasible. Summarizing an entire codebase needs a large context; a quick question-and-answer exchange fits in a small one. Bigger isn't always better, though: models can lose track of details buried in the middle of very long contexts, and processing more tokens costs more time and money.
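Because the window covers input and output together, applications typically check that a prompt plus a reserved output budget fits before sending a request. A minimal sketch in Python, assuming a hypothetical 200K-token limit and the common (but inexact) rule of thumb of about four characters per token for English text:

```python
CONTEXT_WINDOW = 200_000  # hypothetical model limit, in tokens

def estimate_tokens(text: str) -> int:
    """Very rough estimate: ~4 characters per token for English text.
    Real tokenizers vary by model; this is only a heuristic."""
    return max(1, len(text) // 4)

def fits_in_context(prompt: str, max_output_tokens: int = 4_096) -> bool:
    """The input AND the reserved output budget must both fit
    inside the same context window."""
    return estimate_tokens(prompt) + max_output_tokens <= CONTEXT_WINDOW

# A short prompt fits comfortably; a very long one does not.
print(fits_in_context("What is a context window?"))
print(fits_in_context("x" * 1_000_000))
```

In practice you would replace the heuristic with the model's actual tokenizer, since character-based estimates can be off by a wide margin for code or non-English text.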
