Context Window
A context window is the maximum amount of text (measured in tokens) that an AI model can process in a single interaction, including both the input prompt and the generated output. Models with larger context windows can handle longer documents and maintain more conversation history, but performance may degrade as the window fills up.
Example
GPT-4 Turbo has a context window of 128,000 tokens (roughly 96,000 English words, at about 0.75 words per token). If you paste a 50-page report into the prompt, it consumes a large portion of the window, leaving less room for the model's response.
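The budgeting described above can be sketched in a few lines. This is a rough illustration, not a real tokenizer: the 4-characters-per-token ratio is a common heuristic for English text (actual counts vary by tokenizer), and the 128,000-token window and 4,096-token output reserve are example values.

```python
def estimate_tokens(text: str) -> int:
    # Heuristic: English text averages roughly 4 characters per token.
    # A real tokenizer (e.g. the model's own) gives exact counts.
    return len(text) // 4

def fits_in_window(prompt: str,
                   window_size: int = 128_000,
                   reserve_for_output: int = 4_096) -> bool:
    # The window covers BOTH input and output, so reserve room
    # for the model's response when checking the prompt size.
    return estimate_tokens(prompt) + reserve_for_output <= window_size

report = "word " * 25_000   # roughly a 50-page report (~31k estimated tokens)
print(fits_in_window(report))            # fits, with room to respond
print(fits_in_window("x" * 600_000))     # ~150k estimated tokens: too large
```

The key design point is the output reserve: a prompt that technically fits but leaves no room for generation will truncate the response.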