A context window in artificial intelligence, specifically in language models like ChatGPT, refers to the amount of text, measured in tokens, that the model can “see” or consider at one time when it’s generating a response.
Imagine you’re reading a long story, but you can only remember the last few sentences you’ve read. Those last few sentences are kind of like your “context window” for understanding the story. You use them to make sense of what’s happening now in the story and to guess what might happen next.
In the case of AI models, the size of the context window can vary. For example, GPT-3, one of OpenAI’s language models, has a context window of 2048 tokens (a token can be as short as one character or as long as one word). That means when it’s generating a response, it looks at the last 2048 tokens of text to decide what to say next. If the conversation is longer than 2048 tokens, GPT-3 won’t “remember” the earlier parts.
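The “remembering only the most recent tokens” behavior described above can be sketched in a few lines of Python. This is purely illustrative: the whitespace split stands in for a real subword tokenizer, and the function name and window size are made up for the example.

```python
def truncate_to_context_window(tokens, window_size=2048):
    """Keep only the most recent `window_size` tokens.

    Anything earlier falls outside the context window and is
    effectively "forgotten" by the model.
    """
    if window_size <= 0:
        return []
    return tokens[-window_size:]

# Toy example: split on whitespace as a stand-in tokenizer
# (real models like GPT-3 use subword tokenizers, e.g. BPE).
conversation = "the quick brown fox jumps over the lazy dog".split()
visible = truncate_to_context_window(conversation, window_size=4)
print(visible)  # → ['over', 'the', 'lazy', 'dog']
```

With `window_size=4`, the first five words of the sentence are dropped, just as GPT-3 drops whatever falls before its most recent 2048 tokens.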