An AI start-up recently released a model with a standout feature: a context window of over 1 million tokens – well above what most commercial AI systems offer.
But why is a large context window important – and what does that mean for your business?
A large language model (LLM), such as the one behind OpenAI’s ChatGPT, is at heart a simple input/output system: you type in text, the model processes it, and it returns a text-based response. As the conversation progresses, the model takes more and more of the preceding text into account, which helps it generate more relevant and accurate responses.
The amount of text an LLM can consider as it generates a response is called its “context window”. Think of it like the size of a stage: a coffee-house stage might barely fit a single artist, whereas a giant stadium can host a wide variety of performances with huge video screens, giant props, and a squad of dancers. Where a stage is measured in square feet, a context window is measured in “tokens”, with each token roughly equivalent to a word or a punctuation mark. For instance, the sentence “this is a sentence.” contains five tokens (‘this’, ‘is’, ‘a’, ‘sentence’, ‘.’).
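For the technically curious, here is a minimal sketch of how token counting works in practice, using OpenAI’s open-source tiktoken library. The exact count depends on the tokenizer a given model uses; the “cl100k_base” encoding used below is just one common choice, so treat the output as illustrative rather than definitive.

```python
# Minimal sketch: counting tokens with the open-source tiktoken library.
# The exact count varies by tokenizer; "cl100k_base" is one common encoding.
import tiktoken

encoding = tiktoken.get_encoding("cl100k_base")

text = "this is a sentence."
tokens = encoding.encode(text)  # list of integer token IDs

print(len(tokens))                             # how many tokens the text uses
print([encoding.decode([t]) for t in tokens])  # the individual token strings
```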
The original ChatGPT, released in November 2022 and based on GPT-3.5, had a context window of roughly 4,000 tokens, meaning it could remember a short conversation or a few pages of text before it started forgetting older information. That is enough for applications like short chats, code snippets, or answering general knowledge questions, but little more.
GPT-4 Turbo, OpenAI’s most advanced model today, has a context window of 128,000 tokens, meaning it can consider text equivalent to a typical airport paperback. A window this size could be used for applications like analyzing lengthy legal documents or working on medium-sized codebases.
A context window of 1M tokens is enormous by today’s standards and, to date, has only been available commercially via Google’s Gemini AI. 1M tokens is equivalent to roughly four copies of Moby Dick, and could enable applications like reviewing large software code bases, synthesizing information from a large number of documents, or analyzing massive data sets for trends and insights.
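As a rough, back-of-the-envelope illustration, here is how you might estimate whether a pile of documents fits inside a 1M-token window. The 1.3 tokens-per-word ratio is a common rule of thumb, and the document names and word counts are hypothetical; real counts depend on the specific model’s tokenizer.

```python
# Back-of-the-envelope sketch: does a set of documents fit in a 1M-token window?
# Assumes roughly 1.3 tokens per English word (a rule of thumb, not a measurement).
TOKENS_PER_WORD = 1.3
CONTEXT_WINDOW = 1_000_000

def estimated_tokens(word_count: int) -> int:
    """Approximate token count for a document of the given length in words."""
    return int(word_count * TOKENS_PER_WORD)

# Hypothetical document lengths (in words), for illustration only.
documents = {"annual_report": 60_000, "contract_archive": 450_000, "novel": 210_000}

total = sum(estimated_tokens(words) for words in documents.values())
verdict = "fits" if total <= CONTEXT_WINDOW else "does not fit"
print(f"Estimated total: {total:,} tokens ({verdict} in a 1M-token window)")
```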
Just as in business, every AI use case has its own requirements, and the right solution means balancing cost against capability. With the newest innovations, the ability of AI models to process ever greater amounts of text opens up a whole range of exciting new opportunities.
Liked this post? Get more straight to your inbox by subscribing to the AI For Business newsletter: https://buttondown.email/AI_For_Business
Want to stay informed on the most important AI news in just 30 minutes a month? Sign up for the Executive Summary newsletter by AI For Business. Thank you!
#LLM #LLMs #AI #GenAI #GenerativeAI #Explainer