The previous Claude 2 was launched last July with a whopping 100,000-token (100K) context window, which allows for longer input and output than the free version of ChatGPT. That capacity means users can exchange up to around 75,000 words in each conversation. The latest version currently available, Claude 3, can handle about 150,000 words thanks to its 200K-token context window, giving it an even better ability to understand context in conversations.
Claude’s 200K context far exceeds the 4K context of GPT-3.5, the model behind the free version of ChatGPT. Context enables LLMs to generate nuanced, natural language by leveraging the contextual relationships between words and phrases learned from the massive datasets used to train the models.
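For a rough sense of how those token counts translate into words, the sketch below applies a common rule of thumb of about 0.75 English words per token, the same ratio implied by the 100K-token/75,000-word figure above. The exact ratio is an assumption and varies with the text and the tokenizer.

```python
# Rough rule of thumb: one token is roughly 0.75 English words.
# The exact ratio depends on the text and the tokenizer used.
def tokens_to_words(tokens: int, words_per_token: float = 0.75) -> int:
    """Approximate how many English words fit in a given token budget."""
    return round(tokens * words_per_token)

print(tokens_to_words(100_000))  # ~75,000 words  (Claude 2's context window)
print(tokens_to_words(200_000))  # ~150,000 words (Claude 3's context window)
print(tokens_to_words(4_096))    # ~3,000 words   (GPT-3.5's 4K context)
```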
In simple terms, context is the background information, such as the back-and-forth from earlier in a chat, previous conversations, and user preferences, that gives the AI bot a better understanding of what’s happening. That might mean keeping track of a long conversation or applying a user’s settings to its responses. Typically, the larger the context, the more accurate the information in a conversation.
Context helps the AI chatbot understand whether a user who mentions a “bat,” for example, is referring to the piece of sports equipment or the winged animal.
Claude’s large context means it can parse and summarize long documents, including scientific and medical studies, books, and reports. It also means Claude can generate long passages of text, up to several thousand words.
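As an illustration of that long-document use case, here is a minimal sketch using Anthropic’s Python SDK to summarize a lengthy report in a single request; the file name is a placeholder and the model identifier is an assumption that may differ from the one you have access to.

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Load a long document; the large context window lets the full text
# fit into a single request alongside the summarization prompt.
with open("long_report.txt", "r", encoding="utf-8") as f:
    document = f.read()

response = client.messages.create(
    model="claude-3-opus-20240229",  # assumed model name for illustration
    max_tokens=1024,
    messages=[
        {
            "role": "user",
            "content": f"Summarize the key findings of this report:\n\n{document}",
        }
    ],
)

print(response.content[0].text)
```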
Source: 4 things Claude AI can do that ChatGPT can't