Anthropic leapfrogs OpenAI with a chatbot that can read a novel in under a minute

An often-overlooked limitation of chatbots is memory. While it's true that the AI language models that power these systems are trained on terabytes of text, the amount of text these systems can process when in use (that is, the combined text of input and output, also known as a "context window") is limited. For ChatGPT, it's around 3,000 words. There are ways around this, but it's still not a huge amount of information to play with.

Now, AI startup Anthropic (founded by former OpenAI engineers) has hugely expanded the context window of its chatbot Claude, pushing it to around 75,000 words. As the company notes in a blog post, that's enough to process the entire text of The Great Gatsby in one go. In fact, the company tested the system by doing just that: editing a single sentence in the novel and asking Claude to spot the change. It did so in 22 seconds.

You may have noticed that I'm being imprecise when describing the length of these context windows. That's because AI language models measure information not by number of characters or words, but by tokens: a semantic unit that doesn't map exactly onto these more familiar quantities. It makes sense when you think about it. After all, words can be long or short, and their length doesn't necessarily correspond to the complexity of their meaning. (Some of the longest definitions in the dictionary belong to shorter words.) The use of "tokens" reflects this fact, so, to be more precise: Claude's context window can now handle 100,000 tokens, up from 9,000 before. By comparison, OpenAI's GPT-4 handles around 8,000 tokens (and that's not the standard model available in ChatGPT; you have to pay for access), while a limited-availability, full-fat version of GPT-4 can handle up to 32,000 tokens.
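There's no single exact conversion between words and tokens, but a common rule of thumb for English text is roughly four characters (about three-quarters of a word) per token. The sketch below uses that ratio as an assumption; a real tokenizer (such as a BPE-based one) splits text quite differently, so this is only a back-of-the-envelope estimate:

```python
# Rough token estimation, assuming the common heuristic of roughly
# four characters of English text per token. This is an approximation,
# not how any real tokenizer actually works.

def estimate_tokens(text: str) -> int:
    """Estimate the token count of `text` at ~4 characters per token."""
    return max(1, len(text) // 4)

def fits_context(text: str, window_tokens: int = 100_000) -> bool:
    """Check whether `text` would (approximately) fit in a context
    window of `window_tokens` tokens."""
    return estimate_tokens(text) <= window_tokens
```

By this rough measure, a 75,000-word novel (on the order of 300,000 characters) comes out near the 100,000-token mark, which is why it fits in Claude's new window but is far beyond an 8,000-token one.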


For now, Claude's new capacity is only available to Anthropic's business partners, who access the chatbot via the company's API. Pricing is also unknown, but it's sure to be a significant bump: processing more text means spending more on compute.

But the news shows that the capacity of AI language models to process information is increasing, and that will certainly make these systems more useful. As Anthropic notes, it takes a human around five hours to read 75,000 words of text, but with Claude's expanded context window, it can take on the task of reading, summarizing, and analyzing long documents in a matter of minutes. (Though it doesn't do anything about chatbots' persistent tendency to make up information.) A bigger context window also means the system can hold longer conversations. One factor in chatbots going off the rails is that when their context window fills up, they forget what has been said; that's why Microsoft's Bing chatbot is limited to 20 turns of conversation. More context means more conversation.
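That forgetting behavior can be pictured as a sliding window over the conversation history: once the token budget is exceeded, the oldest turns are dropped first. A minimal illustration of that idea, assuming a crude one-token-per-word estimate in place of a real tokenizer:

```python
# A minimal sketch of why chatbots "forget": when the conversation
# history exceeds the context window, the oldest turns are dropped.
# Token counts are approximated by word count here; real systems use
# a proper tokenizer.

def truncate_history(turns: list[str], window_tokens: int) -> list[str]:
    """Keep the most recent turns whose combined (approximate) token
    count fits within `window_tokens`, dropping the oldest first."""
    kept: list[str] = []
    used = 0
    for turn in reversed(turns):  # walk backward from the newest turn
        cost = len(turn.split())  # crude one-token-per-word estimate
        if used + cost > window_tokens:
            break  # everything older than this no longer fits
        kept.append(turn)
        used += cost
    return list(reversed(kept))  # restore chronological order
```

With a larger `window_tokens` budget, fewer early turns get cut, which is exactly why a bigger context window translates into longer coherent conversations.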
