Anthropic’s Claude AI model can now handle longer prompts

Anthropic is increasing the amount of data that enterprise customers can send to Claude in a single prompt, part of an effort to attract more developers to the company’s popular AI coding models.

For Anthropic’s API customers, the company’s Claude Sonnet 4 AI model now has a 1 million token context window, meaning the AI can handle requests as long as 750,000 words, more than the entire Lord of the Rings trilogy, or 75,000 lines of code. That’s roughly five times Claude’s previous limit (200,000 tokens), and more than double the 400,000-token context window offered by OpenAI’s GPT-5.

Long context will also be available for Claude Sonnet 4 through Anthropic’s cloud partners, including on Amazon Bedrock and Google Cloud’s Vertex AI.
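
For developers, a rough sketch of what a long-context request to Claude Sonnet 4 could look like with Anthropic’s Python SDK is shown below. The model ID and the long-context beta flag here are assumptions for illustration; Anthropic’s API documentation is the authority on the exact names.

# Minimal sketch: sending a very large prompt to Claude Sonnet 4.
# Assumes the `anthropic` Python SDK is installed and ANTHROPIC_API_KEY is set;
# the model ID and beta flag below are assumptions, check Anthropic's docs.
from anthropic import Anthropic

client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Hypothetical file containing a dump of an entire codebase
with open("entire_project.txt") as f:
    project_source = f.read()

response = client.beta.messages.create(
    model="claude-sonnet-4-20250514",   # assumed Sonnet 4 model ID
    betas=["context-1m-2025-08-07"],    # assumed long-context beta flag
    max_tokens=4096,
    messages=[{
        "role": "user",
        "content": f"Here is my project:\n\n{project_source}\n\nAdd a logout feature.",
    }],
)

print(response.content[0].text)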

Anthropic has built one of the largest enterprise businesses among AI model developers, largely by selling Claude to AI coding platforms such as Microsoft’s GitHub Copilot, Windsurf, and Anysphere’s Cursor. While Claude has become the model of choice among developers, GPT-5 may threaten Anthropic’s dominance with its competitive pricing and strong coding performance. Anysphere CEO Michael Truell even helped OpenAI announce the launch of GPT-5, which is now the default AI model for new users in Cursor.

Anthropic’s product lead for the Claude platform, Brad Abrams, told iinfoai in an interview that he expects AI coding platforms to get a “lot of benefit” from this update. When asked whether GPT-5 has put a dent in Claude’s API usage, Abrams downplayed the concern, saying he’s “really happy with the API business and the way it’s been growing.”

While OpenAI generates most of its revenue from consumer subscriptions to ChatGPT, Anthropic’s business centers on selling AI models to enterprises through an API. That has made AI coding platforms a key customer for Anthropic, and is likely why the company is throwing in new perks to attract users in the face of GPT-5.

Last week, Anthropic unveiled an updated version of its largest AI model, Claude Opus 4.1, which pushed the company’s AI coding capabilities a bit further.

Generally speaking, AI models tend to perform better on all tasks when they have more context, but especially on software engineering problems. For example, if you ask an AI model to spin up a new feature in your app, it’s likely to do a better job if it can see the entire project rather than just a small section.

Abrams also told iinfoai that Claude’s large context window helps it perform better at long agentic coding tasks, in which the AI model works autonomously on a problem for minutes or hours. With a large context window, Claude can remember all of its previous steps in long-horizon tasks.

But some companies have taken large context windows to an extreme, claiming their AI models can process enormous prompts. Google offers a 2 million token context window for Gemini 2.5 Pro, and Meta offers a 10 million token context window for Llama 4 Scout.

Some research suggests there’s a limit to how effective large context windows can be, and that AI models aren’t great at processing these enormous prompts. Abrams said Anthropic’s research team focused on increasing not just Claude’s context window but its “effective context window,” suggesting the AI can understand most of the information it’s given. However, he declined to reveal Anthropic’s exact techniques.

When prompts to Claude Sonnet 4 exceed 200,000 tokens, Anthropic will charge API users more: $6 per million input tokens and $22.50 per million output tokens, up from $3 per million input tokens and $15 per million output tokens.
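
As a rough illustration of how that tiered pricing adds up, here is a hypothetical calculation that assumes the higher rate applies to the entire request once the prompt crosses 200,000 tokens:

# Hypothetical cost sketch using the per-million-token rates quoted above,
# assuming the long-context rate applies to the whole request past 200,000 tokens.
def request_cost(input_tokens: int, output_tokens: int) -> float:
    if input_tokens > 200_000:
        input_rate, output_rate = 6.00, 22.50   # USD per million tokens (long context)
    else:
        input_rate, output_rate = 3.00, 15.00   # USD per million tokens (standard)
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# Example: a 500,000-token prompt with a 20,000-token reply costs about $3.45
print(f"${request_cost(500_000, 20_000):.2f}")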
