Chinese tech company Alibaba on Monday launched Qwen3, a family of AI models the company claims matches, and in some cases outperforms, the best models available from Google and OpenAI.

Most of the models are, or soon will be, available for download under an “open” license from AI dev platform Hugging Face and from GitHub. They range in size from 0.6 billion parameters to 235 billion parameters. Parameters roughly correspond to a model’s problem-solving skills, and models with more parameters generally perform better than those with fewer parameters.

The rise of China-originated model series like Qwen has increased the pressure on American labs such as OpenAI to deliver more capable AI technologies. It has also led policymakers to implement restrictions aimed at limiting the ability of Chinese AI companies to obtain the chips necessary to train models.
Introducing Qwen3!
We release and open-weight Qwen3, our latest large language models, including 2 MoE models and 6 dense models, ranging from 0.6B to 235B. Our flagship model, Qwen3-235B-A22B, achieves competitive results in benchmark evaluations of coding, math, general… pic.twitter.com/JWZkJeHWhC
— Qwen (@Alibaba_Qwen) April 28, 2025
According to Alibaba, Qwen3 models are “hybrid” models in the sense that they can take time to “reason” through complex problems or answer simpler requests quickly. Reasoning enables the models to effectively fact-check themselves, similar to models like OpenAI’s o3, but at the cost of higher latency.
“We have seamlessly integrated thinking and non-thinking modes, offering users the flexibility to control the thinking budget,” wrote the Qwen team in a blog post. “This design enables users to configure task-specific budgets with greater ease.”
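As a concrete illustration, the sketch below shows how a downloaded Qwen3 checkpoint might be run with Hugging Face’s transformers library, toggling the thinking mode through an enable_thinking flag in the chat template; the model name, prompt, and generation settings are placeholders rather than anything from Alibaba’s announcement.

```python
# Minimal sketch: toggling Qwen3's thinking mode via Hugging Face transformers.
# Assumes the Qwen3 chat template accepts an `enable_thinking` flag, per Qwen's published usage notes.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen3-32B"  # placeholder; any downloadable Qwen3 checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto", device_map="auto")

messages = [{"role": "user", "content": "How many prime numbers are there below 100?"}]

# enable_thinking=True lets the model "reason" step by step before answering;
# enable_thinking=False returns a quick, direct reply at lower latency.
prompt = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True,
    enable_thinking=True,
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=1024)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```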
Some of the models also adopt a mixture-of-experts (MoE) architecture, which can be more computationally efficient when answering queries. MoE breaks tasks down into subtasks and delegates them to smaller, specialized “expert” models.
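To make the routing idea concrete, here is a minimal top-k MoE layer in PyTorch. It illustrates the general technique only, not Qwen3’s actual implementation, and all layer sizes are made up.

```python
# Illustrative mixture-of-experts layer (generic technique, not Qwen3's code):
# a router scores each token, and only the top-k "expert" feed-forward networks
# run for that token, so per-token compute stays small even with many experts.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    def __init__(self, d_model=512, d_ff=2048, num_experts=8, k=2):  # sizes are placeholders
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, num_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        scores = self.router(x)                     # (tokens, num_experts)
        weights, idx = scores.topk(self.k, dim=-1)  # keep only the top-k experts per token
        weights = F.softmax(weights, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e            # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

moe = TopKMoE()
tokens = torch.randn(4, 512)
print(moe(tokens).shape)  # torch.Size([4, 512])
```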
The Qwen3 models support 119 languages, Alibaba says, and were trained on a dataset of nearly 36 trillion tokens. Tokens are the raw bits of data that a model processes; 1 million tokens is equivalent to about 750,000 words. Alibaba says that Qwen3 was trained on a combination of textbooks, “question-answer pairs,” code snippets, AI-generated data, and more.

These improvements, along with others, greatly boosted Qwen3’s capabilities compared with its predecessor, Qwen2, Alibaba says. None of the Qwen3 models is head and shoulders above the best recent models, such as OpenAI’s o3 and o4-mini, but they are strong performers nonetheless.

On Codeforces, a platform for programming contests, the largest Qwen3 model, Qwen-3-235B-A22B, just beats out OpenAI’s o3-mini and Google’s Gemini 2.5 Pro. Qwen-3-235B-A22B also bests o3-mini on the latest version of AIME, a challenging math benchmark, and on BFCL, a test for assessing a model’s ability to “reason” about problems.
But Qwen-3-235B-A22B isn’t publicly available, at least not yet.

The largest public Qwen3 model, Qwen3-32B, is still competitive with a number of proprietary and open AI models, including Chinese AI lab DeepSeek’s R1. Qwen3-32B surpasses OpenAI’s o1 model on several tests, including the coding benchmark LiveCodeBench.

Alibaba says Qwen3 “excels” in tool-calling capabilities as well as in following instructions and copying specific data formats. In addition to the downloadable models, Qwen3 is available from cloud providers including Fireworks AI and Hyperbolic.
Tuhin Srivastava, co-founder and CEO of AI cloud host Baseten, said that Qwen3 is another point in the trend line of open models keeping pace with closed-source systems such as OpenAI’s.

“The U.S. is doubling down on restricting sales of chips to China and purchases from China, but models like Qwen 3 that are state-of-the-art and open … will undoubtedly be used domestically,” he told iinfoai. “It reflects the reality that businesses are both building their own tools [as well as] buying off the shelf via closed-model companies like Anthropic and OpenAI.”