Alibaba Qwen3 officially released: models down to 0.6B parameters, challenging Gemini-2.5-Pro

The Alibaba Qwen3 series challenges the industry's top models, offering a new AI option that balances performance and efficiency.
Core content:
1. Parameter scales and performance characteristics of the Qwen3 series
2. Benchmark comparisons against top industry models
3. Performance of the small MoE model and the Dense models
Alibaba has announced the launch of the Qwen3 series.
Weights for two MoE models have been open-sourced: Qwen3-235B-A22B, a large model with 235 billion total parameters and 22 billion activated parameters, and Qwen3-30B-A3B, a smaller MoE model with about 30 billion total parameters and 3 billion activated parameters.
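The model names encode exactly the two figures above: the first number is the total parameter count, and the "A" suffix is the activated parameter count. A minimal sketch of reading both out of a name (this parser is illustrative, not an official Qwen tool):

```python
import re

def parse_moe_name(name: str) -> tuple[float, float]:
    """Extract (total_params_B, active_params_B) from names like 'Qwen3-235B-A22B'."""
    m = re.search(r"-(\d+(?:\.\d+)?)B-A(\d+(?:\.\d+)?)B$", name)
    if not m:
        raise ValueError(f"not a MoE-style model name: {name}")
    return float(m.group(1)), float(m.group(2))

print(parse_moe_name("Qwen3-235B-A22B"))  # (235.0, 22.0)
print(parse_moe_name("Qwen3-30B-A3B"))    # (30.0, 3.0)
```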
In addition, six Dense models have been open-sourced: Qwen3-32B, Qwen3-14B, Qwen3-8B, Qwen3-4B, Qwen3-1.7B, and Qwen3-0.6B, all under the Apache 2.0 license.
You can try it for free here:
chat.qwen.ai
According to Qwen's official statement, the flagship model Qwen3-235B-A22B achieved highly competitive results in benchmarks covering code, mathematics, and general capability when compared with top models such as DeepSeek-R1, o1, o3-mini, Grok-3, and Gemini-2.5-Pro.
In addition, the small MoE model Qwen3-30B-A3B activates only about 10% as many parameters as QwQ-32B yet performs better, and even a small model like Qwen3-4B can match the performance of Qwen2.5-72B-Instruct.
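The roughly 10% figure follows directly from the two parameter counts quoted above: Qwen3-30B-A3B activates about 3 billion parameters per forward pass, while the dense QwQ-32B runs all 32 billion. A quick check:

```python
# Activated parameters per forward pass, in billions.
qwen3_30b_a3b_active = 3.0   # MoE: only the routed experts run per token
qwq_32b_active = 32.0        # dense: every parameter runs per token

ratio = qwen3_30b_a3b_active / qwq_32b_active
print(f"{ratio:.1%}")  # 9.4%
```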
The full lineup at a glance: [chart from the original post]