480B total / 35B active (MoE with 160 experts, 8 active per token)
Apache 2.0
Advanced agentic coding model with 256K native context (extendable to 1M); pretrained on 7.5T tokens (70% code ratio); long-horizon RL across 20K parallel environments; SOTA on SWE-Bench Verified; supports 100+ programming languages; ships with the Qwen Code CLI tool.
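The "160 experts, 8 active" figure refers to top-k MoE routing: per token, a gating network scores all experts and only the k highest-scoring ones run, which is why just ~35B of the 480B parameters are active per forward pass. A minimal sketch of generic top-k routing follows (illustrative only; this is not Qwen3-Coder's actual router implementation, and the gate logits here are synthetic):

```python
import math

def top_k_route(logits, k=8):
    """Pick the k highest-scoring experts and softmax-normalize their
    gate weights over just those k (standard top-k MoE routing sketch)."""
    top = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:k]
    m = max(logits[i] for i in top)  # subtract max for numerical stability
    exp = [math.exp(logits[i] - m) for i in top]
    z = sum(exp)
    return [(i, w / z) for i, w in zip(top, exp)]

# 160 experts per MoE layer, 8 active per token, matching the spec above;
# sin() stands in for real gate logits.
gates = top_k_route([math.sin(i * 0.7) for i in range(160)], k=8)
assert len(gates) == 8                                   # 8 experts selected
assert abs(sum(w for _, w in gates) - 1.0) < 1e-9        # weights sum to 1
```

The selected experts' outputs would then be combined as a weighted sum using these normalized gate weights.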