
Qwen3-Coder

Qwen3-Coder is Alibaba's 480B-parameter (35B active) Mixture-of-Experts (MoE) model, engineered for state-of-the-art agentic coding, tool use, and repository-scale reasoning.

This is the Qwen team's most advanced agentic code model: Qwen3-Coder-480B-A35B-Instruct, built on a Mixture-of-Experts architecture. It delivers significant gains in autonomous software engineering, rivaling models such as Claude Sonnet 4. Key features include a 256K-token native context window (extensible to 1M tokens) for whole-repository analysis and a 7.5T-token training corpus, roughly 70% of it code. The model excels at multi-turn programming, debugging, and cross-file reasoning across languages such as Python, Java, and Rust, and the Qwen team optimized it for real-world tasks via long-horizon reinforcement learning (Agent RL).
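As a rough illustration of the agentic tool-use workflow described above: Qwen3-Coder is typically served behind an OpenAI-compatible chat-completions endpoint, where a coding-agent request pairs chat messages with JSON-schema tool definitions the model may call. The payload below is a hedged sketch, not an official client; the `run_python` tool name and its schema are illustrative assumptions.

```python
import json

# Model identifier as published on the Hugging Face Hub.
MODEL = "Qwen/Qwen3-Coder-480B-A35B-Instruct"

def build_tool_call_request(user_prompt: str) -> dict:
    """Build a chat-completion request exposing one hypothetical
    code-execution tool to the model (OpenAI-compatible payload shape)."""
    return {
        "model": MODEL,
        "messages": [
            {"role": "system", "content": "You are a coding agent."},
            {"role": "user", "content": user_prompt},
        ],
        # JSON-schema tool definition; the model can respond with a
        # tool call instead of plain text when it wants to run code.
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "run_python",  # illustrative tool, not built in
                    "description": "Execute a Python snippet and return stdout.",
                    "parameters": {
                        "type": "object",
                        "properties": {"code": {"type": "string"}},
                        "required": ["code"],
                    },
                },
            }
        ],
    }

request = build_tool_call_request("Write and test a function that reverses a list.")
print(json.dumps(request, indent=2))
```

In a real agent loop, the serving layer would return either a final answer or a `tool_calls` entry; the agent executes the tool, appends the result as a `tool` message, and re-queries the model, repeating across turns.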

https://github.com/QwenLM/Qwen3-Coder
2 projects · 2 cities

