China)
MIT
1T (MoE)
Flagship trillion-parameter general-purpose model; hybrid Syntax–Function–Aesthetics reward for code generation; strong in maths and coding; base for the Ling family; pretrained on massive data for broad capabilities.
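A minimal sketch of what a hybrid Syntax–Function–Aesthetics reward for code generation could look like: a weighted blend of a syntax-validity check, a functional-correctness score, and a style score. The weights, the Python-only `ast.parse` syntax check, and the function signature are illustrative assumptions, not the model's actual reward design.

```python
import ast

def hybrid_code_reward(source: str, tests_passed: float, style_score: float,
                       w_syntax: float = 0.2, w_function: float = 0.6,
                       w_aesthetics: float = 0.2) -> float:
    """Illustrative hybrid reward; weights and components are assumptions."""
    # Syntax: 1.0 if the candidate parses as valid Python, else 0.0.
    try:
        ast.parse(source)
        syntax = 1.0
    except SyntaxError:
        syntax = 0.0
    # Function: fraction of unit tests passed, supplied by the evaluation harness.
    # Aesthetics: external style score in [0, 1], e.g. from a linter or judge model.
    return w_syntax * syntax + w_function * tests_passed + w_aesthetics * style_score

# Example: syntactically valid snippet passing 80% of tests with a decent style score.
print(hybrid_code_reward("def add(a, b):\n    return a + b", 0.8, 0.7))
```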