INTELLECT-3

Prime Intellect (USA) · November 26, 2025

Parameters

106B total / 12B active (MoE)

License

MIT

Key Features

Post-trained from GLM-4.5-Air-Base with SFT and RL; trained on 512 H200 GPUs using the prime-rl framework; state-of-the-art performance for its size on math (90.8% on AIME 2024), code, and reasoning; fully open-sourced, including the complete RL stack and training environments.

Paper / Source

https://www.primeintellect.ai/blog/intellect-3
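
Since the weights are released openly, a minimal sketch of loading the checkpoint with Hugging Face transformers is shown below. The repository id "PrimeIntellect/INTELLECT-3" is an assumption and may differ from the actual model card path; device_map="auto" requires the accelerate package and enough GPU memory to shard the 106B-total-parameter MoE.

```python
# Minimal loading sketch; the repo id below is an assumption, check the
# official blog post / model card for the actual Hugging Face path.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "PrimeIntellect/INTELLECT-3"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # shard the MoE across available GPUs
)

prompt = "Prove that the sum of two even integers is even."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```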