
Mistral Large 3

Mistral AI (France) · December 2, 2025

Parameters

675B total / 41B active (MoE)

License

Apache 2.0

Key Features

First open-weight frontier model to unify multimodal (text + image) and multilingual capabilities; granular mixture-of-experts (MoE) architecture; 256K-token context window; strong performance in long-document understanding, agentic workflows, coding, and multilingual processing; trained on 3,000 NVIDIA H200 GPUs; ranked #2 among open-source non-reasoning models on LMArena.

Paper / Source

https://mistral.ai/news/mistral-3