DeepSeek V3.2

DeepSeek AI (China), December 1, 2025

Parameters

671B total / 37B active (2 variants: standard, Speciale; MoE)

License

MIT

Key Features

First DeepSeek model to integrate thinking into tool use; hybrid thinking/non-thinking modes. The standard version reaches GPT-5-level performance (93.1% AIME, 92.5% HMMT). The Speciale variant targets extreme reasoning, achieving gold-medal results at IMO/CMO/ICPC/IOI 2025 (99.2% HMMT, 35/42 IMO). Incorporates theorem-proving from Math-V2 and large-scale agent training across 1,800+ environments.
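The 671B-total / 37B-active split comes from the mixture-of-experts design: a gating network selects a small subset of experts per token, so only a fraction of the parameters are active on any forward pass. The sketch below illustrates top-k expert routing in miniature; the expert count, dimensions, and k are illustrative placeholders, not DeepSeek's actual configuration.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Route token vector x to its top-k experts (sparse MoE).
    Only the selected experts' parameters are 'active' for this token,
    analogous to 37B active out of 671B total."""
    logits = gate_w @ x                       # one gating score per expert
    topk = np.argsort(logits)[-k:]            # indices of the k best experts
    weights = np.exp(logits[topk] - logits[topk].max())
    weights /= weights.sum()                  # softmax over selected experts
    return sum(w * experts[i](x) for w, i in zip(weights, topk))

rng = np.random.default_rng(0)
d, n_experts = 8, 16                          # toy sizes for illustration
gate_w = rng.normal(size=(n_experts, d))
# Each expert is a simple linear map here; real experts are FFN blocks.
experts = [(lambda W: (lambda x: W @ x))(rng.normal(size=(d, d)))
           for _ in range(n_experts)]
y = moe_forward(rng.normal(size=d), gate_w, experts, k=2)
```

With 16 experts and k=2, only 2/16 of the expert parameters participate per token; DeepSeek's production routing is far larger and more sophisticated, but the active-parameter principle is the same.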

Paper / Source

https://huggingface.co/deepseek-ai/DeepSeek-V3.2/resolve/main/assets/paper.pdf