LFM2-2.6B-Exp

Liquid AI (USA) · December 25, 2025

Parameters

2.6B (dense)

License

Apache 2.0

Key Features

- Experimental checkpoint built on LFM2-2.6B using pure reinforcement learning
- Hybrid architecture: 10 double-gated short-range convolution blocks + 6 Grouped Query Attention (GQA) blocks
- Specifically trained on instruction following, knowledge, and math
- Achieves 82.41% on GSM8K, 79.56% on IFEval, and 42% on GPQA; IFBench score surpasses DeepSeek R1-0528 (a model 263× larger)
- 3× faster training and 2× faster CPU decode/prefill than Qwen3
- Designed for edge deployment (on-device: smartphones, laptops)
- 32K context window with low KV cache requirements
- Recommended for agentic tasks, data extraction, RAG, creative writing, and multi-turn conversations; explicitly NOT recommended for programming or knowledge-intensive tasks
- Multilingual support: English, Arabic, Chinese, French, German, Japanese, Korean, and Spanish
- Demonstrates that small models can punch above their weight class through superior RL training rather than scale
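The "low KV cache requirements" claim follows from the hybrid layout: only the 6 GQA blocks store keys and values at inference time, while the 10 convolution blocks carry no KV cache, and GQA itself shares KV heads across query heads. A minimal sketch of the arithmetic, assuming hypothetical dimensions (8 KV heads, head dim 64, fp16) that are NOT published in this entry, compared against a hypothetical 16-layer full-attention baseline with 32 KV heads:

```python
def kv_cache_bytes(n_attn_layers: int, n_kv_heads: int, head_dim: int,
                   seq_len: int, dtype_bytes: int = 2) -> int:
    """Total KV cache size: K and V tensors (factor 2) for every
    attention layer, each of shape [seq_len, n_kv_heads, head_dim]."""
    return 2 * n_attn_layers * seq_len * n_kv_heads * head_dim * dtype_bytes

SEQ = 32_768  # full 32K context window

# Hybrid model: only the 6 GQA blocks hold a KV cache (assumed dims).
hybrid = kv_cache_bytes(n_attn_layers=6, n_kv_heads=8, head_dim=64, seq_len=SEQ)

# Hypothetical full-attention baseline: 16 layers, 32 KV heads each.
full = kv_cache_bytes(n_attn_layers=16, n_kv_heads=32, head_dim=64, seq_len=SEQ)

print(f"hybrid: {hybrid / 2**20:.0f} MiB")  # → 384 MiB
print(f"full:   {full / 2**20:.0f} MiB")    # → 4096 MiB
```

Under these assumed dimensions the hybrid design needs roughly a tenth of the KV memory at full context, which is what makes a 32K window practical on smartphones and laptops.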

Paper / Source

https://huggingface.co/LiquidAI/LFM2-2.6B-Exp