DeepSeek R1 Distill 32B
DeepSeek
A reasoning model knowledge-distilled from DeepSeek R1 (671B) into the Qwen2.5-32B base. Its chain-of-thought reasoning outperforms OpenAI o1-mini, and it is the most popular model on Ollama (79M+ pulls).
Text Generation · Local · DeepSeek Family · vR1
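DeepSeek R1 and its distills emit their chain of thought between `<think>` and `</think>` tags before the final answer. A minimal sketch of separating the two when consuming raw model output (the tag convention is R1's documented format; the helper name and sample text are ours):

```python
def split_reasoning(text: str) -> tuple[str, str]:
    """Split an R1-style response into (chain_of_thought, answer).

    R1 distills wrap reasoning in <think>...</think> ahead of the
    final answer; if no tags are present, the whole text is treated
    as the answer.
    """
    start, end = "<think>", "</think>"
    if start in text and end in text:
        pre, _, rest = text.partition(start)
        thought, _, answer = rest.partition(end)
        return thought.strip(), (pre + answer).strip()
    return "", text.strip()

# Example with a made-up response string:
raw = "<think>2+2 is 4.</think>The answer is 4."
thought, answer = split_reasoning(raw)
```

Stripping the `<think>` block before displaying or post-processing output is the usual pattern when the reasoning trace is not wanted downstream.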
Parameters: 32B
Context Window: 131K tokens
Max Output: - tokens
Input Price: - per 1M tokens
Output Price: - per 1M tokens
DeepSeek Family Timeline (13 versions)
DeepSeek R1 Distill 32B (current, local), Jan 2025
Capabilities
👁️ Vision
⚡ Function Calling
📋 JSON Mode
🌊 Streaming
💬 System Prompt
🖥️ Code Execution
🔍 Web Search
🔌 MCP Support
Local Model Specs
Quantization: Q4_K_M
Architecture: Dense (Qwen2.5 base)
Runtime: Ollama / llama.cpp
Disk Size: 20 GB
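The 20 GB download is consistent with 4-bit quantization: disk size is roughly parameter count times average bits per weight. A quick sanity check, where the ~4.85 bits/weight figure for Q4_K_M (a mix of 4- and 6-bit blocks) and the 32.8B parameter count are approximations, not values from this card:

```python
# Rough disk-size estimate for a Q4_K_M GGUF file.
params = 32.8e9         # Qwen2.5-32B parameter count (approximate)
bits_per_weight = 4.85  # typical Q4_K_M average (assumption)

size_gb = params * bits_per_weight / 8 / 1e9  # bits -> bytes -> GB
# Lands near 20 GB, matching the listed disk size.
```

The same arithmetic explains why the FP16 original needs roughly 65 GB: 16 bits per weight instead of ~4.85.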
Details
Release Date: January 22, 2025
Knowledge Cutoff: -
Source: Local
License: Apache 2.0
Model ID: deepseek-r1-distill-32b
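When serving this model through Ollama, DeepSeek recommends sampling around temperature 0.6 with top_p 0.95 and putting all instructions in the user prompt rather than a system prompt. A hypothetical Modelfile pinning those settings (the `deepseek-r1:32b` tag is an assumption about the local library name; check `ollama list` on your machine):

```
# Hypothetical Ollama Modelfile for the R1 32B distill.
# Tag name is an assumption; sampling values follow DeepSeek's
# published recommendations for R1-series models.
FROM deepseek-r1:32b
PARAMETER temperature 0.6
PARAMETER top_p 0.95
```

Build it with `ollama create` and run the resulting model name as usual.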
Last updated: March 13, 2026