DeepSeek R1 32B
By DeepSeek
A high-performance reasoning model that rivals much larger models. Excellent for complex problem solving, mathematics, and code generation.
| Spec | Value |
| --- | --- |
| Parameters | 32B |
| Min VRAM | 20 GB |
| Recommended VRAM | 24 GB |
| Context Length | 128K |
| License | MIT |
🚀 Get Started
Run DeepSeek R1 32B locally with one command:
`ollama run deepseek-r1:32b`

Requires Ollama to be installed.
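If you'd rather call the model from code than from the terminal, Ollama also exposes a local REST API. Below is a minimal Python sketch using its `/api/generate` endpoint; it assumes the Ollama server is running on its default port (11434) and that the model has already been pulled. The `ask` helper name is ours, not part of any library.

```python
import json
import urllib.request

def ask(prompt: str, model: str = "deepseek-r1:32b") -> str:
    """Send one non-streaming prompt to the local Ollama server."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single complete response, not a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",  # Ollama's default local endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("Prove that the sum of two even integers is even."))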
📊 Benchmarks
| Benchmark | Score |
| --- | --- |
| MMLU | 75.8 |
| GSM8K | 90.3 |
| HumanEval | 79.1 |
💻 Hardware Recommendations
🟢 Minimum
A GPU with 20 GB of VRAM, or 40+ GB of system RAM for CPU-only inference
Expect noticeably slower generation in CPU mode
🔵 Recommended
A GPU with 24 GB of VRAM
Fast generation with headroom for longer contexts
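If you're unsure whether your GPU clears the 20 GB minimum, a quick check like the sketch below can help. It is NVIDIA-only and assumes `nvidia-smi` is on your PATH; the threshold is simply the minimum quoted above.

```python
import subprocess

def total_vram_gb() -> list[float]:
    """Report total VRAM per GPU by querying nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # nvidia-smi reports memory in MiB; convert to GiB.
    return [int(line) / 1024 for line in out.strip().splitlines()]

if __name__ == "__main__":
    for i, gb in enumerate(total_vram_gb()):
        verdict = "meets the 20 GB minimum" if gb >= 20 else "below the 20 GB minimum"
        print(f"GPU {i}: {gb:.1f} GiB ({verdict})")
```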