DeepSeek R1 14B
Mid-size reasoning model with excellent chain-of-thought capabilities. It balances performance and resource requirements well for serious local AI work.
Parameters: 14B
Min VRAM: 10 GB
Recommended VRAM: 16 GB
Context Length: 128K tokens
License: MIT
🚀 Get Started
Run DeepSeek R1 14B locally with one command:
```
ollama run deepseek-r1:14b
```

Requires Ollama installed.
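Once the model is pulled, you can also query it programmatically through Ollama's local HTTP API (served on port 11434 by default). Below is a minimal sketch using Python's requests library; the prompt and the num_ctx value are illustrative, not part of this model card. Note that R1-family models emit their chain of thought inside <think>...</think> tags, which you may want to strip from the final answer.

```python
import requests

# Minimal sketch: query the locally running Ollama server (default port 11434).
# The prompt and num_ctx value below are illustrative examples.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "deepseek-r1:14b",
        "prompt": "Explain chain-of-thought prompting in one paragraph.",
        "stream": False,
        # Ollama's default context window is much smaller than the model's
        # 128K limit; raise it explicitly for long prompts (uses more VRAM).
        "options": {"num_ctx": 8192},
    },
    timeout=300,
)
text = response.json()["response"]

# R1-family models wrap their reasoning in <think>...</think> tags;
# split it off if you only want the final answer.
answer = text.split("</think>")[-1].strip()
print(answer)
```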
📊 Benchmarks
MMLU: 69.2
GSM8K: 83.1
HumanEval: 72.6
💻 Hardware Recommendations
🟢 Minimum
GPU with 10 GB VRAM, or 20+ GB system RAM (CPU mode)
Expect slower generation in CPU mode.
🔵 Recommended
GPU with 16 GB VRAM
Fast generation with headroom for longer contexts.
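To see which tier your machine falls into, you can read the GPU's total memory before pulling the model. The sketch below assumes an NVIDIA GPU and uses nvidia-smi; the 10 GB and 16 GB thresholds come from the recommendations above.

```python
import subprocess

# Sketch: read total VRAM via nvidia-smi (NVIDIA GPUs only) and map it to
# the tiers above. Thresholds (10 GB / 16 GB) come from this model card.
def total_vram_gb() -> float:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    # nvidia-smi prints one line per GPU, in MiB; use the first GPU.
    return int(out.splitlines()[0]) / 1024

vram = total_vram_gb()
if vram >= 16:
    print(f"{vram:.1f} GB VRAM: recommended tier, fast generation")
elif vram >= 10:
    print(f"{vram:.1f} GB VRAM: minimum tier, should run")
else:
    print(f"{vram:.1f} GB VRAM: below minimum; expect CPU mode and slow output")
```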