DeepSeek R1 70B
The full DeepSeek R1 distilled into a 70B-parameter model. State-of-the-art open-weight reasoning that competes with GPT-4 and Claude on math and coding benchmarks.
Parameters 70B
Min VRAM 40 GB
Recommended VRAM 48 GB
Context Length 128K
License MIT
Get Started
Run DeepSeek R1 70B locally with one command:
ollama run deepseek-r1:70b
Requires Ollama to be installed.
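If you prefer to call the model programmatically, Ollama also exposes a local HTTP API (on port 11434 by default). The snippet below is a minimal sketch in Python using only the standard library; the prompt, the num_ctx value, and the <think>-tag handling are illustrative assumptions rather than part of this page.

```python
# Minimal sketch: querying the locally served model through Ollama's HTTP API.
# Assumes `ollama serve` is running on the default port (11434) and that the
# model has already been pulled via `ollama run deepseek-r1:70b`.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/chat"

payload = {
    "model": "deepseek-r1:70b",
    "messages": [
        {"role": "user", "content": "What is the sum of the first 100 positive integers?"}
    ],
    "stream": False,
    # num_ctx sets the context window Ollama allocates; raising it toward the
    # model's 128K limit increases VRAM/RAM use considerably.
    "options": {"num_ctx": 8192},
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())["message"]["content"]

# R1-style models typically wrap their chain of thought in <think>...</think>
# tags; strip it so only the final answer prints (adjust if your build differs).
answer = reply.split("</think>")[-1].strip()
print(answer)
```

Raising options.num_ctx lets you use more of the 128K context window, at the cost of additional memory.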
Benchmarks
Benchmark   Score
MMLU        79.8
GSM8K       94.3
HumanEval   85.9
Hardware Recommendations
Minimum
40 GB VRAM GPU or 80+ GB RAM (CPU mode)
Expect slower generation in CPU mode
Recommended
48 GB VRAM GPU
Fast generation with room for context
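As a rough sanity check on the figures above, here is a back-of-envelope sketch (in Python) of how the VRAM requirement falls out of the parameter count. It assumes quantized weights (4-bit is typical for Ollama builds) plus a few GB of headroom for the KV cache and runtime buffers; the exact overhead is an assumption and grows with context length.

```python
# Back-of-envelope VRAM estimate: weights plus a rough allowance for the
# KV cache and runtime buffers. A sketch, not a precise sizing tool.
def estimate_vram_gb(params_billion: float, bits_per_weight: float, overhead_gb: float = 4.0) -> float:
    weight_gb = params_billion * 1e9 * (bits_per_weight / 8) / 1e9
    return weight_gb + overhead_gb

for bits in (4, 5, 8, 16):
    print(f"{bits}-bit: ~{estimate_vram_gb(70, bits):.0f} GB")
# Prints roughly: 4-bit ~39 GB, 5-bit ~48 GB, 8-bit ~74 GB, 16-bit ~144 GB
```

The 4-bit row lands near the 40 GB minimum; the extra headroom in the 48 GB recommendation is what buys longer contexts, since the KV cache grows with the context length you actually use.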
Community Sentiment
80% positive, based on 4 recent discussions. The community appears generally positive about DeepSeek R1 70B.
Similar Models
Related Comparisons
DeepSeek R1 vs GPT-4
Can a free, open-weight model really compete with GPT-4? DeepSeek R1 challenges ...
DeepSeek R1 vs Llama 3.3 70B
Two of the best open-weight 70B models compared. DeepSeek R1 brings chain-of-thought...
Qwen 2.5 vs DeepSeek R1
Two Chinese AI labs go head-to-head. Qwen 2.5's balanced capabilities vs DeepSeek...