DeepSeek R1 70B

DeepSeek

The reasoning capabilities of the full DeepSeek R1 distilled into a 70B-parameter model. State-of-the-art open reasoning that competes with GPT-4 and Claude on math and coding benchmarks.

Parameters 70B
Min VRAM 40 GB
Recommended VRAM 48 GB
Context Length 128K
License MIT

🚀 Get Started

Run DeepSeek R1 70B locally with one command:

ollama run deepseek-r1:70b

Requires Ollama to be installed.
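For scripted use, the model can also be called through Ollama's local HTTP API once it has been pulled. Below is a minimal Python sketch, assuming Ollama is serving on its default port 11434; the prompt is just an illustrative placeholder.

import json
import urllib.request

# Assumes `ollama pull deepseek-r1:70b` has completed and the Ollama
# server is running on its default port (an assumption, not verified here).
payload = {
    "model": "deepseek-r1:70b",
    "prompt": "Prove that the sum of two even integers is even.",
    "stream": False,  # return one JSON object instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    body = json.loads(resp.read())

# R1-style models emit their reasoning in <think> tags before the final answer.
print(body["response"])

The same endpoint works for any model Ollama has pulled; only the "model" field needs to change.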

📊 Benchmarks

Benchmark    Score
MMLU         79.8
GSM8K        94.3
HumanEval    85.9

💻 Hardware Recommendations

🟢 Minimum

40 GB VRAM GPU or 80+ GB RAM (CPU mode)

Expect slower generation in CPU mode

🔵 Recommended

48 GB VRAM GPU

Fast generation with room for context
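As a back-of-envelope check on these figures, the sketch below estimates the memory needed just for the weights of a 4-bit-quantized 70B model. The bits-per-weight and overhead values are assumptions, not measured numbers, and real usage also grows with context length via the KV cache.

# Rough weight-memory estimate for a quantized 70B model.
# 4.5 bits/weight (typical of Q4_K_M-style quantization) and the 10%
# overhead factor are assumptions for illustration only.
def estimate_vram_gb(params_billions: float,
                     bits_per_weight: float = 4.5,
                     overhead: float = 1.1) -> float:
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 1e9

print(f"~{estimate_vram_gb(70):.0f} GB")  # roughly 43 GB, in line with the 40-48 GB guidance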

💬 Community Sentiment

🟢 Positive (80%)

Based on 4 recent discussions, the community appears generally positive about DeepSeek R1 70B.

Sample sources: 4 𝕏 posts · Updated: 2/7/2026

Best For

reasoning · math · coding · analysis
