DeepSeek R1 7B

DeepSeek

A compact reasoning model distilled from DeepSeek-R1, offering strong chain-of-thought reasoning in a small package. Well suited to local deployment on limited hardware.

Parameters: 7B
Min VRAM: 6 GB
Recommended VRAM: 8 GB
Context Length: 128K
License: MIT

🚀 Get Started

Run DeepSeek R1 7B locally with one command:

ollama run deepseek-r1:7b

Requires Ollama to be installed.
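
Once the model is pulled, you can also call it programmatically. Below is a minimal sketch using the optional ollama Python client (pip install ollama) against a locally running Ollama server on its default port; the prompt is just an illustrative example. R1-style models typically wrap their chain of thought in <think>...</think> tags, which you may want to strip if you only need the final answer.

import re
import ollama

# Assumes `ollama serve` is running locally and deepseek-r1:7b is pulled.
response = ollama.chat(
    model="deepseek-r1:7b",
    messages=[{"role": "user", "content": "What is 17 * 24?"}],  # example prompt
)

full_text = response["message"]["content"]

# R1-style models usually emit their reasoning inside <think>...</think>;
# remove that block to keep only the final answer.
answer = re.sub(r"<think>.*?</think>", "", full_text, flags=re.DOTALL).strip()
print(answer)

For interactive use, the ollama run command above is simpler; the client is useful when wiring the model into scripts or applications.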

📊 Benchmarks

Benchmark   Score
MMLU        58.7
GSM8K       74.5
HumanEval   62.2

💻 Hardware Recommendations

🟢 Minimum

GPU with 6 GB VRAM, or 12+ GB system RAM (CPU mode)

Expect noticeably slower generation in CPU mode

🔵 Recommended

GPU with 8 GB VRAM

Fast generation with headroom for longer contexts
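
Before pulling the model, you can check which tier your GPU falls into. This is a rough sketch assuming an NVIDIA GPU with the driver's nvidia-smi tool on PATH; other vendors need a different query.

import subprocess

# Query total VRAM (in MiB) via nvidia-smi; raises if no NVIDIA GPU
# or driver tools are present.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout

vram_gb = int(out.splitlines()[0]) / 1024
if vram_gb >= 8:
    print(f"{vram_gb:.1f} GB VRAM: recommended tier")
elif vram_gb >= 6:
    print(f"{vram_gb:.1f} GB VRAM: meets the minimum")
else:
    print(f"{vram_gb:.1f} GB VRAM: consider CPU mode (12+ GB RAM)")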

Best For

reasoning · math · coding
