Llama 3.3 70B
Meta's latest instruction-tuned model, with strong multilingual support and tool use. One of the most capable open-weight models available for general-purpose tasks.
Parameters 70B
Min VRAM 40 GB
Recommended VRAM 48 GB
Context Length 128K
License Llama 3.3 Community
Get Started
Run Llama 3.3 70B locally with one command:
ollama run llama3.3:70b
Requires Ollama to be installed.
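Beyond the CLI, a locally running Ollama server also exposes a REST API on port 11434. A minimal sketch of calling it from Python with only the standard library (the prompt text is illustrative; assumes the server is running and the model has been pulled):

```python
import json
import urllib.request

# Request body for Ollama's /api/generate endpoint.
# "stream": False asks for the full response as a single JSON object.
payload = {
    "model": "llama3.3:70b",
    "prompt": "Explain tool use in LLMs in one sentence.",  # example prompt
    "stream": False,
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# Uncomment once an Ollama server is running locally:
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The same endpoint serves any pulled model; only the `model` field changes.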
Benchmarks
Benchmark    Score
MMLU 86.0
GSM8K 91.1
HumanEval 88.4
Hardware Recommendations
Minimum
40 GB VRAM GPU or 80+ GB RAM (CPU mode)
Expect slower generation in CPU mode
Recommended
48 GB VRAM GPU
Fast generation with room for context
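The VRAM figures above are consistent with running the 70B weights under roughly 4-bit quantization (Ollama's default quantized builds). A back-of-the-envelope estimate, noting that real quantization formats use slightly more than 4 bits per weight and that the KV cache adds on top:

```python
params = 70e9           # 70B parameters
bytes_per_param = 0.5   # ~4-bit quantization (assumption; e.g. Q4-class GGUF)

# Weights alone, before KV cache and runtime overhead.
weights_gb = params * bytes_per_param / 1e9
print(f"weights: {weights_gb:.0f} GB")  # ~35 GB; overhead pushes the practical minimum to ~40 GB
```

This is why the minimum sits at 40 GB rather than the ~140 GB the same model would need in 16-bit precision.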
Community Sentiment
Positive (90%)
Based on 10 recent discussions, the community appears generally positive about Llama 3.3 70B.