Phi-4

Microsoft

Microsoft's small language model that achieves remarkable performance through synthetic data training. Excels at reasoning and STEM tasks despite its compact size.

Parameters 14B
Min VRAM 10 GB
Recommended VRAM 16 GB
Context Length 16K
License MIT

🚀 Get Started

Run Phi-4 locally with one command:

ollama run phi4

Requires Ollama installed.
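Beyond the CLI, Ollama also serves a local REST API (by default on port 11434) that you can call once the model is pulled. A minimal Python sketch, assuming the standard `/api/generate` endpoint and a locally running Ollama instance (the `build_payload`/`generate` helper names are illustrative, not part of Ollama):

```python
import json
import urllib.request

def build_payload(prompt: str, model: str = "phi4") -> dict:
    # stream=False returns the full completion in a single JSON response
    return {"model": model, "prompt": prompt, "stream": False}

def generate(prompt: str, host: str = "http://localhost:11434") -> str:
    # POST to Ollama's generate endpoint and return the response text.
    # Requires `ollama serve` running and `ollama pull phi4` done first.
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(build_payload(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Call it as `generate("Why is the sky blue?")` once the server is up; any HTTP client works the same way.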

📊 Benchmarks

Benchmark   Score
MMLU        84.8
GSM8K       95.3
HumanEval   82.6

💻 Hardware Recommendations

🟢 Minimum

10 GB VRAM GPU or 20+ GB RAM (CPU mode)

Expect slower generation in CPU mode

🔵 Recommended

16 GB VRAM GPU

Fast generation with room for context
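The VRAM figures above follow from simple arithmetic on the parameter count. A sketch of the weight-only estimate (the `weight_gb` helper is illustrative; real usage adds a few GB for the KV cache and runtime overhead):

```python
def weight_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight-only memory in decimal GB.

    Ignores KV cache, activations, and runtime overhead, which add
    extra memory depending on context length and batch size.
    """
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# For 14B parameters at common quantization levels:
#  4-bit  -> 7 GB   (fits the 10 GB minimum with room for overhead)
#  8-bit  -> 14 GB  (needs the 16 GB recommended tier)
#  16-bit -> 28 GB  (full precision; beyond most single consumer GPUs)
```

This is why 4-bit quantization (the typical default for local runners) is what makes the 10 GB minimum workable.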

Best For

reasoning · math · coding · STEM
