Phi-4-mini
A 3.8-billion parameter powerhouse that outperforms models twice its size on reasoning and logic benchmarks.
Phi-4-mini delivers high-tier intelligence in a compact footprint by leveraging the same synthetic-data pipeline used to train the 14B Phi-4 model. It excels at multilingual support and complex reasoning (scoring 84.9 on GSM8K) while supporting a 128K-token context window. The model is optimized for edge deployment and latency-sensitive applications where efficiency is non-negotiable, offering an exceptional performance-to-size ratio for developers building local AI agents or mobile-first experiences.