Technology
Phi-3
Microsoft's family of small language models (SLMs) delivering strong reasoning performance on local devices and edge hardware.
Phi-3-mini packs 3.8 billion parameters into a footprint small enough to run locally on an iPhone 14. Trained on 3.3 trillion tokens of high-quality synthetic data and filtered web content, it rivals far larger models such as Mixtral 8x7B on coding and logic benchmarks. The family also includes Phi-3-small (7B) and Phi-3-medium (14B) variants, giving developers low-latency options for complex tasks without the heavy compute requirements of traditional LLMs.
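As a minimal sketch of what local deployment looks like, the snippet below formats a single-turn prompt with Phi-3's chat-template tokens and generates a completion via Hugging Face transformers. The model ID and generation settings are illustrative assumptions, not details from this page.

```python
def build_phi3_prompt(user_message: str) -> str:
    """Format a single-turn prompt using Phi-3's chat-template tokens."""
    return f"<|user|>\n{user_message}<|end|>\n<|assistant|>\n"


def run_local(prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion with Phi-3-mini on local hardware.

    Requires `pip install transformers torch`; the first call downloads
    several GB of weights. Model ID is an assumption for illustration.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "microsoft/Phi-3-mini-4k-instruct"  # assumed model ID
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, skipping the prompt.
    new_tokens = output[0][inputs["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True)


# Usage (downloads weights on first run):
#   run_local(build_phi3_prompt("Write a haiku about edge AI."))
```

The prompt helper is pure string formatting, so it can be reused with any local runtime (MLX, llama.cpp, ONNX Runtime) that accepts raw text.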
14 projects · 15 cities
Recent Talks & Demos
Eric Chat: Local Mac AI · Ottawa · Apr 25 · Eric Transformer, MLX-LM
Cryptographic Ledger for Agent Context · New York City · Mar 18 · MCP, Claude API
Notebook LM: AI Slide Decks · Fort Wayne · Feb 21 · Microsoft Word, Grok3
Miruvor: Neuromorphic AI Memory · Vienna · Feb 19 · TrueNorth, Edge AI
Claude: Slack Frontend Agent · Tokyo · Feb 19 · Claude Code, Claude API
FHE-Studio: Encrypted AI Inference · Toronto · Jan 29 · Intel SGX, FHE Studio
Dlab-852-Mini: Hong Kong Cultural AI · Hong Kong · Dec 18 · Python, datasets
Empathetic Development: AI Personas Validate · Seattle · Dec 8 · Gemini, React
Number Theory: AI, Crypto, Optimization · Boston · Dec 2 · Python, Apache Kafka
Scalable Production RAG Architecture · Toronto · Nov 10 · FAISS, OpenAI API
Graphiti: RAG and Memory Combined · Hamburg · Aug 14 · GPT-4, LangChain
Phi-4 + FastViT-HD VLM · Seattle · Jun 27 · Phi-4-mini, PyTorch
LLMs Automate Phishing Lure Generation · Dublin · Feb 24 · Llama 3, Snowflake
AutoML Agent: Code Generation · Toronto · Nov 28 · Phi-3, AWS Lambda