Technology
LM Studio
LM Studio is a cross-platform desktop application for downloading, running, and serving local LLMs (Llama, Gemma, Qwen) on macOS, Windows, and Linux.
LM Studio brings local LLMs directly to your desktop. Use the built-in catalog to discover and download models such as Llama 3.1 or Gemma 2 in formats like GGUF and MLX. Run them offline through a simple chat interface, or expose them over an OpenAI-compatible REST API for seamless integration with your development projects. This is local, private AI: full control, zero cloud dependency.
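The OpenAI-compatible REST API mentioned above can be called with plain HTTP. A minimal sketch, assuming LM Studio's local server is running at its default address (localhost:1234) and that a model such as "llama-3.1-8b" is loaded; both the port and the model name are assumptions to adjust for your setup:

```python
# Minimal sketch: calling LM Studio's OpenAI-compatible REST API
# with only the Python standard library.
import json
import urllib.request

# Assumption: LM Studio's local server default address; change if needed.
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style /chat/completions POST request for the local server."""
    body = json.dumps({
        "model": model,  # hypothetical model name; use whatever you have loaded
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

if __name__ == "__main__":
    req = build_chat_request("llama-3.1-8b", "Say hello in one sentence.")
    # Requires a running LM Studio server; the response follows the
    # OpenAI chat-completions shape (choices[0].message.content).
    with urllib.request.urlopen(req) as resp:
        reply = json.load(resp)
        print(reply["choices"][0]["message"]["content"])
```

Because the endpoint mirrors the OpenAI schema, official OpenAI client libraries also work by pointing their base URL at the local server, which is what makes drop-in integration with existing projects possible.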
10 projects
·
8 cities
Related technologies
Recent Talks & Demos
OpenClaw: Local LLMs in Home Assistant
Hong Kong
Apr 29
Home Assistant
OpenClaw
LangChain: Local LLM Orchestration
Manchester, NH
Jan 20
LangChain
Prompt Flow
Quiet Local AI Inferencing
Hong Kong
Jan 20
llama
vLLM
Gaming PC to OpenAI Server
New York City
Dec 9
Ollama
LM Studio
NeuroGraph
Paris
Dec 9
llama
LM Studio
AnythingLLM: LLMs, RAG, Agents
Manizales
Oct 29
AnythingLLM
Ollama
Pydantic Agents: Cooperate or Compete
Nashville
Oct 28
Python
Pydantic AI
Obsidian: Private AI Journaling
Seattle
Jul 24
TypeScript
LM Studio
Obsidian Local LLM Plugin
Seattle
Jun 27
Obsidian
LM Studio
Local LLMs for Privacy Products
Seattle
Feb 21
LM Studio
Python