Bodhi App: Local LLMs
Learn how Bodhi App runs open‑source LLMs locally, offering privacy and cost savings without requiring API, CLI, or development expertise.
Bodhi App lets you run LLMs locally, saving costs and keeping your data completely private.
Whereas Ollama offers similar capabilities, Bodhi App targets a wider audience: it does not assume familiarity with APIs, CLIs, or frontend/backend development, and it unlocks the power of open-source LLMs for everyone with a device.
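As a rough illustration of what talking to a locally served model looks like, here is a minimal sketch that assumes the local server exposes an OpenAI-compatible chat-completions endpoint; the base URL, port, and model id are placeholders, not confirmed Bodhi App defaults -- check your own install for the actual values.

```python
import json
import urllib.request

# Hypothetical values -- adjust to match your local server's
# actual address and the model you have pulled.
BASE_URL = "http://localhost:1135/v1"
MODEL = "llama3:instruct"


def build_chat_request(prompt: str, model: str = MODEL) -> dict:
    """Build an OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }


def chat(prompt: str) -> str:
    """POST the payload to the local endpoint and return the reply text."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(chat("Summarize the benefits of running LLMs locally."))
```

Because the endpoint follows the OpenAI wire format, the same request works unchanged against other OpenAI-compatible local servers; only the base URL and model id change.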
Related projects
Running llama3 locally without a GPU
Dubai
This talk demonstrates running Llama3 locally on an NPU laptop without a GPU, explores its limitations and opportunities,…
LLM Gateways, a Software Engineering Perspective
Amman
Explore LLM Gateways like LiteLLM, examining engineering concepts like load balancing, moderation, and abstraction for production inference deployments.
Run Local, open source AI
Singapore
Learn how to run open-source models like Llama3, Mistral, and Gemma locally using Jan.ai and Cortex.so, with practical…
Alert creation and debugging using AI
Bengaluru
Learn to convert English alert descriptions into PromQL queries and use statistical analysis with LLMs to debug alerts,…
Beyond One-Size-Fits-All: Building Intelligent LLM Selection Systems
Sydney
Learn how to deploy an OpenAI‑compatible LLM router that classifies prompts, selects the appropriate model, and optimizes cost,…
Amharic Llama and Llava (open source multimodal llm for low resource language)
New York City
Presentation covers an open‑source multimodal LLM for Amharic, its architecture, training pipeline, and a data‑augmentation technique to overcome…