Real-time Jazz Accompaniment
This talk shows a real-time system that creates rhythmically and harmonically coherent jazz accompaniments from live MIDI input, using edge NLP models and metadata analysis.
This work explores generating symbolic musical accompaniment from a lead melody and metadata, aiming to produce a rhythmically and harmonically coherent accompaniment track. The goal is a real-time system that responds to a musician’s playing style and improvisation: by analyzing live instrumental input, it generates accompaniment that aligns with the performer’s tempo, harmony, and rhythm.
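One ingredient of tempo-aligned accompaniment is estimating the beat from live input. As a minimal sketch (not the talk's actual method), tempo can be inferred from MIDI note-on timestamps via the median inter-onset interval; all names and thresholds below are illustrative assumptions:

```python
# Hypothetical sketch: estimate tempo (BPM) from live MIDI note-on times
# using the median inter-onset interval (IOI). Thresholds are illustrative.
from statistics import median

def estimate_bpm(onsets_s, min_ioi=0.08):
    """Estimate tempo in BPM from note-on timestamps in seconds.

    Assumes most inter-onset intervals land on the beat or simple
    subdivisions; very short intervals (e.g. grace notes) are ignored.
    """
    iois = [b - a for a, b in zip(onsets_s, onsets_s[1:]) if b - a >= min_ioi]
    if not iois:
        return None
    beat = median(iois)  # median is robust to occasional outliers
    bpm = 60.0 / beat
    # Fold octave errors into a musically plausible range (60-180 BPM)
    while bpm < 60:
        bpm *= 2
    while bpm > 180:
        bpm /= 2
    return round(bpm, 1)

# Quarter notes at 120 BPM arrive every 0.5 s
print(estimate_bpm([0.0, 0.5, 1.0, 1.5, 2.0]))  # → 120.0
```

A real system would refine this continuously and combine it with harmonic analysis of the incoming notes; this fragment only shows the tempo-tracking idea.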
Related projects
Music Box
Lausanne
We’ll demonstrate a real‑time, on‑device system that analyzes live instrument input and creates low‑latency adaptive accompaniments matching tempo,…
A Realtime Vocal Assistant on an ESP32
Paris
Learn how to build a real‑time voice assistant using an ESP32 microcontroller, local Node.js server, and LangChain for…
Space LLM
Paris
Learn how to generate architectural floor plans using fine‑tuned large language models, bridging text‑based AI with pixel‑based generative…
Harmonising AI and Music: Playa Music's Innovative Playlist Generation and Discovery
Paris
This talk demonstrates two AI models for music discovery and playlist generation, detailing their integration with multiple language…
Claudio AI Musician
Milan
This talk demonstrates AI agents collaborating with users to control an FM synthesizer, creating a dataset for training…
🎹🎻🎸 Searching for similar music tracks 🎼🎶
Paris
Learn how to generate audio embeddings and use vector search to retrieve similar music tracks, including matching hummed…