Self Attention
An interactive demo uses ASCII art to illustrate how self-attention links tokens, giving attendees a view into the underlying mechanics of language models.
A demo from an exhibition aimed at bringing the concept of self-attention to spectators. ASCII art generated with a text language model shows the contextual and semantic closeness of individual tokens.
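The closeness the demo visualizes comes from the attention weights themselves: each token attends to every other token in proportion to their similarity. As a rough illustration (not the demo's actual code), a minimal single-head self-attention over raw embeddings can be sketched like this; real transformers additionally use learned query, key, and value projections:

```python
import numpy as np

def self_attention(x):
    """Minimal self-attention sketch over token embeddings x of shape
    (n_tokens, d). Q = K = V = x here for illustration only."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)  # pairwise token similarity, scaled
    # softmax over each row: how much each token attends to every other
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # each output token is a weighted mix of all input tokens
    return weights @ x

# tiny example: 3 "tokens" with 4-dimensional embeddings
x = np.random.default_rng(0).normal(size=(3, 4))
out = self_attention(x)
print(out.shape)  # (3, 4)
```

The row-normalized `weights` matrix is exactly the kind of token-to-token closeness a visualization like the ASCII-art demo can render.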
Related projects
Attention in 4 lines
Berlin
This talk explains the core mechanics of attention in transformer models using just four lines of code, clarifying…
Creative AI Off-Roading: Unconventional Workflows for Imaginative Collaboration
Prague
Explore unconventional AI workflows that prioritize creativity and experimentation, demonstrating how blended AI tools can generate imaginative art…
AI for Analyzing Trends
Prague
Learn how to create an AI-driven blog that extracts Google and Amazon trend data, uses Python and AWS,…
Law and AI
Prague
Examining legal AI challenges, focusing on Retrieval‑Augmented Generation's role in reducing hallucinations, and practical methods for building reliable,…
Promptbook
Prague
Learn how Promptbook abstracts large language models to automate creation of diverse content, covering its architecture, practical examples,…
The Personal AI Assistant: When Cursor Meets Claude Code for Life Management
Waterloo
Learn how to build a personal AI assistant with Cursor and Claude Code that tracks goals, automates routines,…