Xelerit: AI Copilot for Robotics
Demonstration of a platform that automates robotics engineering by generating native robot code, translating languages, configuring PLC I/O, and providing simulation and LLM‑based documentation assistance.
I will walk through how our software works; it mirrors the complete workflow of a robotics engineer, making it much faster.
Our MVP includes:
• Robot code generation (in the native robot-brand language)
• Copilot chat (for easy navigation of robot docs)
• Code translation between robot languages
• Automatic I/O configuration from PLC to robot
• Simulation
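As a purely hypothetical illustration of the code-translation feature (the mappings and program below are invented for this sketch, not Xelerit's actual implementation), translating between robot-brand languages can be pictured as mapping instruction mnemonics, here from ABB RAPID to KUKA KRL:

```python
# Hypothetical sketch: mapping motion keywords between robot languages.
# A real translator needs full parsing of syntax, data types, and motion
# parameters; this only swaps instruction mnemonics (ABB RAPID -> KUKA KRL).

RAPID_TO_KRL = {
    "MoveL": "LIN",   # linear motion
    "MoveJ": "PTP",   # joint (point-to-point) motion
}

def translate_line(line: str) -> str:
    """Replace a leading RAPID motion keyword with its KRL counterpart."""
    for rapid_kw, krl_kw in RAPID_TO_KRL.items():
        if line.strip().startswith(rapid_kw):
            return line.replace(rapid_kw, krl_kw, 1)
    return line  # pass through anything we do not recognize

program = [
    "MoveJ home, v1000, z50, tool0;",
    "MoveL target, v500, fine, tool0;",
]
translated = [translate_line(l) for l in program]
print(translated)
```

The hard part a real product must solve is everything this sketch skips: argument conventions, coordinate frames, and brand-specific runtime semantics.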
Xelerit's Agentic ADE automates industrial automation design, testing, and deployment.
- GraphRAG: GraphRAG integrates knowledge graphs with Retrieval-Augmented Generation (RAG) to enable multi-hop reasoning and deliver context-rich, verifiable LLM responses. It moves beyond simple vector-based semantic search: the system constructs a knowledge graph (KG) from unstructured data, extracting entities and relationships (nodes and edges). This structure allows the large language model (LLM) to perform complex, multi-hop reasoning, a task where baseline RAG systems often fail. By leveraging the KG's relational context instead of isolated text chunks, GraphRAG significantly improves answer accuracy, reduces hallucination, and provides clear, traceable provenance for the generated response. It is a critical upgrade for enterprise GenAI applications demanding high-trust, explainable results.
- GPT-4: GPT-4 is OpenAI's large multimodal model: it processes both text and image inputs, delivering human-level performance on complex professional and academic benchmarks. It is OpenAI's latest milestone in scaling deep learning and demonstrates a significant capability leap over its predecessor, scoring in the top 10% on a simulated bar exam (GPT-3.5 scored in the bottom 10%). The model handles nuanced instructions and long-form content, supporting context windows up to 32,768 tokens (32K model), enough to process up to 25,000 words in a single, complex prompt. GPT-4 is engineered for enhanced reliability, steerability, and advanced reasoning across diverse tasks.
- AI agents: Autonomous software systems that leverage LLMs to reason, plan, and execute complex, multi-step goals across external tools and data sources. Unlike simple chatbots, agents are goal-driven entities that handle entire workflows without constant human prompting: they use a core LLM for reasoning, integrate with external tools (APIs, CRMs, web browsers), and maintain memory to adapt and improve over time. Enterprises deploy them for high-value automation: examples include a sales agent generating over 2,000 qualified leads monthly or a research agent analyzing 50 petabytes of clinical data for insights. This technology is about scaling complex decision-making and action, not just conversation.
- PLC: The ruggedized computer that executes sub-millisecond logic to automate everything from Tesla assembly lines to municipal water systems. PLCs are the hardened brains of modern industry: they replace physical relays with digital logic (Ladder Logic or Structured Text) to manage high-speed I/O, handling extreme temperatures and electrical noise while maintaining consistent scan cycles. You will find them in every major facility: a Siemens S7-1500 managing a BMW paint shop or an Allen-Bradley ControlLogix 5580 syncing 100 axes of motion. They ensure 99.9% uptime by processing sensor data (4-20 mA signals or Modbus TCP) to trigger precise physical outputs like motor starters and pneumatic valves.
- Robotics Simulation: The virtual-twin technology that designs, validates, and optimizes complex robotic work cells before a single component deploys on the physical floor. Simulation software creates a high-fidelity digital twin of the entire production system, eliminating costly physical trials. Platforms like DELMIA Robotics (ranked #1 for Offline Robot Programming) or open-source tools like Gazebo validate robot trajectories, cycle times, and collision avoidance with 99% accuracy. This enables true offline programming and virtual commissioning: engineers test complex AI/ML control algorithms and optimize throughput for a multi-robot factory line, reducing deployment errors and cutting commissioning time by up to 40%.
- GPT-3: A 175-billion-parameter autoregressive language model that masters complex tasks through few-shot learning. OpenAI debuted GPT-3 in 2020: a transformer-based engine trained on 570 GB of filtered text. Its 175 billion parameters execute diverse functions (including Python scripting and logical reasoning) using only natural language prompts. This architecture removed the requirement for task-specific fine-tuning, establishing the foundation for modern tools like GitHub Copilot and the initial ChatGPT release.
- Llama 2: Meta AI's openly accessible family of large language models (LLMs), released for free research and commercial use. The collection includes both pre-trained foundation models and instruction-tuned "Chat" variants, scaling from 7 billion (7B) up to 70 billion (70B) parameters. Key technical upgrades over Llama 1 include training on 2 trillion tokens (40% more data) and doubling the context length to 4,096 tokens. The Llama-2-chat models were rigorously aligned using Reinforcement Learning from Human Feedback (RLHF), positioning them as a top-tier, openly available option for developers building advanced generative AI solutions.
- PaLM 2: Google's versatile large language model, optimized for advanced reasoning, multilingual translation, and coding across four distinct scales. PaLM 2 powers 25+ Google products (including Gemini and Workspace) using a Transformer-based architecture trained on a massive corpus spanning 100+ languages. It excels in specialized tasks: solving complex math problems, generating high-quality code, and passing professional-level exams. Developers deploy the model via the PaLM API in four sizes: Gecko, Otter, Bison, and Unicorn. Gecko is lightweight enough to run locally on mobile devices (offline), while Unicorn handles the most complex, data-heavy reasoning tasks at scale.
- BLOOM: A 176-billion-parameter open-access multilingual language model built by the BigScience research collective, the result of a year-long collaboration involving 1,000+ researchers from 70+ countries. It supports 46 natural languages and 13 programming languages, providing a high-performance alternative to proprietary models. The model was trained on the Jean Zay supercomputer in France using the 1.6-terabyte ROOTS dataset (a massive collection of diverse text sources). By providing full access to its weights and training process, BLOOM enables global developers to build and audit AI tools without the restrictions of closed-door APIs.
- BERT: BERT (Bidirectional Encoder Representations from Transformers) is a foundational pre-trained NLP model introduced by Google AI Language in 2018. Built on the Transformer encoder, it distinguishes itself by being deeply bidirectional: it processes the entire sequence of words (left and right context) simultaneously, unlike previous unidirectional models. This capability is achieved through a Masked Language Model (MLM) pre-training objective. Released in sizes BERT-Base (110 million parameters) and BERT-Large (340 million parameters), it dramatically improved the state of the art across 11+ Natural Language Processing tasks, including question answering (SQuAD) and sentiment analysis, establishing a new baseline for the field.
- RoBERTa: RoBERTa (Robustly Optimized BERT Pretraining Approach) is a high-performance language model from Facebook AI (2019) that significantly outperforms BERT by optimizing the pretraining strategy, not the core architecture. A replication study showed BERT was undertrained and could achieve state-of-the-art results with a refined recipe: removing the Next Sentence Prediction (NSP) objective, implementing dynamic masking, and scaling up training dramatically. Specifically, RoBERTa trained for 500K steps (up from 100K) on 160 GB of text data (ten times BERT's data) using much larger batch sizes (up to 8K). This optimized approach yielded superior performance on major benchmarks like GLUE, RACE, and SQuAD, establishing RoBERTa as a benchmark for subsequent language model development.
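The GraphRAG entry above can be sketched in miniature: instead of retrieving isolated text chunks, build a graph of (subject, relation, object) triples and gather multi-hop context around a query entity. The triples and entity names here are invented for illustration.

```python
# Minimal GraphRAG-style sketch (illustrative, not a production system):
# build a tiny knowledge graph from triples, then collect multi-hop
# context for a query entity to inject into an LLM prompt.
from collections import defaultdict

triples = [
    ("Xelerit", "builds", "ADE"),
    ("ADE", "targets", "industrial automation"),
    ("ADE", "uses", "LLM"),
]

graph = defaultdict(list)
for subj, rel, obj in triples:
    graph[subj].append((rel, obj))

def multi_hop_context(entity: str, hops: int = 2) -> list:
    """Collect facts reachable within `hops` edges of `entity`."""
    facts, frontier = [], [entity]
    for _ in range(hops):
        next_frontier = []
        for node in frontier:
            for rel, obj in graph.get(node, []):
                facts.append(f"{node} {rel} {obj}")
                next_frontier.append(obj)
        frontier = next_frontier
    return facts

# These chained facts are what let the LLM answer a two-hop question
# ("what domain does Xelerit's product target?") that chunk retrieval misses.
print(multi_hop_context("Xelerit"))
```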
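The AI agents entry describes a reason-act loop over external tools. A minimal sketch of that control flow, with a stub policy standing in for the LLM and invented tool names (a real agent would call a model to choose each action):

```python
# Hedged sketch of an agent loop: a stub "policy" picks a tool each step
# until the goal is met. Tools and policy rules are invented for
# illustration; in practice the LLM does the reasoning.
def search_tool(state):
    state["facts"] = ["lead A", "lead B"]   # pretend external lookup
    return state

def crm_tool(state):
    state["saved"] = list(state["facts"])   # pretend CRM write
    return state

TOOLS = {"search": search_tool, "crm": crm_tool}

def stub_policy(state):
    """Stand-in for LLM reasoning: decide the next tool, or stop."""
    if "facts" not in state:
        return "search"
    if "saved" not in state:
        return "crm"
    return None  # goal reached

def run_agent(state, max_steps=5):
    # The loop is the agent: observe state, pick an action, execute, repeat.
    for _ in range(max_steps):
        action = stub_policy(state)
        if action is None:
            break
        state = TOOLS[action](state)
    return state

print(run_agent({}))
```

The `state` dict plays the role of the agent's memory; persisting it between runs is what lets agents "adapt and improve over time."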
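The PLC entry mentions consistent scan cycles: read inputs, evaluate logic, write outputs, every few milliseconds. A sketch of one scan of the classic start/stop seal-in rung (pure illustration, not vendor code):

```python
# Illustrative PLC scan-cycle sketch: each cycle reads inputs, evaluates
# ladder-style logic, then writes outputs. This models the standard
# start/stop seal-in rung for a motor contactor.
def scan_cycle(inputs: dict) -> dict:
    # 1. Read inputs (already sampled into `inputs` at scan start)
    start = inputs["start_button"]
    stop = inputs["stop_button"]
    running = inputs["motor_running"]
    # 2. Evaluate logic: (start OR seal-in contact) AND NOT stop
    motor = (start or running) and not stop
    # 3. Write outputs at scan end
    return {"motor_contactor": motor}

# Pressing start latches the motor on; it stays on via the seal-in
# contact until stop is pressed.
print(scan_cycle({"start_button": True, "stop_button": False, "motor_running": False}))
```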
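The GPT-3 entry centers on few-shot learning: the task is specified by in-context examples rather than fine-tuning. A sketch of assembling such a prompt (the Input/Output layout is a common convention, not an official format):

```python
# Few-shot prompt assembly sketch: worked examples condition the model on
# the task; the final incomplete "Output:" is what the model completes.
def few_shot_prompt(examples, query):
    lines = []
    for inp, out in examples:
        lines.append(f"Input: {inp}\nOutput: {out}")
    lines.append(f"Input: {query}\nOutput:")
    return "\n\n".join(lines)

prompt = few_shot_prompt(
    [("2 + 2", "4"), ("3 + 5", "8")],
    "7 + 1",
)
print(prompt)
```

No gradient update happens: the two examples alone define the pattern the model continues.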
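The BERT and RoBERTa entries hinge on the Masked Language Model objective and RoBERTa's dynamic masking. A sketch of both (the ~15% rate matches the BERT paper; the sentence and seeding scheme are invented for illustration):

```python
# MLM masking sketch: ~15% of tokens become [MASK] and the model must
# predict the originals. BERT fixed the mask once per example (static);
# RoBERTa re-samples it each epoch (dynamic).
import random

def mask_tokens(tokens, mask_rate=0.15, seed=None):
    rng = random.Random(seed)
    masked, targets = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            masked.append("[MASK]")
            targets.append(tok)    # prediction target
        else:
            masked.append(tok)
            targets.append(None)   # not scored
    return masked, targets

sentence = "the robot welds the car body on the line".split()

# Static masking (BERT): the same seed, hence the same mask, every epoch.
print(mask_tokens(sentence, seed=0)[0])

# Dynamic masking (RoBERTa): a fresh mask per epoch.
for epoch in range(2):
    print(mask_tokens(sentence, seed=epoch)[0])
```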
Related projects
1nterface ai
Zürich
The talk explores an AI-driven context collection tool designed for laptops, demonstrating how it understands user behavior to…
Automating building, deployment and maintenance of AI with AI agents
London
Learn how AI agents automate model selection, prompt design, and deployment, letting developers add AI features via simple…
camelAI: Talk to your tools and take action.
Austin
This talk demonstrates how camelAI connects apps through AI chat to simplify complex tasks using APIs and user-directed…
AI Computer
Berlin
Learn how to build a desktop PC with an RTX 3090 for local AI workloads, covering hardware assembly, software…
Trustworthy and Secure Agents
Zürich
This talk demonstrates a prototype system enforcing strict security policies on LLM agents, enabling precise control over their…
AI Agents working together on a Machine Learning Task
Milan
Watch three AI agents collaborate: a data engineer cleans raw data, a tech lead orchestrates, and a data…