Technology
GPT-3
A 175-billion-parameter autoregressive language model that masters complex tasks through few-shot learning.
OpenAI debuted GPT-3 in 2020: a transformer-based model trained on 570 GB of filtered text. With 175 billion parameters, it performs diverse tasks, from Python scripting to logical reasoning, using only natural-language prompts. This architecture removed the requirement for task-specific fine-tuning, establishing the foundation for modern tools like GitHub Copilot and the initial ChatGPT release.
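Few-shot prompting, the pattern described above, amounts to prepending a handful of worked examples to the new input so the model infers the task from context alone. A minimal sketch (the example pairs follow the English-to-French demo from the GPT-3 paper; the commented-out API call is a hypothetical illustration, not a specific client):

```python
def build_few_shot_prompt(examples, query):
    """Build a few-shot prompt: worked (input, output) pairs, then the new query.

    GPT-3-style models complete the final "Output:" line by inferring the
    task from the examples, with no task-specific fine-tuning.
    """
    blocks = [f"Input: {x}\nOutput: {y}" for x, y in examples]
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

# English-to-French pairs, as in the GPT-3 paper's few-shot demo.
examples = [
    ("sea otter", "loutre de mer"),
    ("peppermint", "menthe poivrée"),
]
prompt = build_few_shot_prompt(examples, "cheese")
print(prompt)

# The prompt string would then be sent to a completions endpoint, e.g.
# (hypothetical client object and model name):
# completion = client.completions.create(model="davinci-002", prompt=prompt)
```

The trailing bare `Output:` is the key detail: the model's continuation of that line is the answer.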
368 projects · 77 cities
Recent Talks & Demos
Showing 21-40 of 368
Public Speaking AI Agent
Tiruchirappalli
Jan 31
React
TypeScript
Kuralit: Intent-Driven Mobile Interface
Tiruchirappalli
Jan 31
Kuralit
iOS
OpenCode Local Models on DGX
Seattle
Jan 30
OpenCode
NVIDIA DGX Spark
LLM Failover Chains and Redis
Nashville
Jan 29
GPT-4o
Claude
Nihongo Convo: AI Conversation Practice
Nashville
Jan 29
gpt-4o-mini
gpt-4o-mini-tts
Multi-Model Imposter Game
Nashville
Jan 29
Python
FastAPI
Mori Solution: Construction RAG Pipeline
Nashville
Jan 29
GPT-4
React
JobsYo: Multi-Model Job Search AI
Toronto
Jan 29
GPT-5
Gemini 3
Human-ish: LinkedIn AI Detector
Toronto
Jan 29
GPTZero
GPT-5
Consistent Pictogram Generation for AAC
Valencia
Jan 29
Flutter
Replicate
Zensei: Interpretable Market Regimes
Valencia
Jan 29
Claude
Codex
NetShow IQ1: AI Business Factory
San Diego
Jan 22
Anthropic Claude
Google Gemini
AI Tinkerers 2026 Predictions
Montreal
Jan 21
Generative AI
TensorFlow
Finetuning with Claude Synthetic Data
Cologne
Jan 21
Claude
GPT-5
Synthesizing Contextual Agentic Assistants
Manchester, NH
Jan 20
Vercel AI SDK
ChatGPT
JP-TL-Bench: AI Paper Writing
Tokyo
Jan 15
Claude Code
Opus 4
NetShow IQ1: English Business Factory
Orange County
Jan 14
GPT
Anthropic Claude
Agent Runner
Seattle
Jan 12
GPT-4
Claude 3 Opus
Forge: Multi-Agent Code Fixes
Seattle
Jan 12
Claude Opus
Python
Apple AI in Shortcuts
Seattle
Dec 18
iOS Shortcuts
iOS 26