Technology
GPT-3
A 175-billion parameter autoregressive language model that masters complex tasks through few-shot learning.
OpenAI debuted GPT-3 in 2020: a 175-billion-parameter, transformer-based autoregressive model trained on roughly 570GB of filtered text. It performs diverse tasks (including Python scripting and logical reasoning) from natural language prompts alone, inferring each task from a handful of in-prompt examples. This removed the need for task-specific fine-tuning and laid the foundation for tools like GitHub Copilot and the initial ChatGPT release.
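The few-shot pattern above can be sketched in code. The snippet below only assembles the prompt string (demonstrations plus a new query) that would be sent to the model; the translation task, the `build_few_shot_prompt` helper, and its format are illustrative assumptions, not an official API — actual calls to a hosted model are omitted since client details vary.

```python
# Few-shot prompting: the model infers the task from examples embedded in
# the prompt itself, with no fine-tuning. This sketch only builds the
# prompt; names and the English->French task are hypothetical.

def build_few_shot_prompt(examples, query,
                          instruction="Translate English to French."):
    """Assemble demonstration pairs plus the new query into one prompt."""
    lines = [instruction, ""]
    for source, target in examples:
        lines.append(f"English: {source}")
        lines.append(f"French: {target}")
        lines.append("")  # blank line between demonstrations
    lines.append(f"English: {query}")
    lines.append("French:")  # the model completes the text from here
    return "\n".join(lines)

examples = [("cheese", "fromage"), ("sea otter", "loutre de mer")]
prompt = build_few_shot_prompt(examples, "peppermint")
print(prompt)
```

Because the task is specified entirely inside the prompt, swapping the demonstrations is enough to repurpose the same model for a different task — the property that made task-specific fine-tuning optional.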
368 projects
·
77 cities
Recent Talks & Demos
Showing 41-60 of 368
PRESENT: Voice Steward Architecture
Seattle
Dec 18
Next
LiveKit
VibeVoice Realtime: Mac Metal TTS
Seattle
Dec 18
GPT-4
LangChain
ChatGPT App Gotchas
Seattle
Dec 18
GPT-4
Claude-3
AI: Organizational Context Translation
Atlanta
Dec 16
GPT-4
Prompting
Chisme: AI Quiz Generator
Waterloo
Dec 15
GPT-4o
Python
Muse: Playful Voice Coding
Sydney
Dec 11
Next
TypeScript
Inklu-Connect JobSync
Bremen
Dec 10
OpenRouter API
GPT-5 Nano
Conversation Games: Designing for AI
Chicago
Dec 9
Hume AI
GPT-4
Agentic ChatGPT Apps: MCP UI
Paris
Dec 9
Vite
TypeScript
Vibe Scaffold: Specs for AI Agents
Seattle
Dec 8
OpenAI API
GPT-4o
AceRocket: AI Learning Navigator
Eastside Entrepreneurs
Dec 4
TensorFlow
Machine Learning
FacultyFinder.io: AI CV Matching
Toronto
Dec 3
OpenAI GPT API
Claude
Blob Oracle: Slime Mold LLM
Lausanne
Dec 3
GPT-OSS
Raspberry Pi
OpenAI Dashboard: Data Insights
Boston
Dec 2
GPT-4
LangChain
Temporal DeepRAG Conversational Workflows
Bengaluru
Nov 29
Temporal
Python
VeloBlanco: Automating Media Verification
Bogotá
Nov 27
GPT-4
Claude
Deepgram OpenAI ElevenLabs Production
Bogotá
Nov 27
GPT-4
Deepgram
DungeonMind: D&D AI Tools
Denver
Nov 24
GPT-5 Ad Campaign Simulator
Boston
Nov 17
GPT-4
LangChain
The Chefz: Agentic AI
Amman
Nov 15
GPT-4o
Gemini