Technology
GPT-3
A 175-billion parameter autoregressive language model that masters complex tasks through few-shot learning.
OpenAI released GPT-3 in 2020: a transformer-based language model trained on roughly 570 GB of filtered text. Its 175 billion parameters let it perform diverse tasks (including Python code generation and logical reasoning) from natural-language prompts alone. This removed the need for task-specific fine-tuning, laying the foundation for modern tools like GitHub Copilot and the initial ChatGPT release.
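The few-shot approach described above can be sketched in plain Python: instead of fine-tuning, a handful of input/output examples are embedded directly in the prompt text, and the model infers the task from them. The task, labels, and helper name here are illustrative assumptions, not part of any official API.

```python
# Minimal sketch of few-shot prompting (illustrative example).
# The model never sees task-specific weights; the examples in the
# prompt alone define the task (here: sentiment classification).

def build_few_shot_prompt(examples, query):
    """Assemble labeled example pairs plus a new query into one prompt.

    The trailing "Sentiment:" leaves a slot for the model to complete.
    """
    lines = []
    for text, label in examples:
        lines.append(f"Review: {text}\nSentiment: {label}\n")
    lines.append(f"Review: {query}\nSentiment:")
    return "\n".join(lines)

examples = [
    ("I loved this film.", "positive"),
    ("Terrible plot, wooden acting.", "negative"),
]
prompt = build_few_shot_prompt(examples, "A delightful surprise.")
print(prompt)
```

In practice this prompt string would be sent to a text-completion endpoint, and the model's continuation (e.g. "positive") is taken as the answer.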
391 projects
·
79 cities
Recent Talks & Demos
Showing 141-164 of 391
Qwen: Bending LLMs into Tools
Hong Kong
Jul 31
Python
Replit
PlexifyAEC: Construction Digital Twins
New York City
Jul 29
ChatGPT
Replit
Self-Improving Agents Fix Mistakes
Munich
Jul 25
Timee: Automating Real-World Chores
Seattle
Jul 24
Amigo Speaks
Seattle
Jul 24
LuchaCoach: Local AI Meeting Coach
Austin
Jul 10
Survaize: Vision Models Generate Apps
DC
Jul 10
Lovable: AI Tech Implementation Agent
Nürnberg
Jul 3
AI Game Master: Bootstrapped Success
Seattle
Jun 27
n8n: AI Content Automation
São Paulo
Jun 26
Mining opportunities
Santiago
Jun 26
GPT-4
Claude-3
Unpitched: Zero to One AI
Poland
Jun 26
Quesma Charts: AI Chart Generation
Poland
Jun 26
GPT-4
OpenRouter
slimbot: AI Digital Health Twin
Poland
Jun 26
GPT-4
OpenAI API
Vox Machina
San Francisco
Jun 25
AI Agents for Voice and Data
Manizales
Jun 25
OpenAI
ChatGPT
Patch Party: Live Agent Fixing
London
Jun 25
ChatGPT: Chicago City RAG
Chicago
Jun 24
OpenAI GPT-4
LangChain
AI Film Pipeline with Custom GPTs
Toronto
Jun 18
ChatGPT-4 API
Midjourney
GESTURE: Real-Time ASL Translation
Lausanne
Jun 16
MCP: Standardizing AI Connections
Cincinnati
Jun 12
Backtesting AI Portfolios
Milan
Jun 10
WebRTC Voice AI Agents
Milan
Jun 10
Manifest AI: Scheduling with LangChain
St Louis
Jun 5