Technology
GPT-3
A 175-billion parameter autoregressive language model that masters complex tasks through few-shot learning.
OpenAI debuted GPT-3 in 2020: a transformer-based language model trained on roughly 570 GB of filtered text. Its 175 billion parameters let it perform diverse tasks (including Python scripting and logical reasoning) from natural language prompts alone. This removed the need for task-specific fine-tuning and laid the foundation for tools like GitHub Copilot and the initial ChatGPT release.
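The few-shot approach means a task is specified entirely inside the prompt, as a handful of worked examples followed by a new query. A minimal sketch of how such a prompt might be assembled (the function name and formatting are illustrative assumptions, not an OpenAI API):

```python
def build_few_shot_prompt(examples, query):
    """Format (input, output) example pairs plus a new query into one prompt.

    The model is expected to continue the pattern and complete the final
    "Output:" line -- no weight updates or fine-tuning involved.
    """
    blocks = [f"Input: {inp}\nOutput: {out}" for inp, out in examples]
    blocks.append(f"Input: {query}\nOutput:")
    return "\n\n".join(blocks)

# Hypothetical English-to-French translation task, defined purely by examples.
examples = [("cheese", "fromage"), ("house", "maison")]
prompt = build_few_shot_prompt(examples, "book")
print(prompt)
```

The same function works for any text-to-text task; only the example pairs change, which is what made a single pretrained model so broadly reusable.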