
Technology

GPT-3

A 175-billion-parameter autoregressive language model that performs complex tasks through few-shot learning.

OpenAI released GPT-3 in 2020: a transformer-based autoregressive model trained on roughly 570 GB of filtered text. With 175 billion parameters, it performs diverse tasks, from writing Python code to logical reasoning, given only natural-language prompts containing a handful of examples. By removing the need for task-specific fine-tuning, it laid the foundation for tools like GitHub Copilot and the initial ChatGPT release.
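Few-shot learning means demonstrating a task with a few input/output pairs inside the prompt itself, then letting the model complete the pattern. A minimal sketch of how such a prompt is assembled (the helper function is illustrative, not part of any OpenAI API; the translation pairs echo the example used in the GPT-3 paper):

```python
# Sketch of few-shot prompting: the task is shown via a handful of
# demonstrations in the prompt, with no fine-tuning of model weights.
# build_few_shot_prompt is a hypothetical helper for illustration.

def build_few_shot_prompt(examples, query):
    """Concatenate input/output demonstrations, then the new query."""
    lines = [f"Input: {inp}\nOutput: {out}" for inp, out in examples]
    lines.append(f"Input: {query}\nOutput:")
    return "\n\n".join(lines)

# Three English-to-French demonstrations:
examples = [
    ("sea otter", "loutre de mer"),
    ("cheese", "fromage"),
    ("peppermint", "menthe poivrée"),
]
prompt = build_few_shot_prompt(examples, "plush giraffe")
print(prompt)
```

The resulting string would be sent to the model as-is; the demonstrations prime it to translate the final query without any weight updates.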

https://openai.com/index/gpt-3-powers-next-generation-of-apps/