GPT models
GPT models (Generative Pre-trained Transformer): A Transformer-based large language model series from OpenAI, designed for advanced human-like text generation, reasoning, and code synthesis.
GPT models are a class of large language models (LLMs) pioneered by OpenAI, built on the decoder-only Transformer architecture. They undergo large-scale unsupervised pre-training on vast corpora of internet text, learning statistical language patterns that can later be adapted to downstream tasks (the "generative pre-training" of the name). Successive iterations, including the 175-billion-parameter GPT-3 and the current flagship GPT-5.2, demonstrate escalating capabilities: they excel at zero-shot and few-shot learning, handling diverse tasks such as advanced reasoning, code generation (e.g., Python, JavaScript), and creative content synthesis with high coherence. They serve as the core engine behind applications such as ChatGPT.
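The key architectural idea mentioned above, the decoder-only Transformer, rests on causal self-attention: each token may attend only to itself and earlier tokens, which is what lets the model be trained to predict the next token. The following is a minimal single-head sketch in NumPy with toy, hypothetical dimensions; it illustrates the causal mask only, not OpenAI's actual implementation (real GPT models use multi-head attention, learned positional information, and many stacked layers).

```python
import numpy as np

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head self-attention with a causal (decoder-only) mask.

    x:          (seq_len, d_model) token embeddings (toy example values)
    Wq, Wk, Wv: (d_model, d_head) projection matrices (hypothetical sizes)
    Returns (output, attention_weights).
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    d_head = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_head)              # (seq_len, seq_len)
    # Causal mask: position i may only attend to positions <= i,
    # so no token can "see" future tokens during next-token prediction.
    future = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores[future] = -np.inf
    # Numerically stable softmax over each row.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v, weights

rng = np.random.default_rng(0)
seq_len, d_model, d_head = 4, 8, 8
x = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))
out, attn = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)
```

Note that the first token's attention row places all of its weight on itself, and every row has zeros above the diagonal: that triangular structure is the "decoder-only" property.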
Related technologies
Recent Talks & Demos