GPT
Generative Pre-trained Transformer (GPT): A family of large language models (LLMs), pioneered by OpenAI, that uses the Transformer architecture to generate human-quality text, code, and multimodal content.
GPT is an advanced neural network architecture: specifically, a Generative Pre-trained Transformer. It operates as a foundation model, pre-trained on massive internet-scale datasets to learn grammar, context, and world knowledge; large GPT models contain hundreds of billions of parameters. This pre-training enables its core function: predicting the most probable next token in a sequence. Models like GPT-4 and GPT-4o leverage this capability to execute diverse tasks, from generating complex software code and translating languages to producing coherent, contextually relevant dialogue for applications like ChatGPT.
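The next-token mechanism is easy to observe directly. Below is a minimal sketch using the openly available GPT-2 model via the Hugging Face transformers library as a stand-in for the GPT family (GPT-4 and GPT-4o are accessible only through OpenAI's API); the prompt string is an arbitrary example.

```python
# Minimal sketch of next-token prediction with GPT-2 (a small, open
# member of the GPT family), used here as a stand-in for larger models.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The Transformer architecture allows the model to"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # logits shape: (batch=1, sequence_length, vocab_size)
    logits = model(**inputs).logits

# The logits at the final position score every vocabulary token as a
# candidate next token; softmax converts them into probabilities.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(token_id)!r:>15}  p={prob:.3f}")
```

Running this prints the five most probable continuations of the prompt; generating longer text is just this step repeated, appending one sampled token at a time.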