

Foundation Model

Large-scale deep learning models pre-trained on massive, broad data (e.g., Common Crawl) and then adapted to a wide range of downstream tasks such as natural language processing or image generation.

A Foundation Model (FM) is a large-scale deep learning architecture, typically a transformer, pre-trained with self-supervision on vast, general data; this pre-training establishes a broad, reusable knowledge base. The term, coined by the Stanford Institute for Human-Centered AI (HAI) in 2021, marks a paradigm shift: instead of training siloed, task-specific models, organizations use a single FM such as OpenAI's GPT-4, Meta's Llama 2, or Stability AI's Stable Diffusion as a starting point. The base model is then fine-tuned or prompted to perform diverse, specific tasks, from code generation to image classification, significantly reducing the cost and time of new AI application development.
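The adapt-once, reuse-everywhere pattern described above can be sketched with a toy stand-in. This is not a real foundation model: the frozen random-projection "backbone" and the two tasks below are illustrative assumptions. The shared frozen feature extractor plays the role of the pre-trained base, and each downstream task trains only a small head on top of it.

```python
# Toy sketch of the FM adaptation paradigm: one frozen, shared "base"
# reused by several downstream tasks, each training only a light head.
import random

random.seed(0)
DIM_IN, DIM_FEAT = 4, 8

# Stand-in for pre-training: a fixed projection shared by every task.
W_base = [[random.uniform(-1, 1) for _ in range(DIM_IN)]
          for _ in range(DIM_FEAT)]

def base_features(x):
    """Frozen backbone: never updated during downstream adaptation."""
    return [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in W_base]

def train_head(examples, epochs=300, lr=0.1):
    """Train only a small linear (perceptron) head on frozen features."""
    w, b = [0.0] * DIM_FEAT, 0.0
    for _ in range(epochs):
        for x, y in examples:
            h = base_features(x)
            pred = 1 if sum(wi * hi for wi, hi in zip(w, h)) + b > 0 else 0
            err = y - pred
            w = [wi + lr * err * hi for wi, hi in zip(w, h)]
            b += lr * err
    return w, b

def predict(head, x):
    w, b = head
    h = base_features(x)
    return 1 if sum(wi * hi for wi, hi in zip(w, h)) + b > 0 else 0

# Two distinct downstream "tasks" (hypothetical data) share the same base;
# only the cheap task heads differ.
task_a = [([1, 0, 0, 0], 1), ([0, 1, 0, 0], 1),
          ([0, 0, 1, 0], 0), ([0, 0, 0, 1], 0)]
task_b = [([1, 0, 0, 0], 0), ([0, 1, 0, 0], 1),
          ([0, 0, 1, 0], 0), ([0, 0, 0, 1], 1)]
head_a = train_head(task_a)
head_b = train_head(task_b)
```

The point of the sketch is the cost structure, not the model: the expensive part (`W_base`) is built once, while each new task adds only a head, which is why a single FM can serve many applications.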

https://en.wikipedia.org/wiki/Foundation_model