Foundation models

Liquid AI's 1.2B and 2.6B parameter models deliver strong per-parameter performance on edge devices through a memory-efficient, non-transformer architecture.

These Cactus models replace traditional Transformers with Liquid Foundation Models (LFMs) built on linear state-space designs. The 1.2B and 2.6B variants outperform larger models, including Llama 3.2-3B, on benchmarks while keeping a minimal memory footprint, and they support 32k-token context windows without the quadratic scaling cost of self-attention. This makes them a strong fit for on-device AI, robotics, and secure enterprise applications where hardware resources are limited.
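The scaling claim can be made concrete with a toy example. The sketch below is purely illustrative and assumes nothing about Liquid AI's actual implementation: the function name `ssm_scan`, the matrices A, B, C, and all dimensions are invented for demonstration. It shows why a linear state-space recurrence carries a fixed-size state from token to token, so memory stays constant and compute grows linearly with sequence length rather than quadratically as in self-attention.

```python
# Illustrative linear state-space recurrence (not Liquid AI's LFM code).
# A fixed-size hidden state is updated once per token, so no
# seq_len x seq_len attention matrix is ever materialized.
import numpy as np

def ssm_scan(A, B, C, x):
    """Run a discrete linear state-space model over a token sequence.

    A: (d_state, d_state) state transition matrix
    B: (d_state, d_in)    input projection
    C: (d_out, d_state)   output projection
    x: (seq_len, d_in)    input token embeddings
    Returns (seq_len, d_out) outputs.
    """
    d_state = A.shape[0]
    h = np.zeros(d_state)          # state size is independent of seq_len
    outputs = []
    for x_t in x:                  # one constant-cost update per token
        h = A @ h + B @ x_t
        outputs.append(C @ h)
    return np.stack(outputs)

# Toy usage: a 32k-token sequence costs 32k sequential state updates,
# with memory that does not grow with context length.
rng = np.random.default_rng(0)
d_state, d_in, d_out, seq_len = 16, 8, 8, 32_000
A = 0.9 * np.eye(d_state)                        # stable toy transition
B = rng.normal(size=(d_state, d_in)) * 0.1
C = rng.normal(size=(d_out, d_state)) * 0.1
x = rng.normal(size=(seq_len, d_in))
y = ssm_scan(A, B, C, x)
print(y.shape)                                   # (32000, 8)
```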

https://www.liquid.ai/blog/liquid-foundation-models