On-device AI

On-device AI: Local execution of machine learning models directly on the edge device, delivering very low latency and keeping sensitive data private.

This technology shifts AI model execution from cloud servers to the user's hardware: think smartphones, IoT devices, and vehicles. It is powered by specialized silicon (Apple's Neural Engine, Qualcomm's Hexagon NPU) and optimized runtimes (Google's LiteRT, formerly TensorFlow Lite). The key benefits are immediate: very low latency, because data never leaves the device; robust privacy, since sensitive information stays local; and reliable offline functionality. For example, Google's Gemini Nano runs on-device for instant summarization, while Qualcomm's Snapdragon 8 Gen 3 enables generative AI features directly on flagship Android phones, pushing performance to the edge.
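A core technique that makes these runtimes practical on phones and NPUs is post-training quantization: mapping float weights onto 8-bit integers via a scale and zero point. The snippet below is a minimal, self-contained sketch of the affine int8 scheme in plain Python; the function names are illustrative and not part of the LiteRT API.

```python
# Illustrative sketch of affine (asymmetric) int8 quantization, the kind of
# compression on-device runtimes commonly apply so models run in
# NPU-friendly integer arithmetic. Not a real runtime API.

def quantize(values):
    """Map float values onto int8 [-128, 127] with a scale and zero point."""
    rmin, rmax = min(values), max(values)
    rmin, rmax = min(rmin, 0.0), max(rmax, 0.0)  # range must include 0
    scale = (rmax - rmin) / 255 or 1.0           # guard all-zero input
    zero_point = int(round(-128 - rmin / scale))
    zero_point = max(-128, min(127, zero_point))
    q = [max(-128, min(127, int(round(v / scale)) + zero_point))
         for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats; rounding error is bounded by the scale."""
    return [(v - zero_point) * scale for v in q]

if __name__ == "__main__":
    weights = [-1.0, 0.0, 0.5, 1.0]
    q, scale, zp = quantize(weights)
    print(q, dequantize(q, scale, zp))
```

The trade-off is a small, bounded reconstruction error per weight in exchange for a 4x size reduction versus float32 and much faster integer math on mobile accelerators.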

https://ai.google.dev/edge/litert