NPU
Specialized silicon for AI inference: The NPU accelerates neural network tasks like object detection and real-time translation with superior power efficiency.
The Neural Processing Unit (NPU) is a specialized microprocessor engineered for high-efficiency AI and machine learning workloads. Unlike general-purpose CPUs or GPUs, the NPU's architecture is optimized for parallel, low-precision matrix multiplication (the core math of neural networks), dramatically accelerating inference tasks. This specialization enables real-time, on-device AI for consumer electronics: think Apple's Neural Engine, Intel's Core Ultra chips, or Google Pixel's image processing. By offloading these sustained AI computations, the NPU delivers significantly better performance per watt than traditional processors, conserving battery life while powering features like background blurring, facial recognition, and local LLMs.
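The low-precision matrix multiplication mentioned above can be sketched in plain Python. This is an illustrative model, not a real NPU API: floats are quantized to the int8 range, multiplied with integer arithmetic while accumulating in a wider type (as NPU multiply-accumulate arrays typically do), and the result is scaled back to floats.

```python
# Illustrative sketch of int8 quantized matrix multiply, the core
# operation NPUs accelerate. Function names are hypothetical.

def quantize(values, scale):
    """Map float values into the int8 range [-127, 127] using a scale factor."""
    return [max(-127, min(127, round(v / scale))) for v in values]

def int8_matmul(a, b):
    """Integer matrix multiply: int8 products accumulated in a wider
    integer type, mirroring an NPU's multiply-accumulate (MAC) array."""
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)] for i in range(rows)]

# Quantize float activations and weights, multiply in integers,
# then dequantize the accumulated result back to floats.
scale_a, scale_b = 0.05, 0.02
A = [quantize(row, scale_a) for row in [[0.5, -0.25], [1.0, 0.75]]]
B = [quantize(row, scale_b) for row in [[0.1, 0.2], [-0.3, 0.4]]]
C_int = int8_matmul(A, B)
C = [[x * scale_a * scale_b for x in row] for row in C_int]
```

Because the hardware only needs cheap 8-bit multipliers instead of full floating-point units, many more of them fit in the same silicon and power budget, which is where the performance-per-watt advantage comes from.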