
Technology

NVIDIA H200

The NVIDIA H200 Tensor Core GPU: the AI superchip engineered to accelerate generative AI and high-performance computing (HPC) workloads.

This is the H200, a major power-up for your data center: it’s the first GPU to offer 141GB of HBM3e memory, delivering a massive 4.8TB/s of memory bandwidth (a 1.4X increase over the H100). Built on the NVIDIA Hopper architecture, the H200 is designed to handle the largest large language models (LLMs) and complex HPC simulations. Expect serious throughput gains: the H200 boosts inference speed by up to 2X on models like Llama 2 70B compared to the H100. We're talking about next-level performance and energy efficiency for your most demanding AI factories.
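Why does memory bandwidth matter so much for LLM inference? During token-by-token decoding, the GPU must stream the model's weights from memory for every generated token, so bandwidth sets a hard ceiling on decode speed. The sketch below is a back-of-envelope estimate, not a benchmark: it assumes FP16 weights, batch size 1, and that each weight is read exactly once per token, with the 3.35TB/s (H100 SXM) and 4.8TB/s (H200) figures taken from NVIDIA's spec sheets.

```python
def tokens_per_second(n_params: float, bytes_per_param: float, bandwidth_tb_s: float) -> float:
    """Upper-bound decode rate when every weight is streamed once per token."""
    weight_bytes = n_params * bytes_per_param        # total model size in bytes
    bandwidth_bytes = bandwidth_tb_s * 1e12          # TB/s -> bytes/s
    return bandwidth_bytes / weight_bytes

LLAMA2_70B = 70e9  # parameters; ~140GB at FP16 (2 bytes each)

h100 = tokens_per_second(LLAMA2_70B, 2, 3.35)  # H100 SXM: 3.35 TB/s
h200 = tokens_per_second(LLAMA2_70B, 2, 4.8)   # H200:     4.8 TB/s

print(f"H100 bandwidth ceiling: {h100:.1f} tok/s")   # ~23.9 tok/s
print(f"H200 bandwidth ceiling: {h200:.1f} tok/s")   # ~34.3 tok/s
print(f"Speedup from bandwidth alone: {h200 / h100:.2f}x")  # ~1.43x
```

Note that bandwidth alone accounts for roughly a 1.4X speedup; the gap to the up-to-2X figure comes from other factors (the 141GB capacity also lets the full FP16 model fit on a single H200, which it cannot on an 80GB H100).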

https://www.nvidia.com/en-us/data-center/h200-gpu/

