Nvidia H200
Compare prices for Nvidia H200 across cloud providers
Jan. 16, 2025 (updated)
The Nvidia H200, released in 2024, builds on the Hopper architecture with improved memory and scalability. It is designed to support advanced AI research and large-scale distributed training, offering enhanced performance over its predecessor, the H100.
Nvidia H200 prices
Based on our data, the Nvidia H200 may be available from the following cloud providers:
Provider | GPUs | VRAM | vCPUs | RAM | Price/h | Source |
---|---|---|---|---|---|---|
CUDO Compute | 1x H200 | 141GB | -- | -- | $2.49 | Source |
DataCrunch | 1x H200 | 141GB | 44 | 185GB | $3.03 | Source |
RunPod | 1x H200 | 141GB | -- | -- | $3.99 | Source |
DataCrunch | 2x H200 | 282GB | 88 | 370GB | $6.06 | Source |
DataCrunch | 4x H200 | 564GB | 176 | 740GB | $12.12 | Source |
Nebius | 8x H200 | 1128GB | -- | -- | $20.72 | Source |
DataCrunch | 8x H200 | 1128GB | 176 | 1450GB | $24.24 | Source |
Lambda Labs | 8x H200 | 1128GB | 224 | -- | On Request | Source |
The Cloud Minders | 8x H200 | 1128GB | 384 | 2048GB | On Request | Source |
Note: Prices are subject to change and may vary by region and other factors not listed here. For some GPUs, I include links to Shadeform (the sponsor) so you can check if they're available right now. I don’t earn a commission when you click on these links, but their monthly sponsorship helps me keep the site running.
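To compare multi-GPU instances against single-GPU offers, it helps to normalize everything to price per GPU-hour. A quick sketch using the on-demand figures quoted in the table above (prices change often, so treat these as a snapshot, not live data):

```python
# Hourly on-demand prices from the table above (snapshot; subject to change).
# Each entry: (provider, number of H200 GPUs, total price per hour in USD).
offers = [
    ("CUDO Compute", 1, 2.49),
    ("DataCrunch",   1, 3.03),
    ("RunPod",       1, 3.99),
    ("DataCrunch",   2, 6.06),
    ("DataCrunch",   4, 12.12),
    ("Nebius",       8, 20.72),
    ("DataCrunch",   8, 24.24),
]

# Sort by effective price per GPU-hour, cheapest first.
for provider, gpus, price in sorted(offers, key=lambda o: o[2] / o[1]):
    print(f"{provider:13} {gpus}x H200  ${price / gpus:.2f}/GPU-hour")
```

Note that DataCrunch's per-GPU price stays flat across instance sizes ($3.03/GPU-hour), while Nebius's 8x node works out cheaper per GPU ($2.59) than most single-GPU offers.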
Nvidia H200 specs
Feature | H200 SXM¹ | H200 NVL¹ |
---|---|---|
FP64 | 34 TFLOPS | 34 TFLOPS |
FP64 Tensor Core | 67 TFLOPS | 67 TFLOPS |
FP32 | 67 TFLOPS | 67 TFLOPS |
TF32 Tensor Core² | 989 TFLOPS | 989 TFLOPS |
BFLOAT16 Tensor Core² | 1,979 TFLOPS | 1,979 TFLOPS |
FP16 Tensor Core² | 1,979 TFLOPS | 1,979 TFLOPS |
FP8 Tensor Core² | 3,958 TFLOPS | 3,958 TFLOPS |
INT8 Tensor Core² | 3,958 TFLOPS | 3,958 TFLOPS |
GPU Memory | 141GB | 141GB |
GPU Memory Bandwidth | 4.8TB/s | 4.8TB/s |
Decoders | 7 NVDEC, 7 JPEG | 7 NVDEC, 7 JPEG |
Confidential Computing | Supported | Supported |
Max Thermal Design Power (TDP) | Up to 700W | Up to 600W |
Multi-Instance GPUs | Up to 7 MIGs @18GB each | Up to 7 MIGs @18GB each |
Form Factor | SXM | PCIe |
Interconnect | NVIDIA NVLink™: 900GB/s, PCIe Gen5: 128GB/s | 2- or 4-way NVIDIA NVLink bridge: 900GB/s, PCIe Gen5: 128GB/s |
¹ Preliminary specifications. May be subject to change.
² With sparsity.
Source: official Nvidia H200 datasheet.
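The Tensor Core figures marked ² assume Nvidia's 2:1 structured sparsity, which doubles the effective rate; for dense workloads, expect roughly half the quoted number. A quick sanity check of the dense rates implied by the table (figures as listed above):

```python
# Sparse Tensor Core throughput from the spec table above.
# INT8 is measured in TOPS rather than TFLOPS, but the halving
# for dense workloads applies the same way.
sparse_tflops = {
    "TF32": 989,
    "BF16": 1_979,
    "FP16": 1_979,
    "FP8": 3_958,
    "INT8": 3_958,
}

# 2:1 structured sparsity doubles throughput, so dense = sparse / 2.
dense_tflops = {fmt: rate / 2 for fmt, rate in sparse_tflops.items()}

for fmt, rate in dense_tflops.items():
    print(f"{fmt:5} dense: {rate:,.1f} TFLOPS")
```

So an FP8 workload without sparsity lands around 1,979 TFLOPS, not the headline 3,958.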