Nvidia V100
Compare prices for Nvidia V100 across cloud providers
Updated Jan. 16, 2025
Introduced in 2017, the Nvidia V100 is built on the Volta architecture and was Nvidia's first data-center GPU with Tensor Cores, making it suitable for both AI training and inference. It remains a popular choice for research and enterprise projects.
Nvidia V100 prices
Based on our data, the Nvidia V100 may be available from the following cloud providers:
| Provider | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
|---|---|---|---|---|---|---|
| | 1x V100 | 16GB | 6 | 32GB | $0.24 | Source |
| | 1x V100 | 16GB | 6 | 23GB | $0.39 | Launch |
| | 1x V100 | 16GB | 4 | 16GB | $0.54 | Source |
| | 1x V100 | 16GB | 8 | 45GB | $0.77 | Source |
| | 1x V100 | 16GB | 8 | 44GB | $0.85 | Source |
| | 1x V100 | 16GB | 12 | 56GB | $1.38 | Source |
| | 2x V100 | 32GB | 16 | 90GB | $1.55 | Source |
| | 1x V100 | 16GB | 8 | 45GB | $1.97 | Source |
| | 2x V100 | 32GB | 16 | 90GB | $2.01 | Source |
| | 1x V100 | 32GB | 8 | 30GB | $2.34 | Launch |
| | 1x V100 | 16GB | 8 | 30GB | $2.34 | Launch |
| | 3x V100 | 48GB | 24 | 120GB | $2.65 | Source |
| | 1x V100 | 16GB | 6 | 112GB | $3.06 | Source |
| | 1x V100 | 16GB | 8 | 61GB | $3.06 | Launch |
| | 4x V100 | 64GB | 32 | 180GB | $3.10 | Source |
| | 4x V100 | 64GB | 48 | 225GB | $3.32 | Source |
| | 1x V100 | 32GB | 12 | 92GB | $3.51 | Source |
| | 2x V100 | 32GB | 18 | 90GB | $3.94 | Source |
| | 8x V100 | 128GB | 92 | 448GB | $4.40 | Launch |
| | 2x V100 | 32GB | 12 | 224GB | $6.12 | Source |
| | 4x V100 | 64GB | 36 | 180GB | $7.89 | Source |
| | 4x V100 | 64GB | 24 | 448GB | $12.24 | Source |
| | 4x V100 | 64GB | 32 | 244GB | $12.24 | Launch |
| | 4x V100 | 64GB | 24 | 448GB | $13.46 | Source |
| | 8x V100 | 256GB | 40 | 672GB | $22.03 | Source |
| | 8x V100 | 128GB | 64 | 488GB | $24.48 | Launch |
Note: Prices are subject to change and may vary by region and other factors not listed here. For some GPUs, I include links to Shadeform (the sponsor) so you can check if they're available right now. I don’t earn a commission when you click on these links, but their monthly sponsorship helps me keep the site running.
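Because the offers above bundle different GPU counts, vCPUs, and RAM, the cheapest way to compare them is by price per GPU-hour rather than instance price. Here is a minimal Python sketch of that normalization; the `offer-*` labels are placeholders (provider names are not listed here) and the prices mirror three rows from the table above:

```python
# Normalize cloud offers to price per GPU-hour so multi-GPU instances
# are comparable with single-GPU ones. Labels are placeholders; the
# numbers mirror three rows from the table above.
offers = [
    {"label": "offer-a", "gpus": 1, "price_per_hour": 0.24},
    {"label": "offer-b", "gpus": 8, "price_per_hour": 4.40},
    {"label": "offer-c", "gpus": 4, "price_per_hour": 12.24},
]

for offer in sorted(offers, key=lambda o: o["price_per_hour"] / o["gpus"]):
    per_gpu = offer["price_per_hour"] / offer["gpus"]
    print(f"{offer['label']}: ${per_gpu:.2f} per GPU-hour "
          f"({offer['gpus']}x V100 at ${offer['price_per_hour']:.2f}/h)")
```

For example, the 8x V100 instance at $4.40/h works out to $0.55 per GPU-hour, which is closer to the cheapest single-GPU offer than its headline price suggests.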
Nvidia V100 specs
| | V100 PCIe | V100 SXM2 | V100S PCIe |
|---|---|---|---|
| GPU Architecture | NVIDIA Volta | NVIDIA Volta | NVIDIA Volta |
| NVIDIA Tensor Cores | 640 | 640 | 640 |
| NVIDIA CUDA® Cores | 5,120 | 5,120 | 5,120 |
| Double-Precision Performance | 7 TFLOPS | 7.8 TFLOPS | 8.2 TFLOPS |
| Single-Precision Performance | 14 TFLOPS | 15.7 TFLOPS | 16.4 TFLOPS |
| Tensor Performance | 112 TFLOPS | 125 TFLOPS | 130 TFLOPS |
| GPU Memory | 32 GB / 16 GB HBM2 | 32 GB HBM2 | 32 GB HBM2 |
| Memory Bandwidth | 900 GB/sec | 900 GB/sec | 1134 GB/sec |
| ECC | Yes | Yes | Yes |
| Interconnect Bandwidth | 32 GB/sec | 300 GB/sec | 32 GB/sec |
| System Interface | PCIe Gen3 | NVIDIA NVLink™ | PCIe Gen3 |
| Form Factor | PCIe Full Height/Length | SXM2 | PCIe Full Height/Length |
| Max Power Consumption | 250 W | 300 W | 250 W |
| Thermal Solution | Passive | Passive | Passive |
| Compute APIs | CUDA, DirectCompute, OpenCL™, OpenACC® | CUDA, DirectCompute, OpenCL™, OpenACC® | CUDA, DirectCompute, OpenCL™, OpenACC® |
Source: official Nvidia V100 datasheet.
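Since the table shows that V100 instances can come with either 16 GB or 32 GB per GPU, it can be worth confirming what an instance actually exposes before training. A minimal sketch using PyTorch (assuming it is installed with CUDA support; `nvidia-smi` reports similar information):

```python
import torch

# List the V100s visible to this process: name, memory size, and compute
# capability (Volta reports 7.0, which is what enables Tensor Cores for
# FP16 mixed-precision work).
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    mem_gb = props.total_memory / 1024**3
    print(f"GPU {i}: {props.name}, {mem_gb:.0f} GB, "
          f"compute capability {props.major}.{props.minor}")
```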