Nvidia V100

Compare prices for Nvidia V100 across cloud providers.

Pricing

| Provider | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
|---|---|---|---|---|---|---|
| DataCrunch | 1x V100 | 16GB | 6 | 23GB | $0.62 | |
| OVH | 1x V100 | 16GB | 8 | 45GB | $0.77 | |
| Koyeb | 1x V100 | 16GB | 8 | 44GB | $0.85 | |
| OVH | 2x V100 | 32GB | 16 | 90GB | $1.55 | |
| OVH | 1x V100 | 16GB | 8 | 45GB | $1.97 | |
| Paperspace | 1x V100 | 16GB | 8 | 30GB | $2.30 | |
| Paperspace | 1x V100 | 32GB | 8 | 30GB | $2.30 | |
| AWS | 1x V100 | 16GB | 8 | 61GB | $3.06 | Launch |
| Azure | 1x V100 | 16GB | 6 | 112GB | $3.06 | |
| OVH | 4x V100 | 64GB | 32 | 180GB | $3.10 | |
| OVH | 2x V100 | 32GB | 18 | 90GB | $3.94 | |
| Lambda Labs | 8x V100 | 128GB | 92 | 448GB | $4.40 | Launch |
| Azure | 2x V100 | 32GB | 12 | 224GB | $6.12 | |
| OVH | 4x V100 | 64GB | 36 | 180GB | $7.89 | |
| Azure | 4x V100 | 64GB | 24 | 448GB | $12.24 | |
| AWS | 4x V100 | 64GB | 32 | 244GB | $12.24 | Launch |
| Azure | 4x V100 | 64GB | 24 | 448GB | $13.46 | |
| Azure | 8x V100 | 256GB | 40 | 672GB | $22.03 | |
| AWS | 8x V100 | 128GB | 64 | 488GB | $24.48 | Reserve |

Note: Prices are subject to change and may vary by region and other factors not listed here. I included links to launch GPUs using Shadeform (our sponsor) so you can see whether they're available right now. There are no fees to use their service, and I don't get a commission when you use these links.
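Since the table mixes 1x, 2x, 4x, and 8x instances, raw hourly prices aren't directly comparable. A quick sketch of normalizing to price per GPU-hour (using a handful of rows copied from the table above):

```python
# (provider, gpu_count, hourly_price) tuples taken from the pricing table above.
offers = [
    ("DataCrunch", 1, 0.62),
    ("OVH", 2, 1.55),
    ("OVH", 4, 3.10),
    ("Lambda Labs", 8, 4.40),
    ("AWS", 8, 24.48),
]

# Sort by effective price per single GPU, cheapest first.
for provider, n, price in sorted(offers, key=lambda o: o[2] / o[1]):
    print(f"{provider}: ${price / n:.2f} per GPU-hour ({n}x V100)")
```

By this measure the Lambda Labs 8x node ($0.55 per GPU-hour) edges out even the cheapest single-GPU offer, though of course you pay for all eight GPUs whether you use them or not.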

Specs

| | V100 PCIe | V100 SXM2 | V100S PCIe |
|---|---|---|---|
| GPU Architecture | NVIDIA Volta | NVIDIA Volta | NVIDIA Volta |
| NVIDIA Tensor Cores | 640 | 640 | 640 |
| NVIDIA CUDA® Cores | 5,120 | 5,120 | 5,120 |
| Double-Precision Performance | 7 TFLOPS | 7.8 TFLOPS | 8.2 TFLOPS |
| Single-Precision Performance | 14 TFLOPS | 15.7 TFLOPS | 16.4 TFLOPS |
| Tensor Performance | 112 TFLOPS | 125 TFLOPS | 130 TFLOPS |
| GPU Memory | 32 GB / 16 GB HBM2 | 32 GB HBM2 | 32 GB HBM2 |
| Memory Bandwidth | 900 GB/sec | 900 GB/sec | 1134 GB/sec |
| ECC | Yes | Yes | Yes |
| Interconnect Bandwidth | 32 GB/sec | 300 GB/sec | 32 GB/sec |
| System Interface | PCIe Gen3 | NVIDIA NVLink™ | PCIe Gen3 |
| Form Factor | PCIe Full Height/Length | SXM2 | PCIe Full Height/Length |
| Max Power Consumption | 250 W | 300 W | 250 W |
| Thermal Solution | Passive | Passive | Passive |
| Compute APIs | CUDA, DirectCompute, OpenCL™, OpenACC® | CUDA, DirectCompute, OpenCL™, OpenACC® | CUDA, DirectCompute, OpenCL™, OpenACC® |
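Combining the spec sheet with the pricing table gives a rough cost-efficiency figure. A minimal sketch, assuming the cheapest 1x rate above ($0.62/h at DataCrunch) and the peak single-precision numbers from the spec table (real-world throughput will be lower than peak):

```python
# Peak FP32 TFLOPS per V100 variant, from the spec table above.
fp32_tflops = {"V100 PCIe": 14.0, "V100 SXM2": 15.7, "V100S PCIe": 16.4}

# Cheapest 1x V100 hourly rate from the pricing table (assumed here to
# apply regardless of variant; providers don't always say which they run).
price_per_hour = 0.62

for variant, tflops in fp32_tflops.items():
    print(f"{variant}: {tflops / price_per_hour:.1f} peak FP32 TFLOPS per $/h")
```

At $0.62/h even the slowest PCIe variant delivers roughly 22.6 peak FP32 TFLOPS per dollar-hour, which is why the V100 remains popular for budget training despite its age.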

You can find the official V100 datasheet on Nvidia's website.


Looking for alternatives? Compare Nvidia V100 against other GPUs.