Nvidia V100

Compare prices for Nvidia V100 across cloud providers

Jan. 16, 2025 (updated)

Introduced in 2017, the Nvidia V100 is built on the Volta architecture, offering versatile performance for AI training and inference. It remains a popular choice for research and enterprise projects.

Nvidia V100 prices

Based on our data, the Nvidia V100 may be available from the following cloud providers:

| Provider | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
|---|---|---|---|---|---|---|
| The Cloud Minders | 1x V100 | 16GB | 6 | 32GB | $0.24 | Source |
| DataCrunch | 1x V100 | 16GB | 6 | 23GB | $0.39 | Launch |
| CUDO Compute | 1x V100 | 16GB | 4 | 16GB | $0.54 | Source |
| OVH | 1x V100 | 16GB | 8 | 45GB | $0.77 | Source |
| Koyeb | 1x V100 | 16GB | 8 | 44GB | $0.85 | Source |
| Exoscale | 1x V100 | 16GB | 12 | 56GB | $1.38 | Source |
| OVH | 2x V100 | 32GB | 16 | 90GB | $1.55 | Source |
| OVH | 1x V100 | 16GB | 8 | 45GB | $1.97 | Source |
| Exoscale | 2x V100 | 32GB | 16 | 90GB | $2.01 | Source |
| Paperspace | 1x V100 | 32GB | 8 | 30GB | $2.34 | Launch |
| Paperspace | 1x V100 | 16GB | 8 | 30GB | $2.34 | Launch |
| Exoscale | 3x V100 | 48GB | 24 | 120GB | $2.65 | Source |
| Azure | 1x V100 | 16GB | 6 | 112GB | $3.06 | Source |
| AWS | 1x V100 | 16GB | 8 | 61GB | $3.06 | Launch |
| OVH | 4x V100 | 64GB | 32 | 180GB | $3.10 | Source |
| Exoscale | 4x V100 | 64GB | 48 | 225GB | $3.32 | Source |
| Alibaba Cloud | 1x V100 | 32GB | 12 | 92GB | $3.51 | Source |
| OVH | 2x V100 | 32GB | 18 | 90GB | $3.94 | Source |
| Lambda Labs | 8x V100 | 128GB | 92 | 448GB | $4.40 | Launch |
| Azure | 2x V100 | 32GB | 12 | 224GB | $6.12 | Source |
| OVH | 4x V100 | 64GB | 36 | 180GB | $7.89 | Source |
| Azure | 4x V100 | 64GB | 24 | 448GB | $12.24 | Source |
| AWS | 4x V100 | 64GB | 32 | 244GB | $12.24 | Launch |
| Azure | 4x V100 | 64GB | 24 | 448GB | $13.46 | Source |
| Azure | 8x V100 | 256GB | 40 | 672GB | $22.03 | Source |
| AWS | 8x V100 | 128GB | 64 | 488GB | $24.48 | Launch |

Note: Prices are subject to change and may vary by region and other factors not listed here. For some GPUs, I include links to Shadeform (the sponsor) so you can check if they're available right now. I don’t earn a commission when you click on these links, but their monthly sponsorship helps me keep the site running.
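Because multi-GPU instances bundle several cards, dividing the hourly price by the GPU count gives a rough per-GPU rate that is easier to compare across providers. A minimal Python sketch, using a few rows copied from the table above (prices are illustrative and may have changed since publication):

```python
# Rough per-GPU normalization of the hourly prices listed above.
# Figures are copied from the price table and may be out of date.
offers = [
    {"provider": "The Cloud Minders", "gpus": 1, "price_per_hour": 0.24},
    {"provider": "OVH",               "gpus": 4, "price_per_hour": 3.10},
    {"provider": "Lambda Labs",       "gpus": 8, "price_per_hour": 4.40},
    {"provider": "AWS",               "gpus": 8, "price_per_hour": 24.48},
]

# Sort by effective per-GPU cost, cheapest first.
for offer in sorted(offers, key=lambda o: o["price_per_hour"] / o["gpus"]):
    per_gpu = offer["price_per_hour"] / offer["gpus"]
    print(f'{offer["provider"]}: ${per_gpu:.2f} per V100 per hour')
```

Keep in mind that the per-GPU view ignores the bundled vCPUs and RAM, which also differ widely between offers.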

Nvidia V100 specs

| Specification | V100 PCIe | V100 SXM2 | V100S PCIe |
|---|---|---|---|
| GPU Architecture | NVIDIA Volta | NVIDIA Volta | NVIDIA Volta |
| NVIDIA Tensor Cores | 640 | 640 | 640 |
| NVIDIA CUDA® Cores | 5,120 | 5,120 | 5,120 |
| Double-Precision Performance | 7 TFLOPS | 7.8 TFLOPS | 8.2 TFLOPS |
| Single-Precision Performance | 14 TFLOPS | 15.7 TFLOPS | 16.4 TFLOPS |
| Tensor Performance | 112 TFLOPS | 125 TFLOPS | 130 TFLOPS |
| GPU Memory | 32 GB / 16 GB HBM2 | 32 GB HBM2 | 32 GB HBM2 |
| Memory Bandwidth | 900 GB/sec | 900 GB/sec | 1134 GB/sec |
| ECC | Yes | Yes | Yes |
| Interconnect Bandwidth | 32 GB/sec | 300 GB/sec | 32 GB/sec |
| System Interface | PCIe Gen3 | NVIDIA NVLink™ | PCIe Gen3 |
| Form Factor | PCIe Full Height/Length | SXM2 | PCIe Full Height/Length |
| Max Power Consumption | 250 W | 300 W | 250 W |
| Thermal Solution | Passive | Passive | Passive |
| Compute APIs | CUDA, DirectCompute, OpenCL™, OpenACC® | CUDA, DirectCompute, OpenCL™, OpenACC® | CUDA, DirectCompute, OpenCL™, OpenACC® |

Source: official Nvidia V100 datasheet.
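For a rough price-performance figure, you can relate the price table to the spec table by dividing an hourly price by the card's peak tensor throughput. A minimal sketch, assuming the PCIe variant's 112 TFLOPS peak and the cheapest 1x V100 offer listed above; peak TFLOPS is a theoretical ceiling, so real workloads will see a higher effective cost:

```python
# Dollars per peak tensor-TFLOPS-hour for a single V100 PCIe.
# Both numbers are taken from the tables above and are illustrative only.
tensor_tflops_pcie = 112    # peak tensor performance, V100 PCIe (spec table)
price_per_hour = 0.24       # cheapest 1x V100 offer (price table)

cost_per_tflops_hour = price_per_hour / tensor_tflops_pcie
print(f"${cost_per_tflops_hour:.4f} per peak TFLOPS-hour")  # ~$0.0021
```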