Nvidia A100

Compare prices for Nvidia A100 across cloud providers

Updated: Jan. 16, 2025

Launched in 2020, the Nvidia A100 is built on the Ampere architecture and is designed for large-scale AI training and inference. Its Multi-Instance GPU (MIG) feature lets a single card be partitioned into as many as seven isolated instances, so resources can be allocated flexibly across shared enterprise workloads.
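
As a minimal sketch of what this looks like from inside a cloud instance (assuming a CUDA-enabled PyTorch install; the snippet is illustrative only), the code below lists the devices the process can see along with their memory. On a MIG-partitioned A100, the visible device is the slice you were allocated rather than the full 40GB or 80GB card.

```python
# Minimal sketch (assumes a CUDA build of PyTorch on the cloud instance).
# Lists the devices the process can see; on a MIG-partitioned A100 the
# reported total_memory is the size of the allocated slice, not the card.
import torch

if torch.cuda.is_available():
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"cuda:{i}  {props.name}  "
              f"{props.total_memory / 1024**3:.1f} GiB  "
              f"{props.multi_processor_count} SMs")
else:
    print("No CUDA device visible to this process")
```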

Nvidia A100 prices

Based on our data, the Nvidia A100 should be available from the following cloud providers:

Provider | GPUs | VRAM (total) | vCPUs | RAM | Price/h | Link
Build AI | 1x A100 | 40GB | 15 | 200GB | $1.05 | Source
Lambda Labs | 1x A100 | 40GB | 30 | 200GB | $1.29 | Launch
DataCrunch | 1x A100 | 40GB | 22 | 120GB | $1.29 | Launch
Hyperstack | 1x A100 | 80GB | 28 | 120GB | $1.35 | Launch
Hyperstack | 1x A100 | 80GB | 31 | 240GB | $1.40 | Source
Build AI | 1x A100 | 40GB | 15 | 200GB | $1.42 | Source
Build AI | 1x A100 | 80GB | 30 | 225GB | $1.45 | Source
Crusoe | 1x A100 | 40GB | 12 | 120GB | $1.45 | Launch
Oblivus | 1x A100 | 80GB | 28 | 120GB | $1.47 | --
RunPod | 1x A100 | 80GB | 8 | 117GB | $1.64 | Source
FluidStack | 1x A100 | 40GB | 32 | 128GB | $1.65 | Source
Crusoe | 1x A100 | 80GB | 12 | 120GB | $1.65 | Launch
Massed Compute | 1x A100 | 80GB | 12 | 64GB | $1.72 | Launch
Civo | 1x A100 | 40GB | 8 | 64GB | $1.79 | Source
FluidStack | 1x A100 | 80GB | 48 | 256GB | $1.80 | Launch
CUDO Compute | 1x A100 | 80GB | 12 | 48GB | $1.83 | Source
RunPod | 1x A100 | 80GB | 16 | 125GB | $1.89 | Source
DataCrunch | 1x A100 | 80GB | 22 | 120GB | $1.89 | Launch
Build AI | 1x A100 | 80GB | 30 | 225GB | $1.97 | Source
Civo | 1x A100 | 80GB | 12 | 128GB | $2.38 | Source
Fly.io | 1x A100 | 40GB | -- | -- | $2.50 | Source
Lambda Labs | 2x A100 | 80GB | 60 | 400GB | $2.58 | Launch
Vultr | 1x A100 | 80GB | 12 | 120GB | $2.60 | Launch
Koyeb | 1x A100 | 80GB | 15 | 180GB | $2.70 | Source
OVH | 1x A100 | 80GB | 15 | 180GB | $3.07 | Source
Paperspace | 1x A100 | 40GB | 8 | 90GB | $3.19 | Launch
Paperspace | 1x A100 | 80GB | 8 | 90GB | $3.28 | Launch
Fly.io | 1x A100 | 80GB | -- | -- | $3.50 | Source
Civo | 2x A100 | 80GB | 16 | 128GB | $3.57 | Source
GCP | 1x A100 | 40GB | 12 | 85GB | $3.67 | Source
DataCrunch | 2x A100 | 160GB | 44 | 240GB | $3.78 | Source
Replicate | 1x A100 | 40GB | 10 | 72GB | $4.14 | Source
Civo | 2x A100 | 160GB | 24 | 256GB | $4.76 | Source
Azure | 1x A100 | 80GB | 24 | 220GB | $4.78 | Launch
Replicate | 1x A100 | 80GB | 10 | 144GB | $5.04 | Source
Lambda Labs | 4x A100 | 160GB | 120 | 800GB | $5.16 | Launch
DataCrunch | 4x A100 | 160GB | 88 | 480GB | $5.16 | Source
OVH | 2x A100 | 160GB | 30 | 360GB | $6.15 | Source
Civo | 4x A100 | 160GB | 32 | 255GB | $7.14 | Source
GCP | 2x A100 | 80GB | 24 | 170GB | $7.35 | Source
DataCrunch | 4x A100 | 320GB | 88 | 480GB | $7.56 | Source
Replicate | 2x A100 | 80GB | 20 | 144GB | $8.28 | Source
Civo | 4x A100 | 320GB | 48 | 512GB | $9.52 | Source
Replicate | 2x A100 | 160GB | 20 | 288GB | $10.08 | Source
Lambda Labs | 8x A100 | 320GB | 124 | 1800GB | $10.32 | Launch
OVH | 4x A100 | 320GB | 60 | 720GB | $12.29 | Source
Vultr | 8x A100 | 640GB | 112 | 2048GB | $12.90 | Source
Civo | 8x A100 | 320GB | 64 | 512GB | $14.29 | Source
GCP | 4x A100 | 160GB | 48 | 340GB | $14.69 | Source
DataCrunch | 8x A100 | 640GB | 176 | 960GB | $15.12 | Source
Replicate | 4x A100 | 160GB | 40 | 288GB | $16.56 | Source
Civo | 8x A100 | 640GB | 96 | 1024GB | $19.05 | Source
Replicate | 4x A100 | 320GB | 40 | 576GB | $20.16 | Source
Azure | 8x A100 | 320GB | 96 | 900GB | $27.20 | Source
GCP | 8x A100 | 320GB | 96 | 680GB | $29.39 | Source
AWS | 8x A100 | 320GB | 96 | 1152GB | $32.77 | Launch
Replicate | 8x A100 | 640GB | 80 | 960GB | $40.32 | Source
GCP | 16x A100 | 640GB | 96 | 1360GB | $55.74 | Source

Note: Prices are subject to change and may vary by region and other factors not listed here. For some GPUs, the Launch links go to Shadeform (the sponsor) so you can check whether they're available right now. I don't earn a commission when you click on these links, but their monthly sponsorship helps me keep the site running.
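
When comparing rows in the table, it can help to normalize prices to a concrete workload. The sketch below is illustrative only: it hard-codes a few rows from the table above (which will drift as providers update pricing) and works out the per-GPU hourly rate plus the cost of a hypothetical 1,000 A100-hour job.

```python
# Rough, illustrative cost comparison; prices are hard-coded from the table
# above and will drift as providers change their on-demand rates.
offers = [
    # (provider, gpu_count, total_vram_gb, instance_price_per_hour_usd)
    ("Build AI",    1,  40, 1.05),
    ("Hyperstack",  1,  80, 1.35),
    ("Lambda Labs", 8, 320, 10.32),
    ("AWS",         8, 320, 32.77),
]

gpu_hours_needed = 1_000  # hypothetical job size: 1,000 A100-hours

for provider, gpus, vram_gb, price in offers:
    per_gpu_hour = price / gpus
    job_cost = per_gpu_hour * gpu_hours_needed
    print(f"{provider:12s} ${per_gpu_hour:5.2f}/GPU-hour  "
          f"~${job_cost:,.0f} for {gpu_hours_needed:,} GPU-hours")
```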

Nvidia A100 specs

Specification | A100 40GB PCIe | A100 80GB PCIe | A100 40GB SXM | A100 80GB SXM
FP64 | 9.7 TFLOPS | 9.7 TFLOPS | 9.7 TFLOPS | 9.7 TFLOPS
FP64 Tensor Core | 19.5 TFLOPS | 19.5 TFLOPS | 19.5 TFLOPS | 19.5 TFLOPS
FP32 | 19.5 TFLOPS | 19.5 TFLOPS | 19.5 TFLOPS | 19.5 TFLOPS
Tensor Float 32 (TF32) | 156 TFLOPS (312 TFLOPS*) | 156 TFLOPS (312 TFLOPS*) | 156 TFLOPS (312 TFLOPS*) | 156 TFLOPS (312 TFLOPS*)
BFLOAT16 Tensor Core | 312 TFLOPS (624 TFLOPS*) | 312 TFLOPS (624 TFLOPS*) | 312 TFLOPS (624 TFLOPS*) | 312 TFLOPS (624 TFLOPS*)
FP16 Tensor Core | 312 TFLOPS (624 TFLOPS*) | 312 TFLOPS (624 TFLOPS*) | 312 TFLOPS (624 TFLOPS*) | 312 TFLOPS (624 TFLOPS*)
INT8 Tensor Core | 624 TOPS (1248 TOPS*) | 624 TOPS (1248 TOPS*) | 624 TOPS (1248 TOPS*) | 624 TOPS (1248 TOPS*)
GPU Memory | 40GB HBM2 | 80GB HBM2e | 40GB HBM2 | 80GB HBM2e
GPU Memory Bandwidth | 1,555GB/s | 1,935GB/s | 1,555GB/s | 2,039GB/s
Max Thermal Design Power (TDP) | 250W | 300W | 400W | 400W
Multi-Instance GPU | Up to 7 MIGs @ 5GB | Up to 7 MIGs @ 10GB | Up to 7 MIGs @ 5GB | Up to 7 MIGs @ 10GB
Form Factor | PCIe | PCIe | SXM | SXM
Interconnect | NVIDIA® NVLink® Bridge for 2 GPUs: 600GB/s**; PCIe Gen4: 64GB/s | NVIDIA® NVLink® Bridge for 2 GPUs: 600GB/s**; PCIe Gen4: 64GB/s | NVLink: 600GB/s; PCIe Gen4: 64GB/s | NVLink: 600GB/s; PCIe Gen4: 64GB/s
Server Options | Partner and NVIDIA-Certified Systems™ with 1-8 GPUs | Partner and NVIDIA-Certified Systems™ with 1-8 GPUs | NVIDIA HGX™ A100-Partner and NVIDIA-Certified Systems with 4, 8, or 16 GPUs; NVIDIA DGX™ A100 with 8 GPUs | NVIDIA HGX™ A100-Partner and NVIDIA-Certified Systems with 4, 8, or 16 GPUs; NVIDIA DGX™ A100 with 8 GPUs

* With sparsity
** SXM4 GPUs via HGX A100 server boards; PCIe GPUs via NVLink Bridge for up to two GPUs

Source: official Nvidia A100 datasheet.
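
Combining the two tables gives a back-of-the-envelope price-performance number. The sketch below divides a few hourly prices from the table above by the dense BF16 Tensor Core peak from the spec table (312 TFLOPS per A100). This assumes BF16 Tensor Core throughput is the metric that matters for your workload, and real jobs run well below peak, so treat the results as optimistic bounds rather than benchmarks.

```python
# Back-of-the-envelope $ per PFLOP-hour at the dense BF16 Tensor Core peak
# (312 TFLOPS per A100, from the spec table). Real utilization is well below
# peak, so these are optimistic bounds, not benchmarks.
PEAK_BF16_TFLOPS_PER_GPU = 312  # dense; 624 with sparsity

offers = [
    # (label, gpu_count, instance_price_per_hour_usd) -- rows from the price table
    ("DataCrunch 1x A100 80GB", 1, 1.89),
    ("GCP 1x A100 40GB",        1, 3.67),
    ("Lambda Labs 8x A100",     8, 10.32),
]

for label, gpus, price in offers:
    peak_pflops = gpus * PEAK_BF16_TFLOPS_PER_GPU / 1_000  # instance peak, PFLOPS
    usd_per_pflop_hour = price / peak_pflops
    print(f"{label:24s} {peak_pflops:5.2f} PFLOPS peak  "
          f"${usd_per_pflop_hour:5.2f} per PFLOP-hour")
```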