Nvidia A100
Compare prices for Nvidia A100 across cloud providers
Updated Feb. 26, 2025
Launched in 2020, the Nvidia A100 is built on the Ampere architecture and targets large-scale AI workloads. Features like Multi-Instance GPU (MIG), which can partition a single card into up to seven isolated instances, allow flexible resource allocation and make it well suited to shared enterprise workloads.
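If you want to check from a rented instance whether MIG is actually enabled, the NVML bindings can report it. The snippet below is a minimal sketch using the nvidia-ml-py (pynvml) package; it assumes a recent driver and bindings that expose nvmlDeviceGetMigMode, and it simply skips the MIG query where that call isn't supported.

```python
# Minimal sketch: report GPU name, memory, and MIG mode via NVML.
# Assumes the nvidia-ml-py package (imported as pynvml) is installed and
# that the driver/bindings expose nvmlDeviceGetMigMode; the MIG query is
# skipped gracefully on GPUs or drivers that don't support it.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):          # older bindings return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i}: {name}, {mem.total / 1024**3:.0f} GiB")
        try:
            current, pending = pynvml.nvmlDeviceGetMigMode(handle)
            print(f"  MIG mode: current={current}, pending={pending}")
        except pynvml.NVMLError:
            print("  MIG mode not available on this device/driver")
finally:
    pynvml.nvmlShutdown()
```

On a MIG-enabled A100, each partition shows up as its own device, which is what makes the fractional sharing described above possible.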
| Provider | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
|---|---|---|---|---|---|---|
| -- | 1x A100 | 40GB | 15 | 200GB | $1.05 | Source |
| -- | 1x A100 | 40GB | 30 | 200GB | $1.29 | Launch |
| -- | 1x A100 | 40GB | 22 | 120GB | $1.29 | Launch |
| -- | 1x A100 | 80GB | 28 | 120GB | $1.35 | Launch |
| -- | 1x A100 | 40GB | 15 | 200GB | $1.42 | Source |
| -- | 1x A100 | 80GB | 30 | 225GB | $1.45 | Source |
| -- | 1x A100 | 40GB | 12 | 120GB | $1.45 | Launch |
| -- | 1x A100 | 80GB | 28 | 120GB | $1.47 | |
| -- | 1x A100 | 80GB | 8 | 117GB | $1.64 | Source |
| -- | 1x A100 | 40GB | 32 | 128GB | $1.65 | Source |
| -- | 1x A100 | 80GB | 12 | 120GB | $1.65 | Launch |
| -- | 1x A100 | 80GB | 12 | 64GB | $1.72 | Launch |
| -- | 1x A100 | 40GB | 8 | 64GB | $1.79 | Source |
| -- | 1x A100 | 80GB | 48 | 256GB | $1.80 | Source |
| -- | 1x A100 | 80GB | 12 | 48GB | $1.83 | Launch |
| -- | 1x A100 | 80GB | 16 | 125GB | $1.89 | Source |
| -- | 1x A100 | 80GB | 22 | 120GB | $1.89 | Launch |
| -- | 1x A100 | 80GB | 30 | 225GB | $1.97 | Source |
| -- | 1x A100 | 80GB | 12 | 128GB | $2.38 | Source |
| -- | 1x A100 | 40GB | -- | -- | $2.50 | Source |
| -- | 2x A100 | 80GB | 60 | 400GB | $2.58 | Launch |
| -- | 1x A100 | 80GB | 12 | 120GB | $2.60 | Launch |
| -- | 1x A100 | 80GB | 15 | 180GB | $2.70 | Source |
| -- | 2x A100 | 160GB | 28 | 120GB | $2.70 | Source |
| -- | 1x A100 | 80GB | 15 | 180GB | $3.07 | Source |
| -- | 1x A100 | 40GB | 8 | 90GB | $3.19 | Launch |
| -- | 1x A100 | 80GB | 8 | 90GB | $3.28 | Launch |
| -- | 1x A100 | 80GB | -- | -- | $3.50 | Source |
| -- | 2x A100 | 80GB | 16 | 128GB | $3.57 | Source |
| -- | 1x A100 | 40GB | 12 | 85GB | $3.67 | Source |
| -- | 2x A100 | 160GB | 44 | 240GB | $3.78 | Source |
| -- | 1x A100 | 40GB | 10 | 72GB | $4.14 | Source |
| -- | 2x A100 | 160GB | 24 | 256GB | $4.76 | Source |
| -- | 1x A100 | 80GB | 24 | 220GB | $4.78 | Source |
| -- | 1x A100 | 80GB | 10 | 144GB | $5.04 | Source |
| -- | 4x A100 | 160GB | 120 | 800GB | $5.16 | Launch |
| -- | 4x A100 | 160GB | 88 | 480GB | $5.16 | Source |
| -- | 4x A100 | 320GB | 28 | 120GB | $5.40 | Source |
| -- | 2x A100 | 160GB | 30 | 360GB | $6.15 | Source |
| -- | 4x A100 | 160GB | 32 | 255GB | $7.14 | Source |
| -- | 2x A100 | 80GB | 24 | 170GB | $7.35 | Source |
| -- | 4x A100 | 320GB | 88 | 480GB | $7.56 | Source |
| -- | 2x A100 | 80GB | 20 | 144GB | $8.28 | Source |
| -- | 4x A100 | 320GB | 48 | 512GB | $9.52 | Source |
| -- | 2x A100 | 160GB | 20 | 288GB | $10.08 | Source |
| -- | 8x A100 | 320GB | 124 | 1800GB | $10.32 | Launch |
| -- | 8x A100 | 640GB | 28 | 120GB | $10.80 | Source |
| -- | 8x A100 | 640GB | 31 | 240GB | $11.20 | Source |
| -- | 4x A100 | 320GB | 60 | 720GB | $12.29 | Source |
| -- | 8x A100 | 5120GB | 112 | 2048GB | $12.90 | Source |
| -- | 8x A100 | 320GB | 64 | 512GB | $14.29 | Source |
| -- | 4x A100 | 160GB | 48 | 340GB | $14.69 | Source |
| -- | 8x A100 | 640GB | 176 | 960GB | $15.12 | Source |
| -- | 4x A100 | 160GB | 40 | 288GB | $16.56 | Source |
| -- | 8x A100 | 640GB | 96 | 1024GB | $19.05 | Source |
| -- | 4x A100 | 320GB | 40 | 576GB | $20.16 | Source |
| -- | 8x A100 | 320GB | 96 | 900GB | $27.20 | Source |
| -- | 8x A100 | 320GB | 96 | 680GB | $29.39 | Source |
| -- | 8x A100 | 320GB | 96 | 1152GB | $32.77 | Source |
| -- | 8x A100 | 640GB | 80 | 960GB | $40.32 | Source |
| -- | 16x A100 | 640GB | 96 | 1360GB | $55.74 | Source |
| -- | 8x A100 | 640GB | -- | -- | On Request | Source |
Note: Prices are subject to change and may vary by region and other factors not listed here. For some GPUs, I include links to Shadeform (the sponsor) so you can check if they're available right now. I don’t earn a commission when you click on these links, but their monthly sponsorship helps me keep the site running.
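Because the instances above range from one to sixteen GPUs with different VRAM configurations, the raw hourly price is not directly comparable across rows. A rough way to compare offers is to normalize to price per GPU-hour and per GB of VRAM per hour. The sketch below illustrates this with a few rows hand-copied from the table; the Offer class and the particular rows chosen are just for illustration.

```python
# Sketch: normalize hourly prices to $/GPU-hour and $/GB-of-VRAM-hour so
# instances of different sizes from the table above can be compared.
# The rows below are a hand-copied subset of the table, for illustration only.
from dataclasses import dataclass

@dataclass
class Offer:
    gpus: int                # number of A100s in the instance
    total_vram_gb: int       # VRAM across all GPUs, as listed in the table
    price_per_hour: float    # on-demand price in USD

offers = [
    Offer(gpus=1, total_vram_gb=40, price_per_hour=1.05),
    Offer(gpus=1, total_vram_gb=80, price_per_hour=1.35),
    Offer(gpus=2, total_vram_gb=160, price_per_hour=2.70),
    Offer(gpus=8, total_vram_gb=640, price_per_hour=10.80),
]

for o in sorted(offers, key=lambda o: o.price_per_hour / o.gpus):
    per_gpu = o.price_per_hour / o.gpus
    per_gb = o.price_per_hour / o.total_vram_gb
    print(f"{o.gpus}x A100, {o.total_vram_gb}GB total: "
          f"${per_gpu:.2f}/GPU-hour, ${per_gb:.4f}/GB-hour")
```

Keep in mind that a per-GPU price ignores differences in interconnect (PCIe vs. SXM/NVLink), vCPUs, and RAM, so treat it as a first-pass filter rather than a full comparison.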
Nvidia A100 specs
| | A100 40GB PCIe | A100 80GB PCIe | A100 40GB SXM | A100 80GB SXM |
|---|---|---|---|---|
FP64 | 9.7 TFLOPS | 9.7 TFLOPS | 9.7 TFLOPS | 9.7 TFLOPS |
FP64 Tensor Core | 19.5 TFLOPS | 19.5 TFLOPS | 19.5 TFLOPS | 19.5 TFLOPS |
FP32 | 19.5 TFLOPS | 19.5 TFLOPS | 19.5 TFLOPS | 19.5 TFLOPS |
Tensor Float 32 (TF32) | 156 TFLOPS, 312 TFLOPS* | 156 TFLOPS, 312 TFLOPS* | 156 TFLOPS, 312 TFLOPS* | 156 TFLOPS, 312 TFLOPS* |
BFLOAT16 Tensor Core | 312 TFLOPS, 624 TFLOPS* | 312 TFLOPS, 624 TFLOPS* | 312 TFLOPS, 624 TFLOPS* | 312 TFLOPS, 624 TFLOPS* |
FP16 Tensor Core | 312 TFLOPS, 624 TFLOPS* | 312 TFLOPS, 624 TFLOPS* | 312 TFLOPS, 624 TFLOPS* | 312 TFLOPS, 624 TFLOPS* |
INT8 Tensor Core | 624 TOPS, 1248 TOPS* | 624 TOPS, 1248 TOPS* | 624 TOPS, 1248 TOPS* | 624 TOPS, 1248 TOPS* |
GPU Memory | 40GB HBM2 | 80GB HBM2e | 40GB HBM2 | 80GB HBM2e |
GPU Memory Bandwidth | 1,555GB/s | 1,935GB/s | 1,555GB/s | 2,039GB/s |
Max Thermal Design Power (TDP) | 250W | 300W | 400W | 400W |
Multi-Instance GPU | Up to 7 MIGs @ 5GB | Up to 7 MIGs @ 10GB | Up to 7 MIGs @ 5GB | Up to 7 MIGs @ 10GB |
Form Factor | PCIe | PCIe | SXM | SXM |
Interconnect | NVIDIA® NVLink® Bridge for 2 GPUs: 600GB/s**, PCIe Gen4: 64GB/s | NVIDIA® NVLink® Bridge for 2 GPUs: 600GB/s**, PCIe Gen4: 64GB/s | NVLink: 600GB/s, PCIe Gen4: 64GB/s | NVLink: 600GB/s, PCIe Gen4: 64GB/s |
Server Options | Partner and NVIDIA-Certified Systems™ with 1-8 GPUs | Partner and NVIDIA-Certified Systems™ with 1-8 GPUs | NVIDIA HGX™ A100-Partner and NVIDIA-Certified Systems with 4, 8, or 16 GPUs, NVIDIA DGX™ A100 with 8 GPUs | NVIDIA HGX™ A100-Partner and NVIDIA-Certified Systems with 4, 8, or 16 GPUs, NVIDIA DGX™ A100 with 8 GPUs |
* With sparsity
** SXM4 GPUs via HGX A100 server boards; PCIe GPUs via NVLink Bridge for up to two GPUs
Source: official Nvidia A100 datasheet.
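The throughput figures above also feed simple back-of-the-envelope estimates: divide a training run's total FLOP budget by the dense BF16 Tensor Core rate (312 TFLOPS) times an assumed utilization to get a rough wall-clock time, then multiply by a $/GPU-hour figure from the price table for an approximate cost. The sketch below does exactly that; the 40% utilization, the 1e21-FLOP budget, and the use of a single-GPU rate for an 8-GPU node are illustrative assumptions, not measurements.

```python
# Back-of-the-envelope sketch: estimate wall-clock time and cost for a
# training run using the datasheet throughput above. The 40% utilization
# factor, the 1e21-FLOP budget, and the $1.35/GPU-hour rate are assumptions
# for illustration, not measurements or guarantees.

PEAK_BF16_TFLOPS = 312       # dense BF16 Tensor Core rate from the spec table
ASSUMED_UTILIZATION = 0.40   # fraction of peak assumed to be sustained

def training_hours(total_flops: float, num_gpus: int) -> float:
    """Hours to run `total_flops` on `num_gpus` A100s at the assumed utilization."""
    sustained = num_gpus * PEAK_BF16_TFLOPS * 1e12 * ASSUMED_UTILIZATION  # FLOP/s
    return total_flops / sustained / 3600

flop_budget = 1e21           # hypothetical training budget
gpus = 8
hours = training_hours(flop_budget, gpus)
cost = hours * gpus * 1.35   # using a $1.35/GPU-hour figure from the price table
print(f"~{hours:.0f} hours ({hours / 24:.1f} days), roughly ${cost:,.0f}")
```

Under these assumptions the example run comes out to roughly 278 hours (about 11.6 days) on an 8x A100 node, or on the order of $3,000 at $1.35 per GPU-hour.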