Cloud GPU Price Comparison
March 14, 2025 (updated)
GPUs (lowest observed hourly price):
- Nvidia A4000 ($0.15+)
- Nvidia A30 ($0.22+)
- Nvidia V100 ($0.24+)
- Nvidia A5000 ($0.25+)
- Nvidia RTX 4000 ($0.31+)
- Nvidia A4500 ($0.34+)
- Nvidia A40 ($0.39+)
- Nvidia T4G ($0.42+)
- Nvidia M4000 ($0.45+)
- Nvidia A6000 ($0.50+)
- Nvidia RTX 6000 ($0.50+)
- Nvidia P4000 ($0.51+)
- Nvidia A16 ($0.51+)
- Nvidia T4 ($0.53+)
- Nvidia RTX 4090 ($0.69+)
- Nvidia L4 ($0.71+)
- Nvidia A10G ($0.75+)
- Nvidia M60 ($0.75+)
- Nvidia P5000 ($0.78+)
- Nvidia RTX 5000 ($0.82+)
- Nvidia L40S ($0.86+)
- Nvidia V100S ($0.88+)
- Nvidia K80 ($0.90+)
- Nvidia 3080ti ($0.99+)
- Nvidia A100 ($1.05+)
- Nvidia RTX 3070 ($1.06+)
- Nvidia P6000 ($1.10+)
- Nvidia P100 ($1.17+)
- Nvidia A10 ($1.50+)
- Nvidia H100 ($1.90+)
- Nvidia P40 ($2.07+)
- AMD MI250 ($2.10+)
- Nvidia H200 ($2.49+)
- AMD MI300X ($2.99+)
- Nvidia GH200 ($3.19+)
- Nvidia B200 (On Request)
- Nvidia B100 (On Request)
Providers (number of tracked configurations):
- Alibaba Cloud (3)
- Amazon Web Services (32)
- Build AI (6)
- CUDO Compute (14)
- Civo (14)
- Contabo (2)
- Crusoe (4)
- DataCrunch (17)
- DigitalOcean (2)
- Exoscale (18)
- FluidStack (8)
- Fly.io (4)
- Google Cloud (13)
- Green AI Cloud (5)
- Hetzner (2)
- Hyperstack (22)
- Koyeb (6)
- Lambda Labs (15)
- Linode (4)
- Massed Compute (7)
- Microsoft Azure (30)
- Nebius (6)
- OVHcloud (24)
- Oblivus (5)
- Oracle Cloud (1)
- Paperspace (14)
- Replicate (13)
- RunPod (17)
- Scaleway (12)
- TensorWave (1)
- The Cloud Minders (9)
- Vultr (12)
Note: Prices are subject to change and may vary by region and other factors not listed here. For some GPUs, I include links to Shadeform (the sponsor) so you can check if they're available right now. I don’t earn a commission when you click on these links, but their monthly sponsorship helps me keep the site running.
Which cloud has the most affordable GPUs?
If you're looking for the lowest price per hour, the cheapest offer in the data I collected is the Nvidia A4000 from Hyperstack at $0.15/h.
Out of the 342 price points I track, these are the 20 most affordable (price, ascending):
Provider | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
Hyperstack | 1x A4000 | 16GB | 6 | 24GB | $0.15 | Launch |
Oblivus | 1x A4000 | 16GB | 6 | 24GB | $0.20 | |
RunPod | 1x A30 | 24GB | 8 | 31GB | $0.22 | Source |
The Cloud Minders | 1x V100 | 16GB | 6 | 32GB | $0.24 | Source |
Hyperstack | 1x A5000 | 24GB | 8 | 24GB | $0.25 | Launch |
Hetzner | 1x RTX 4000 | 20GB | 14 | 64GB | $0.31 | Source |
RunPod | 1x A4000 | 16GB | 4 | 20GB | $0.32 | Source |
Massed Compute | 1x A30 | 24GB | 16 | 48GB | $0.33 | Launch |
RunPod | 1x A4500 | 20GB | 4 | 29GB | $0.34 | Source |
RunPod | 1x A5000 | 24GB | 4 | 24GB | $0.36 | Source |
DataCrunch | 1x V100 | 16GB | 6 | 23GB | $0.39 | Launch |
RunPod | 1x A40 | 48GB | 9 | 50GB | $0.39 | Source |
FluidStack | 1x A4000 | 16GB | 36 | 128GB | $0.40 | Source |
The Cloud Minders | 1x A4000 | 16GB | 5 | 32GB | $0.40 | Source |
Massed Compute | 1x A5000 | 24GB | 10 | 48GB | $0.41 | Launch |
Amazon Web Services | 1x T4G | 16GB | 4 | 16GB | $0.42 | Source |
Paperspace | 1x M4000 | 8GB | 8 | 30GB | $0.45 | Source |
CUDO Compute | 1x A4000 | 20GB | 4 | 16GB | $0.46 | Source |
CUDO Compute | 1x A4000 | 16GB | 6 | 24GB | $0.47 | Launch |
CUDO Compute | 1x A5000 | 24GB | 6 | 24GB | $0.49 | Launch |
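If you keep the price points in a structured form, producing a ranking like the one above is a one-line sort. A minimal sketch in Python; the entries are a small sample copied from the tables in this article, not the full dataset of 342 price points:

```python
# A few price points sampled from the tables in this article.
price_points = [
    {"provider": "Hyperstack", "gpu": "A4000", "vram_gb": 16, "price_h": 0.15},
    {"provider": "RunPod", "gpu": "A30", "vram_gb": 24, "price_h": 0.22},
    {"provider": "The Cloud Minders", "gpu": "V100", "vram_gb": 16, "price_h": 0.24},
    {"provider": "Hyperstack", "gpu": "A5000", "vram_gb": 24, "price_h": 0.25},
    {"provider": "Hetzner", "gpu": "RTX 4000", "vram_gb": 20, "price_h": 0.31},
]

def cheapest(points, n=3):
    """Return the n lowest-priced offers, ascending by hourly price."""
    return sorted(points, key=lambda p: p["price_h"])[:n]

for p in cheapest(price_points):
    print(f'{p["provider"]}: {p["gpu"]} at ${p["price_h"]:.2f}/h')
```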
Which GPU is best for AI training / inference?
The answer depends on many factors including your specific workload. But generally, you should consider:
- Tensor Cores
- Memory capacity and bandwidth
- Cache hierarchy
- FLOPS
For an in-depth guide on picking GPUs for AI/ML, check out this article by Tim Dettmers.
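One quick screening metric that combines two of the factors above is memory capacity per dollar. It ignores bandwidth, Tensor Core generation, and interconnect, so treat it as a first filter only. A sketch using the starting prices listed at the top of this page:

```python
# (gpu, vram_gb, lowest observed $/h from this page)
offers = [
    ("A4000", 16, 0.15),
    ("A100 40GB", 40, 1.05),
    ("H100 80GB", 80, 1.90),
    ("H200 141GB", 141, 2.49),
]

# Dollars per GB of VRAM per hour -- lower means more memory for the money.
for gpu, vram, price in sorted(offers, key=lambda o: o[2] / o[1]):
    print(f"{gpu}: ${price / vram:.4f} per GB-hour")
```

Note how the budget A4000 wins on this metric despite losing badly on compute and bandwidth, which is exactly why it should only be a first filter.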
Example GPUs by use case
As a starting point, here's my (very broad) attempt at categorizing GPUs for AI workloads:
Category | Examples | Best for | Performance profile |
---|---|---|---|
High-end | A100, H100, H200, GH200, MI250, MI300X | Training large language models, high-performance computing, large-scale inference | Highest memory capacity, memory bandwidth, and compute performance |
Mid-range | A40, A30, A6000, V100, L4, T4 | Medium-sized model training, inference, and fine-tuning tasks | Higher memory and compute, cost-effective for many AI tasks |
Budget | K80, M60, P100, A4000 | Small model training, experiments, low-cost inference | Moderate memory, older architectures (K80, M60), lower cost |
Consumer | RTX3070, RTX4090, RTX4000 | Gaming, content creation, and mid-range AI tasks | High compute for gaming and development, Tensor Cores make RTX cards suitable for entry-level AI inference |
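If you want to tag price points with these categories programmatically, a simple lookup is enough; the membership sets below are copied from the table above, and anything outside them falls through to a default:

```python
# Tier membership copied from the categorization table above.
TIERS = {
    "High-end": {"A100", "H100", "H200", "GH200", "MI250", "MI300X"},
    "Mid-range": {"A40", "A30", "A6000", "V100", "L4", "T4"},
    "Budget": {"K80", "M60", "P100", "A4000"},
    "Consumer": {"RTX3070", "RTX4090", "RTX4000"},
}

def tier_of(gpu: str) -> str:
    """Map a GPU model name to its (very broad) category."""
    for tier, members in TIERS.items():
        if gpu in members:
            return tier
    return "Uncategorized"

print(tier_of("H100"))
print(tier_of("A4000"))
```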
Next, I'll provide a brief overview of each cloud provider and some of the GPUs they offer (342 price points).
RunPod
RunPod is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
A30 | 1x A30 | 24GB | 8 | 31GB | $0.22 | Source |
RTX A4000 | 1x A4000 | 16GB | 4 | 20GB | $0.32 | Source |
A4500 | 1x A4500 | 20GB | 4 | 29GB | $0.34 | Source |
A5000 | 1x A5000 | 24GB | 4 | 24GB | $0.36 | Source |
A40 | 1x A40 | 48GB | 9 | 50GB | $0.39 | Source |
RTX4090 | 1x RTX 4090 | 24GB | 5 | 30GB | $0.69 | Source |
A6000 | 1x A6000 | 48GB | 8 | 62GB | $0.76 | Source |
L40S | 1x L40S | 48GB | 12 | 62GB | $0.86 | Source |
A6000 Ada | 1x A6000 | 48GB | 14 | 58GB | $0.88 | Source |
L40 | 1x L40S | 48GB | 8 | 94GB | $0.99 | Source |
A100 PCIe | 1x A100 | 80GB | 8 | 117GB | $1.64 | Source |
A100 SXM | 1x A100 | 80GB | 16 | 125GB | $1.89 | Source |
MI250 | 1x MI250 | 128GB | -- | -- | $2.10 | Source |
H100 PCIe | 1x H100 | 80GB | 16 | 188GB | $2.39 | Source |
H100 SXM | 1x H100 | 80GB | 16 | 125GB | $2.99 | Source |
MI300X | 1x MI300X | 192GB | 24 | 283GB | $2.99 | Source |
H200 SXM | 1x H200 | 141GB | -- | -- | $3.99 | Source |
Amazon Web Services
Amazon Web Services is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
g5g.xlarge | 1x T4G | 16GB | 4 | 16GB | $0.42 | Source |
g4dn.xlarge | 1x T4 | 16GB | 4 | 16GB | $0.53 | Source |
g5g.2xlarge | 1x T4G | 16GB | 8 | 16GB | $0.56 | Source |
g4dn.2xlarge | 1x T4 | 16GB | 8 | 32GB | $0.75 | Source |
g3s.xlarge | 1x M60 | 8GB | 4 | 30GB | $0.75 | Source |
g5g.4xlarge | 1x T4G | 16GB | 16 | 16GB | $0.83 | Source |
p2.xlarge | 1x K80 | 12GB | 4 | 61GB | $0.90 | Source |
g5.xlarge | 1x A10G | 24GB | 4 | 16GB | $1.01 | Source |
g3.4xlarge | 1x M60 | 8GB | 16 | 122GB | $1.14 | Source |
g4dn.4xlarge | 1x T4 | 16GB | 16 | 64GB | $1.20 | Source |
g5.2xlarge | 1x A10G | 24GB | 8 | 32GB | $1.21 | Source |
g5g.8xlarge | 1x T4G | 16GB | 32 | 16GB | $1.37 | Source |
g5.4xlarge | 1x A10G | 24GB | 16 | 64GB | $1.63 | Source |
g4dn.8xlarge | 1x T4 | 16GB | 32 | 128GB | $2.18 | Source |
g3.8xlarge | 2x M60 | 16GB | 32 | 244GB | $2.28 | Source |
g5.8xlarge | 1x A10G | 24GB | 32 | 128GB | $2.45 | Source |
g5g.16xlarge | 2x T4G | 32GB | 64 | 32GB | $2.74 | Source |
p3.2xlarge | 1x V100 | 16GB | 8 | 61GB | $3.06 | Source |
g4dn.12xlarge | 4x T4 | 64GB | 48 | 192GB | $3.92 | Source |
g5.12xlarge | 4x A10G | 96GB | 48 | 192GB | $4.10 | Source |
g4dn.16xlarge | 1x T4 | 16GB | 64 | 256GB | $4.35 | Source |
g3.16xlarge | 4x M60 | 32GB | 64 | 488GB | $4.56 | Source |
g5.12xlarge | 4x A10G | 96GB | 48 | 192GB | $5.68 | Source |
p2.8xlarge | 8x K80 | 96GB | 32 | 488GB | $7.20 | Source |
g4dn.metal | 8x T4 | 128GB | 96 | 384GB | $7.83 | Source |
g5.24xlarge | 4x A10G | 96GB | 96 | 384GB | $8.14 | Source |
p3.8xlarge | 4x V100 | 64GB | 32 | 244GB | $12.24 | Source |
p2.16xlarge | 16x K80 | 192GB | 64 | 732GB | $14.40 | Source |
g5.48xlarge | 8x A10G | 192GB | 192 | 768GB | $16.29 | Source |
p3.16xlarge | 8x V100 | 128GB | 64 | 488GB | $24.48 | Source |
p4d.24xlarge | 8x A100 | 320GB | 96 | 1152GB | $32.77 | Source |
p5.48xlarge | 8x H100 | 640GB | 192 | 2048GB | $98.32 | Source |
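For multi-GPU instances like the ones above, dividing the instance price by the GPU count gives a rough per-GPU rate, which makes large instances easier to compare against single-GPU offers elsewhere (it ignores the CPU/RAM share, so it slightly flatters big instances). A sketch using two AWS rows from the table:

```python
def per_gpu_rate(instance_price_h: float, gpu_count: int) -> float:
    """Hourly instance price divided across its GPUs (ignores CPU/RAM share)."""
    return instance_price_h / gpu_count

# p4d.24xlarge: 8x A100 at $32.77/h -> ~$4.10 per A100-hour
print(round(per_gpu_rate(32.77, 8), 2))
# p5.48xlarge: 8x H100 at $98.32/h -> ~$12.29 per H100-hour
print(round(per_gpu_rate(98.32, 8), 2))
```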
Build AI
Build AI is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
A100 (spot) | 1x A100 | 40GB | 15 | 200GB | $1.05 | Source |
A100 | 1x A100 | 40GB | 15 | 200GB | $1.42 | Source |
A100 (spot) | 1x A100 | 80GB | 30 | 225GB | $1.45 | Source |
A100 | 1x A100 | 80GB | 30 | 225GB | $1.97 | Source |
H100 (spot) | 1x H100 | 80GB | 26 | 225GB | $2.79 | Source |
H100 | 1x H100 | 80GB | 26 | 225GB | $3.85 | Source |
CUDO Compute
CUDO Compute is based in UK 🇬🇧 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
RTX A4000 Ada | 1x A4000 | 20GB | 4 | 16GB | $0.46 | Source |
RTX A4000 | 1x A4000 | 16GB | 6 | 24GB | $0.47 | Launch |
RTX A5000 | 1x A5000 | 24GB | 6 | 24GB | $0.49 | Launch |
A40 | 1x A40 | 48GB | 6 | 24GB | $0.53 | Launch |
V100 | 1x V100 | 16GB | 4 | 16GB | $0.54 | Source |
RTX A6000 | 1x A6000 | 48GB | 6 | 24GB | $0.59 | Launch |
L40S | 1x L40S | 48GB | 12 | 48GB | $1.75 | Launch |
A100 PCIe | 1x A100 | 80GB | 12 | 48GB | $1.83 | Launch |
H200 | 1x H200 | 141GB | -- | -- | $2.49 | Source |
H100 NVL | 1x H100 | 94GB | 4 | 16GB | $2.56 | Source |
H100 SXM | 1x H100 | 80GB | 12 | 48GB | $3.18 | Launch |
MI300 | 1x MI300X | 192GB | -- | -- | On Request | Source |
MI250 | 1x MI250 | 128GB | -- | -- | On Request | Source |
B100 | 1x B100 | 192GB | -- | -- | On Request | Source |
Civo
Civo is based in UK 🇬🇧 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
A100-40 Small | 1x A100 | 40GB | 8 | 64GB | $1.79 | Source |
L40S Medium | 1x L40S | 48GB | 8 | 64GB | $1.79 | Source |
A100-80 Small | 1x A100 | 80GB | 12 | 128GB | $2.38 | Source |
A100-40 Medium | 2x A100 | 80GB | 16 | 128GB | $3.57 | Source |
L40S Medium (2x) | 2x L40S | 96GB | 16 | 128GB | $3.57 | Source |
A100-80 Medium | 2x A100 | 160GB | 24 | 256GB | $4.76 | Source |
A100-40 Large | 4x A100 | 160GB | 32 | 255GB | $7.14 | Source |
L40S Large | 4x L40S | 192GB | 32 | 255GB | $7.14 | Source |
A100-80 Large | 4x A100 | 320GB | 48 | 512GB | $9.52 | Source |
A100-40 Extra Large | 8x A100 | 320GB | 64 | 512GB | $14.29 | Source |
L40S Extra Large | 8x L40S | 384GB | 64 | 512GB | $14.29 | Source |
A100-80 Extra Large | 8x A100 | 640GB | 96 | 1024GB | $19.05 | Source |
H100 | 1x H100 | 80GB | -- | -- | On Request | Source |
GH200 | 1x GH200 | 96GB | -- | -- | On Request | Source |
Contabo
Contabo is based in Germany 🇩🇪 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
L40S | 4x L40S | 192GB | 64 | 512GB | $4.29 | Source |
H100 | 4x H100 | 320GB | 64 | 512GB | $10.51 | Source |
Crusoe
Crusoe is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
A40 | 1x A40 | 48GB | 6 | 60GB | $1.10 | Launch |
A100 | 1x A100 | 40GB | 12 | 120GB | $1.45 | Launch |
L40S | 1x L40S | 48GB | 8 | 147GB | $1.45 | Launch |
A100 80GB | 1x A100 | 80GB | 12 | 120GB | $1.65 | Launch |
Crusoe instances can be scaled up to 10x the base configuration.
DataCrunch
DataCrunch is based in Finland 🇫🇮 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
Tesla V100 16GB | 1x V100 | 16GB | 6 | 23GB | $0.39 | Launch |
RTX A6000 48GB | 1x A6000 | 48GB | 10 | 60GB | $1.01 | Launch |
L40S | 1x L40S | 48GB | 20 | 60GB | $1.10 | Source |
RTX 6000 Ada 48GB | 1x RTX 6000 | 48GB | 10 | 60GB | $1.19 | Source |
A100 SXM4 40GB | 1x A100 | 40GB | 22 | 120GB | $1.29 | Launch |
A100 SXM4 80GB | 1x A100 | 80GB | 22 | 120GB | $1.89 | Launch |
H100 SXM5 80GB | 1x H100 | 80GB | 30 | 120GB | $2.65 | Launch |
H200 SXM5 141GB | 1x H200 | 141GB | 44 | 185GB | $3.03 | Launch |
A100 SXM4 80GB | 2x A100 | 160GB | 44 | 240GB | $3.78 | Source |
A100 SXM4 40GB | 4x A100 | 160GB | 88 | 480GB | $5.16 | Source |
H100 SXM5 80GB | 2x H100 | 160GB | 80 | 370GB | $5.30 | Source |
H200 SXM5 141GB | 2x H200 | 282GB | 88 | 370GB | $6.06 | Source |
A100 SXM4 80GB | 4x A100 | 320GB | 88 | 480GB | $7.56 | Source |
H100 SXM5 80GB | 4x H100 | 320GB | 176 | 740GB | $10.60 | Source |
H200 SXM5 141GB | 4x H200 | 564GB | 176 | 740GB | $12.12 | Source |
A100 SXM4 80GB | 8x A100 | 640GB | 176 | 960GB | $15.12 | Source |
H200 SXM5 141GB | 8x H200 | 1128GB | 176 | 1450GB | $24.24 | Source |
DataCrunch instances can be scaled by 2x, 4x, or 8x of the base configuration.
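The listed DataCrunch prices scale linearly with GPU count, so the 2x/4x/8x rates can be derived from the 1x rate. A quick check against the H200 rows in the table above:

```python
base = 3.03  # 1x H200 SXM5 141GB, $/h (from the DataCrunch table above)

# (gpu_count, listed $/h) pairs from the same table.
for n, listed in [(2, 6.06), (4, 12.12), (8, 24.24)]:
    derived = round(base * n, 2)
    assert derived == listed, (n, derived, listed)
    print(f"{n}x H200: ${derived}/h")
```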
Exoscale
Exoscale is based in Switzerland 🇨🇭 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
3080ti Small | 1x 3080ti | 12GB | 12 | 56GB | $0.99 | Source |
P100 Small | 1x P100 | 16GB | 12 | 56GB | $1.17 | Source |
V100 Small | 1x V100 | 16GB | 12 | 56GB | $1.38 | Source |
A5000 Small | 1x A5000 | 24GB | 12 | 56GB | $1.45 | Source |
P100 Medium | 2x P100 | 32GB | 16 | 90GB | $1.71 | Source |
3080ti Medium | 2x 3080ti | 24GB | 24 | 112GB | $1.84 | Source |
V100 Medium | 2x V100 | 32GB | 16 | 90GB | $2.01 | Source |
A40 Small | 1x A40 | 48GB | 12 | 56GB | $2.14 | Source |
P100 Large | 3x P100 | 48GB | 24 | 120GB | $2.25 | Source |
V100 Large | 3x V100 | 48GB | 24 | 120GB | $2.65 | Source |
A5000 Medium | 2x A5000 | 48GB | 24 | 112GB | $2.68 | Source |
P100 Huge | 4x P100 | 64GB | 48 | 225GB | $2.82 | Source |
V100 Huge | 4x V100 | 64GB | 48 | 225GB | $3.32 | Source |
3080ti Large | 4x 3080ti | 48GB | 48 | 224GB | $3.40 | Source |
A40 Medium | 2x A40 | 96GB | 24 | 120GB | $4.27 | Source |
A5000 Large | 4x A5000 | 96GB | 48 | 224GB | $4.93 | Source |
A40 Large | 4x A40 | 192GB | 48 | 224GB | $8.54 | Source |
A40 Huge | 8x A40 | 384GB | 96 | 448GB | $17.06 | Source |
FluidStack
FluidStack is based in UK 🇬🇧 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
Nvidia A4000 | 1x A4000 | 16GB | 36 | 128GB | $0.40 | Source |
Nvidia A5000 | 1x A5000 | 24GB | 36 | 128GB | $0.55 | Source |
Nvidia A40 | 1x A40 | 48GB | 32 | 128GB | $0.60 | Source |
Nvidia A6000 | 1x A6000 | 48GB | 48 | 128GB | $0.80 | Source |
Nvidia L40 | 1x L40S | 48GB | 32 | 48GB | $1.25 | Source |
Nvidia A100 PCIe 40GB | 1x A100 | 40GB | 32 | 128GB | $1.65 | Source |
Nvidia A100 PCIe 80GB | 1x A100 | 80GB | 48 | 256GB | $1.80 | Source |
Nvidia H100 PCIe | 1x H100 | 80GB | 48 | 256GB | $2.89 | Source |
Google Cloud
Google Cloud is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
g2-standard-4 | 1x L4 | 24GB | 4 | 16GB | $0.71 | Source |
g2-standard-8 | 1x L4 | 24GB | 8 | 32GB | $0.85 | Source |
g2-standard-12 | 1x L4 | 24GB | 12 | 48GB | $1.00 | Source |
g2-standard-16 | 1x L4 | 24GB | 16 | 64GB | $1.15 | Source |
g2-standard-32 | 1x L4 | 24GB | 32 | 128GB | $1.73 | Source |
g2-standard-24 | 2x L4 | 48GB | 24 | 96GB | $2.00 | Source |
a2-highgpu-1g | 1x A100 | 40GB | 12 | 85GB | $3.67 | Source |
g2-standard-48 | 4x L4 | 96GB | 48 | 192GB | $4.00 | Source |
a2-highgpu-2g | 2x A100 | 80GB | 24 | 170GB | $7.35 | Source |
g2-standard-96 | 8x L4 | 192GB | 96 | 384GB | $8.00 | Source |
a2-highgpu-4g | 4x A100 | 160GB | 48 | 340GB | $14.69 | Source |
a2-highgpu-8g | 8x A100 | 320GB | 96 | 680GB | $29.39 | Source |
a2-megagpu-16g | 16x A100 | 640GB | 96 | 1360GB | $55.74 | Source |
Green AI Cloud
Green AI Cloud is based in Sweden 🇸🇪 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
H200 | 1x H200 | 141GB | 112 | 2048GB | $2.75 | Source |
H200 | 8x H200 | 1128GB | 112 | 2048GB | $22.00 | Source |
H100 | 8x H100 | 640GB | -- | -- | On Request | Source |
A100 | 8x A100 | 640GB | -- | -- | On Request | Source |
B200 SXM6 | 8x B200 | 1536GB | 96 | 2048GB | On Request | Source |
Hetzner
Hetzner is based in Germany 🇩🇪 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
GEX44 | 1x RTX 4000 | 20GB | 14 | 64GB | $0.31 | Source |
GEX130 | 1x RTX 6000 | 48GB | 24 | 128GB | $1.45 | Source |
Hetzner does not currently offer on-demand GPU instances, but some of their dedicated servers come with GPUs.
For these, a one-time setup fee of €94.01 applies in addition to the hourly rate.
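A one-time setup fee matters less the longer you keep the server, and folding it into an effective hourly rate makes the comparison with pure on-demand pricing fair. A sketch using the GEX44 rate from the table above (the fee is billed in EUR while the rate here is in USD, so treat the result as approximate):

```python
def effective_hourly(rate_h: float, setup_fee: float, hours: float) -> float:
    """On-demand-equivalent hourly cost once a one-time fee is spread over usage."""
    return rate_h + setup_fee / hours

# GEX44 at $0.31/h plus the 94.01 setup fee, amortized over 1 month vs 1 year:
for hours in (24 * 30, 24 * 365):
    print(f"{hours} h: {effective_hourly(0.31, 94.01, hours):.3f}/h")
```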
Hyperstack
Hyperstack is based in UK 🇬🇧 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
NVIDIA A4000 | 1x A4000 | 16GB | 6 | 24GB | $0.15 | Launch |
NVIDIA A5000 | 1x A5000 | 24GB | 8 | 24GB | $0.25 | Launch |
NVIDIA A6000 | 1x A6000 | 48GB | 28 | 58GB | $0.50 | Launch |
NVIDIA A40 | 1x A40 | 48GB | 28 | 58GB | $0.50 | Source |
NVIDIA L40 | 1x L40S | 48GB | 28 | 58GB | $1.00 | Launch |
NVIDIA L40 x2 | 2x L40S | 96GB | 28 | 58GB | $1.00 | Source |
NVIDIA A6000 x2 | 2x A6000 | 96GB | 28 | 58GB | $1.00 | Source |
NVIDIA A100 PCIe 80GB | 1x A100 | 80GB | 28 | 120GB | $1.35 | Launch |
NVIDIA H100 PCIe 80GB | 1x H100 | 80GB | 28 | 180GB | $1.90 | Launch |
NVIDIA A6000 x4 | 4x A6000 | 192GB | 28 | 58GB | $2.00 | Source |
NVIDIA A100 PCIe 80GB x2 | 2x A100 | 160GB | 28 | 120GB | $2.70 | Source |
NVIDIA H100 PCIe 80GB x2 | 2x H100 | 160GB | 28 | 180GB | $3.80 | Source |
NVIDIA L40 x4 | 4x L40S | 192GB | 28 | 58GB | $4.00 | Source |
NVIDIA A6000 x8 | 8x A6000 | 384GB | 28 | 58GB | $4.00 | Source |
NVIDIA A100 PCIe 80GB x4 | 4x A100 | 320GB | 28 | 120GB | $5.40 | Source |
NVIDIA H100 PCIe 80GB x4 | 4x H100 | 320GB | 28 | 180GB | $7.60 | Source |
NVIDIA L40 x8 | 8x L40S | 384GB | 28 | 58GB | $8.00 | Source |
NVIDIA A100 PCIe 80GB x8 | 8x A100 | 640GB | 28 | 120GB | $10.80 | Source |
NVIDIA A100 80GB PCIe NVLink | 8x A100 | 640GB | 31 | 240GB | $11.20 | Source |
NVIDIA H100 PCIe 80GB x8 | 8x H100 | 640GB | 28 | 180GB | $15.20 | Source |
NVIDIA H100 PCIe NVLink 80GB x8 | 8x H100 | 640GB | 31 | 180GB | $15.60 | Source |
NVIDIA H100 SXM 80GB x8 | 8x H100 | 640GB | 24 | 240GB | $19.20 | Source |
Koyeb
Koyeb is based in France 🇫🇷 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
RTX4000-SFF-ADA | 1x RTX 4000 | 20GB | 6 | 44GB | $0.50 | Source |
V100-SXM2 | 1x V100 | 16GB | 8 | 44GB | $0.85 | Source |
L4 | 1x L4 | 24GB | 15 | 44GB | $1.00 | Source |
L40S | 1x L40S | 48GB | 30 | 92GB | $2.00 | Source |
A100 | 1x A100 | 80GB | 15 | 180GB | $2.70 | Source |
H100 | 1x H100 | 80GB | 15 | 180GB | $3.30 | Source |
Lambda Labs
Lambda Labs is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
1x RTX6000 | 1x RTX 6000 | 24GB | 14 | 46GB | $0.50 | Launch |
1x A10 | 1x A10G | 24GB | 30 | 200GB | $0.75 | Launch |
1x A6000 | 1x A6000 | 48GB | 14 | 100GB | $0.80 | Launch |
1x A100 SXM | 1x A100 | 40GB | 30 | 200GB | $1.29 | Launch |
2x A6000 | 2x A6000 | 96GB | 28 | 200GB | $1.60 | Launch |
1x H100 PCIe | 1x H100 | 80GB | 26 | 200GB | $2.49 | Launch |
2x A100 | 2x A100 | 80GB | 60 | 400GB | $2.58 | Launch |
Nvidia GH200 | 1x GH200 | 96GB | 64 | 432GB | $3.19 | Source |
4x A6000 | 4x A6000 | 192GB | 56 | 400GB | $3.20 | Launch |
8x V100 | 8x V100 | 128GB | 92 | 448GB | $4.40 | Launch |
4x A100 | 4x A100 | 160GB | 120 | 800GB | $5.16 | Launch |
8x A100 | 8x A100 | 320GB | 124 | 1800GB | $10.32 | Launch |
8x H100 SXM | 8x H100 | 640GB | 208 | 1800GB | $23.92 | Launch |
Nvidia H200 SXM | 8x H200 | 1128GB | 224 | -- | On Request | Source |
B200 | 8x B200 | 1440GB | 224 | -- | On Request | Source |
Massed Compute
Massed Compute is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
A30 | 1x A30 | 24GB | 16 | 48GB | $0.33 | Launch |
A5000 | 1x A5000 | 24GB | 10 | 48GB | $0.41 | Launch |
A6000 | 1x A6000 | 48GB | 6 | 48GB | $0.57 | Launch |
RTX6000 Ada | 1x RTX 6000 | 48GB | 12 | 64GB | $0.97 | Launch |
L40S | 1x L40S | 48GB | 22 | 128GB | $1.10 | Launch |
A100 80GB | 1x A100 | 80GB | 12 | 64GB | $1.72 | Launch |
H100 | 1x H100 | 80GB | 20 | 128GB | $2.98 | Launch |
Massed Compute instances can be scaled up to 8x the base configuration.
Microsoft Azure
Microsoft Azure is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
NC4as T4 v3 | 1x T4 | 16GB | 4 | 28GB | $0.53 | Source |
NC8as T4 v3 | 1x T4 | 16GB | 8 | 56GB | $0.75 | Source |
NC6 | 1x K80 | 12GB | 6 | 56GB | $0.90 | Source |
NV6 | 1x M60 | 8GB | 6 | 56GB | $1.14 | Source |
NV12s v3 | 1x M60 | 8GB | 12 | 112GB | $1.14 | Source |
NC16as T4 v3 | 1x T4 | 16GB | 16 | 110GB | $1.20 | Source |
NC12 | 2x K80 | 24GB | 12 | 112GB | $1.80 | Source |
ND6s | 1x P40 | 24GB | 6 | 112GB | $2.07 | Source |
NC6s v2 | 1x P100 | 16GB | 6 | 112GB | $2.07 | Source |
NV12 | 2x M60 | 16GB | 12 | 112GB | $2.28 | Source |
NV24s v3 | 2x M60 | 16GB | 24 | 224GB | $2.28 | Source |
NC6s v3 | 1x V100 | 16GB | 6 | 112GB | $3.06 | Source |
NC24 | 4x K80 | 48GB | 24 | 224GB | $3.60 | Source |
NC24r | 4x K80 | 48GB | 24 | 224GB | $3.96 | Source |
ND12s | 2x P40 | 48GB | 12 | 224GB | $4.14 | Source |
NC12s v2 | 2x P100 | 32GB | 12 | 224GB | $4.14 | Source |
NC64as T4 v3 | 4x T4 | 64GB | 64 | 440GB | $4.35 | Source |
NV48s v3 | 4x M60 | 32GB | 48 | 448GB | $4.56 | Source |
NV24 | 4x M60 | 32GB | 24 | 224GB | $4.56 | Source |
NC24ads A100 v4 | 1x A100 | 80GB | 24 | 220GB | $4.78 | Source |
NC12s v3 | 2x V100 | 32GB | 12 | 224GB | $6.12 | Source |
ND24s | 4x P40 | 96GB | 24 | 448GB | $8.28 | Source |
NC24s v2 | 4x P100 | 64GB | 24 | 448GB | $8.28 | Source |
NC40ads H100 v5 | 1x H100 | 80GB | 40 | 320GB | $8.82 | Source |
ND24rs | 4x P40 | 96GB | 24 | 448GB | $9.11 | Source |
NC24rs v2 | 4x P100 | 64GB | 24 | 448GB | $9.11 | Source |
NC24s v3 | 4x V100 | 64GB | 24 | 448GB | $12.24 | Source |
NC24rs v3 | 4x V100 | 64GB | 24 | 448GB | $13.46 | Source |
ND40rs v2 | 8x V100 | 256GB | 40 | 672GB | $22.03 | Source |
ND96asr A100 v4 | 8x A100 | 320GB | 96 | 900GB | $27.20 | Source |
Nebius
Nebius is based in Netherlands 🇳🇱 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
L40S PCIe | 1x L40S | 48GB | 8 | 32GB | $1.58 | Source |
H100 SXM | 1x H100 | 80GB | 20 | 160GB | $3.55 | Launch |
H100 SXM | 2x H100 | 160GB | 40 | 320GB | $7.10 | Source |
H100 SXM | 4x H100 | 320GB | 80 | 640GB | $14.20 | Source |
H200 | 8x H200 | 1128GB | -- | -- | $20.72 | Source |
H100 SXM | 8x H100 | 640GB | 160 | 1280GB | $28.39 | Launch |
OVHcloud
OVHcloud is based in France 🇫🇷 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
t1-le-45 | 1x V100 | 16GB | 8 | 45GB | $0.77 | Source |
t2-le-45 | 1x V100S | 32GB | 15 | 45GB | $0.88 | Source |
l4-90 | 1x L4 | 24GB | 22 | 90GB | $1.00 | Source |
t1-le-90 | 2x V100 | 32GB | 16 | 90GB | $1.55 | Source |
t2-le-90 | 2x V100S | 64GB | 30 | 90GB | $1.76 | Source |
l40s-90 | 1x L40S | 48GB | 15 | 90GB | $1.80 | Source |
t1-45 | 1x V100 | 16GB | 8 | 45GB | $1.97 | Source |
l4-180 | 2x L4 | 48GB | 45 | 180GB | $2.00 | Source |
t2-45 | 1x V100S | 32GB | 15 | 45GB | $2.19 | Source |
h100-380 | 1x H100 | 80GB | 30 | 380GB | $2.99 | Source |
a100-180 | 1x A100 | 80GB | 15 | 180GB | $3.07 | Source |
t1-le-180 | 4x V100 | 64GB | 32 | 180GB | $3.10 | Source |
t2-le-180 | 4x V100S | 128GB | 60 | 180GB | $3.53 | Source |
l40s-180 | 2x L40S | 96GB | 30 | 180GB | $3.60 | Source |
t1-90 | 2x V100 | 32GB | 18 | 90GB | $3.94 | Source |
l4-360 | 4x L4 | 96GB | 90 | 360GB | $4.00 | Source |
t2-90 | 2x V100S | 64GB | 30 | 90GB | $4.38 | Source |
h100-760 | 2x H100 | 160GB | 60 | 760GB | $5.98 | Source |
a100-360 | 2x A100 | 160GB | 30 | 360GB | $6.15 | Source |
l40s-360 | 4x L40S | 192GB | 60 | 360GB | $7.20 | Source |
t1-180 | 4x V100 | 64GB | 36 | 180GB | $7.89 | Source |
t2-180 | 4x V100S | 128GB | 60 | 180GB | $8.76 | Source |
h100-1520 | 4x H100 | 320GB | 120 | 1520GB | $11.97 | Source |
a100-720 | 4x A100 | 320GB | 60 | 720GB | $12.29 | Source |
Oblivus
Oblivus is based in UK 🇬🇧 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
A4000 | 1x A4000 | 16GB | 6 | 24GB | $0.20 | |
A5000 | 1x A5000 | 24GB | 8 | 30GB | $0.50 | |
A6000 | 1x A6000 | 48GB | 16 | 60GB | $0.55 | |
A100 80GB PCIe | 1x A100 | 80GB | 28 | 120GB | $1.47 | |
H100 PCIe | 1x H100 | 80GB | 28 | 180GB | $1.98 | |
Oblivus instances can be scaled up to 8x GPUs per instance.
Paperspace
Paperspace is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
GPU+ (M4000) | 1x M4000 | 8GB | 8 | 30GB | $0.45 | Source |
P4000 | 1x P4000 | 8GB | 8 | 30GB | $0.51 | Source |
RTX4000 | 1x RTX 4000 | 24GB | 8 | 30GB | $0.56 | Source |
P5000 | 1x P5000 | 16GB | 8 | 30GB | $0.78 | Source |
A4000 | 1x A4000 | 16GB | 8 | 45GB | $0.80 | Launch |
RTX5000 | 1x RTX 5000 | 16GB | 8 | 30GB | $0.82 | Source |
P6000 | 1x P6000 | 24GB | 8 | 30GB | $1.10 | Source |
A5000 | 1x A5000 | 24GB | 8 | 45GB | $1.42 | Launch |
A6000 | 1x A6000 | 48GB | 8 | 45GB | $1.93 | Launch |
V100-32G | 1x V100 | 32GB | 8 | 30GB | $2.34 | Launch |
V100 | 1x V100 | 16GB | 8 | 30GB | $2.34 | Launch |
A100 | 1x A100 | 40GB | 8 | 90GB | $3.19 | Launch |
A100-80G | 1x A100 | 80GB | 8 | 90GB | $3.28 | Launch |
H100 PCIe | 1x H100 | 80GB | 16 | 268GB | $5.99 | Launch |
Replicate
Replicate is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
Nvidia T4 | 1x T4 | 16GB | 4 | 16GB | $0.81 | Source |
Nvidia A40 (Small) | 1x A40 | 48GB | 4 | 16GB | $2.07 | Source |
Nvidia A40 (Large) | 1x A40 | 48GB | 10 | 72GB | $2.61 | Source |
Nvidia A100 (40GB) | 1x A100 | 40GB | 10 | 72GB | $4.14 | Source |
Nvidia A100 (80GB) | 1x A100 | 80GB | 10 | 144GB | $5.04 | Source |
2x Nvidia A40 (Large) | 2x A40 | 96GB | 20 | 144GB | $5.22 | Source |
2x Nvidia A100 (40GB) | 2x A100 | 80GB | 20 | 144GB | $8.28 | Source |
2x Nvidia A100 (80GB) | 2x A100 | 160GB | 20 | 288GB | $10.08 | Source |
4x Nvidia A40 (Large) | 4x A40 | 192GB | 40 | 288GB | $10.44 | Source |
4x Nvidia A100 (40GB) | 4x A100 | 160GB | 40 | 288GB | $16.56 | Source |
4x Nvidia A100 (80GB) | 4x A100 | 320GB | 40 | 576GB | $20.16 | Source |
8x Nvidia A40 (Large) | 8x A40 | 384GB | 48 | 680GB | $20.88 | Source |
8x Nvidia A100 (80GB) | 8x A100 | 640GB | 80 | 960GB | $40.32 | Source |
Scaleway
Scaleway is based in France 🇫🇷 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
L4-1-24G | 1x L4 | 24GB | 8 | 48GB | $0.81 | Launch |
GPU-3070 | 1x RTX 3070 | 8GB | 8 | 16GB | $1.06 | Source |
Render-S | 1x P100 | 16GB | 10 | 42GB | $1.34 | Source |
L40S-1-48G | 1x L40S | 48GB | 8 | 96GB | $1.51 | Launch |
L4-2-24G | 2x L4 | 48GB | 16 | 96GB | $1.62 | Source |
H100-1-80G | 1x H100 | 80GB | 24 | 240GB | $2.73 | Launch |
L40S-2-48G | 2x L40S | 96GB | 16 | 192GB | $3.03 | Source |
L4-4-24G | 4x L4 | 96GB | 32 | 192GB | $3.24 | Source |
H100-2-80G | 2x H100 | 160GB | 48 | 480GB | $5.45 | Source |
L40S-4-48G | 4x L40S | 192GB | 32 | 384GB | $6.06 | Source |
L4-8-24G | 8x L4 | 192GB | 64 | 384GB | $6.49 | Source |
L40S-8-48G | 8x L40S | 384GB | 64 | 768GB | $12.11 | Source |
The Cloud Minders
The Cloud Minders is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
V100 | 1x V100 | 16GB | 6 | 32GB | $0.24 | Source |
A4000 | 1x A4000 | 16GB | 5 | 32GB | $0.40 | Source |
A4000 Ada | 1x A4000 | 20GB | 5 | 64GB | $0.55 | Source |
A5000 | 1x A5000 | 24GB | 5 | 64GB | $0.55 | Source |
H100 PCIe | 1x H100 | 80GB | 32 | 192GB | $3.53 | Source |
H100 NVL | 1x H100 | 94GB | 32 | 192GB | $4.05 | Source |
H100 SXM | 1x H100 | 80GB | 24 | 256GB | $4.52 | Source |
H200 SXM | 8x H200 | 1128GB | 384 | 2048GB | On Request | Source |
B200 | 1x B200 | 180GB | -- | -- | On Request | Source |
These examples are based on The Cloud Minders' on-demand pricing; preferred rates are available for committed reservations.
Vultr
Vultr is based in USA 🇺🇸 and offers GPUs in the following example configurations:
Name | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
---|---|---|---|---|---|---|
A16 | 1x A16 | 16GB | 6 | 64GB | $0.51 | Launch |
A16 | 2x A16 | 96GB | 12 | 128GB | $1.02 | Launch |
A40 | 1x A40 | 48GB | 24 | 120GB | $1.86 | Launch |
A16 | 4x A16 | 192GB | 24 | 256GB | $2.05 | Launch |
L40S | 1x L40S | 48GB | 16 | 240GB | $2.23 | Source |
A100 80GB | 1x A100 | 80GB | 12 | 120GB | $2.60 | Launch |
GH200 | 1x GH200 | 96GB | 72 | 480GB | $3.32 | Source |
A16 | 8x A16 | 384GB | 48 | 486GB | $4.09 | Launch |
A40 | 4x A40 | 192GB | 96 | 480GB | $7.44 | Launch |
A16 | 16x A16 | 768GB | 96 | 960GB | $9.19 | Launch |
HGX A100 | 8x A100 | 640GB | 112 | 2048GB | $12.90 | Source |
H100 | 8x H100 | 640GB | 112 | 2048GB | $18.40 | Source |