Cloud GPU Price Comparison

March 14, 2025 (updated)

Note: Prices are subject to change and may vary by region and other factors not listed here. For some GPUs, I include links to Shadeform (the sponsor) so you can check if they're available right now. I don’t earn a commission when you click on these links, but their monthly sponsorship helps me keep the site running.

Which cloud has the most affordable GPUs?

If you're looking for the lowest price per hour, the cheapest option in the data I've collected is the Nvidia A4000 offered by Hyperstack at $0.15/h.

Out of the 342 price points I track, these are the 20 most affordable (price, ascending):

Provider GPUs VRAM vCPUs RAM Price/h
Hyperstack 1x A4000 16GB 6 24GB $0.15 Launch
Oblivus 1x A4000 16GB 6 24GB $0.20
RunPod 1x A30 24GB 8 31GB $0.22 Source
The Cloud Minders 1x V100 16GB 6 32GB $0.24 Source
Hyperstack 1x A5000 24GB 8 24GB $0.25 Launch
Hetzner 1x RTX 4000 20GB 14 64GB $0.31 Source
RunPod 1x A4000 16GB 4 20GB $0.32 Source
Massed Compute 1x A30 24GB 16 48GB $0.33 Launch
RunPod 1x A4500 20GB 4 29GB $0.34 Source
RunPod 1x A5000 24GB 4 24GB $0.36 Source
DataCrunch 1x V100 16GB 6 23GB $0.39 Launch
RunPod 1x A40 48GB 9 50GB $0.39 Source
FluidStack 1x A4000 16GB 36 128GB $0.40 Source
The Cloud Minders 1x A4000 16GB 5 32GB $0.40 Source
Massed Compute 1x A5000 24GB 10 48GB $0.41 Launch
AWS 1x T4G 16GB 4 16GB $0.42 Source
Paperspace 1x M4000 8GB 8 30GB $0.45 Source
CUDO Compute 1x A4000 20GB 4 16GB $0.46 Source
CUDO Compute 1x A4000 16GB 6 24GB $0.47 Launch
CUDO Compute 1x A5000 24GB 6 24GB $0.49 Launch

Which GPU is best for AI training / inference?

The answer depends on many factors, including your specific workload, but generally you should consider:

  1. Tensor Cores
  2. Memory capacity and bandwidth (see the sizing sketch below)
  3. Cache hierarchy
  4. FLOPS

For an in-depth guide on picking GPUs for AI/ML, check out this article by Tim Dettmers.
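
To make factor 2 a bit more concrete, here's a rough, back-of-the-envelope sizing sketch. The multipliers are common rules of thumb rather than exact figures, and the function names are just for illustration; actual usage also depends on batch size, sequence length, and framework overhead.

```python
# Rough per-GPU VRAM estimates (rules of thumb, not exact figures).

def inference_vram_gb(params_billion: float, bytes_per_param: float = 2, overhead: float = 1.2) -> float:
    """fp16/bf16 weights (~2 bytes/param) plus ~20% for activations and buffers."""
    return params_billion * bytes_per_param * overhead

def training_vram_gb(params_billion: float, bytes_per_param: float = 16) -> float:
    """Mixed-precision training with Adam: roughly 16 bytes/param
    (fp16 weights + gradients, fp32 master weights + optimizer states)."""
    return params_billion * bytes_per_param

print(f"7B model, fp16 inference: ~{inference_vram_gb(7):.0f} GB")  # ~17 GB -> a 24GB card is enough
print(f"7B model, full training:  ~{training_vram_gb(7):.0f} GB")   # ~112 GB -> 80GB-class GPUs or multi-GPU
```

By that estimate, serving a 7B model in fp16 fits on a single 24GB card, while full fine-tuning with Adam pushes you toward 80GB-class GPUs or a multi-GPU node.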

Example GPUs by use case

As a starting point, here's my (very broad) attempt at categorizing GPUs for AI workloads:

Category Examples Best for Performance profile
High-end A100, H100, H200, GH200, MI250, MI300X Training large language models, high-performance computing, large-scale inference Highest memory capacity, memory bandwidth, and compute performance
Mid-range A40, A30, A6000, V100, L4, T4 Medium-sized model training, inference, and fine-tuning tasks Higher memory and compute, cost-effective for many AI tasks
Budget K80, M60, P100, A4000 Small model training, experiments, low-cost inference Moderate memory, older architectures (K80, M60), lower cost
Consumer RTX3070, RTX4090, RTX4000 Gaming, content creation, and mid-range AI tasks High compute for gaming and development, Tensor Cores make RTX cards suitable for entry-level AI inference
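
If you'd rather turn that table into something you can run, here's an illustrative helper that maps an estimated per-GPU VRAM requirement to one of the categories above. The thresholds are my own rough cut-offs, not hard rules.

```python
# Illustrative only: rough cut-offs for mapping a per-GPU VRAM requirement
# to the categories in the table above.

def suggest_category(vram_needed_gb: float) -> str:
    if vram_needed_gb <= 16:
        return "Budget (e.g. A4000, P100)"
    if vram_needed_gb <= 48:
        return "Mid-range (e.g. A40, A6000, L4, V100)"
    return "High-end (e.g. A100, H100, H200, MI300X)"

print(suggest_category(17))   # Mid-range (e.g. A40, A6000, L4, V100)
print(suggest_category(112))  # High-end (e.g. A100, H100, H200, MI300X)
```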

Next, I'll provide a brief overview of each cloud provider and some of the GPUs they offer (342 price points).

RunPod

RunPod is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
A30 1x A30 24GB 8 31GB $0.22 Source
RTX A4000 1x A4000 16GB 4 20GB $0.32 Source
A4500 1x A4500 20GB 4 29GB $0.34 Source
A5000 1x A5000 24GB 4 24GB $0.36 Source
A40 1x A40 48GB 9 50GB $0.39 Source
RTX4090 1x RTX 4090 24GB 5 30GB $0.69 Source
A6000 1x A6000 48GB 8 62GB $0.76 Source
L40S 1x L40S 48GB 12 62GB $0.86 Source
A6000 Ada 1x A6000 48GB 14 58GB $0.88 Source
L40 1x L40S 48GB 8 94GB $0.99 Source
A100 PCIe 1x A100 80GB 8 117GB $1.64 Source
A100 SXM 1x A100 80GB 16 125GB $1.89 Source
MI250 1x MI250 128GB -- -- $2.10 Source
H100 PCIe 1x H100 80GB 16 188GB $2.39 Source
H100 SXM 1x H100 80GB 16 125GB $2.99 Source
MI300X 1x MI300X 192GB 24 283GB $2.99 Source
H200 SXM 1x H200 141GB -- -- $3.99 Source

Alibaba Cloud

Alibaba Cloud is based in Singapore 🇸🇬 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
ecs.gn6i-c4g1.xlarge 1x T4 16GB 4 15GB $1.18 Source
ecs.gn7i-c8g1.2xlarge 1x A10 24GB 8 30GB $2.86 Source
ecs.gn6e-c12g1.3xlarge 1x V100 32GB 12 92GB $3.51 Source

Amazon Web Services

Amazon Web Services is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
g5g.xlarge 1x T4G 16GB 4 16GB $0.42 Source
g4dn.xlarge 1x T4 16GB 4 16GB $0.53 Source
g5g.2xlarge 1x T4G 16GB 8 16GB $0.56 Source
g4dn.2xlarge 1x T4 16GB 8 32GB $0.75 Source
g3s.xlarge 1x M60 8GB 4 30GB $0.75 Source
g5g.4xlarge 1x T4G 16GB 16 16GB $0.83 Source
p2.xlarge 1x K80 12GB 4 61GB $0.90 Source
g5.xlarge 1x A10G 24GB 4 16GB $1.01 Source
g3.4xlarge 1x M60 8GB 16 122GB $1.14 Source
g4dn.4xlarge 1x T4 16GB 16 64GB $1.20 Source
g5.2xlarge 1x A10G 24GB 8 32GB $1.21 Source
g5g.8xlarge 1x T4G 16GB 32 16GB $1.37 Source
g5.4xlarge 1x A10G 24GB 16 64GB $1.63 Source
g4dn.8xlarge 1x T4 16GB 32 128GB $2.18 Source
g3.8xlarge 2x M60 16GB 32 244GB $2.28 Source
g5.8xlarge 1x A10G 24GB 32 128GB $2.45 Source
g5g.16xlarge 2x T4G 32GB 64 32GB $2.74 Source
p3.2xlarge 1x V100 16GB 8 61GB $3.06 Source
g4dn.12xlarge 4x T4 64GB 48 192GB $3.92 Source
g5.12xlarge 4x A10G 96GB 48 192GB $4.10 Source
g4dn.16xlarge 1x T4 16GB 64 256GB $4.35 Source
g3.16xlarge 4x M60 32GB 64 488GB $4.56 Source
g5.12xlarge 4x A10G 96GB 48 192GB $5.68 Source
p2.8xlarge 8x K80 96GB 32 488GB $7.20 Source
g4dn.metal 8x T4 128GB 96 384GB $7.83 Source
g5.24xlarge 4x A10G 96GB 96 384GB $8.14 Source
p3.8xlarge 4x V100 64GB 32 244GB $12.24 Source
p2.16xlarge 16x K80 192GB 64 732GB $14.40 Source
g5.48xlarge 8x A10G 192GB 192 768GB $16.29 Source
p3.16xlarge 8x V100 128GB 64 488GB $24.48 Source
p4d.24xlarge 8x A100 320GB 96 1152GB $32.77 Source
p5.48xlarge 8x H100 640GB 192 2048GB $98.32 Source

Build AI

Build AI is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
A100 (spot) 1x A100 40GB 15 200GB $1.05 Source
A100 1x A100 40GB 15 200GB $1.42 Source
A100 (spot) 1x A100 80GB 30 225GB $1.45 Source
A100 1x A100 80GB 30 225GB $1.97 Source
H100 (spot) 1x H100 80GB 26 225GB $2.79 Source
H100 1x H100 80GB 26 225GB $3.85 Source

CUDO Compute

CUDO Compute is based in the UK 🇬🇧 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
RTX A4000 Ada 1x A4000 20GB 4 16GB $0.46 Source
RTX A4000 1x A4000 16GB 6 24GB $0.47 Launch
RTX A5000 1x A5000 24GB 6 24GB $0.49 Launch
A40 1x A40 48GB 6 24GB $0.53 Launch
V100 1x V100 16GB 4 16GB $0.54 Source
RTX A6000 1x A6000 48GB 6 24GB $0.59 Launch
L40S 1x L40S 48GB 12 48GB $1.75 Launch
A100 PCIe 1x A100 80GB 12 48GB $1.83 Launch
H200 1x H200 141GB -- -- $2.49 Source
H100 NVL 1x H100 94GB 4 16GB $2.56 Source
H100 SXM 1x H100 80GB 12 48GB $3.18 Launch
MI300 1x MI300X 192GB -- -- On Request Source
MI250 1x MI250 128GB -- -- On Request Source
B100 1x B100 192GB -- -- On Request Source

Civo

Civo is based in the UK 🇬🇧 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
A100-40 Small 1x A100 40GB 8 64GB $1.79 Source
L40S Medium 1x L40S 48GB 8 64GB $1.79 Source
A100-80 Small 1x A100 80GB 12 128GB $2.38 Source
A100-40 Medium 2x A100 80GB 16 128GB $3.57 Source
L40S Medium (2x) 2x L40S 96GB 16 128GB $3.57 Source
A100-80 Medium 2x A100 160GB 24 256GB $4.76 Source
A100-40 Large 4x A100 160GB 32 255GB $7.14 Source
L40S Large 4x L40S 192GB 32 255GB $7.14 Source
A100-80 Large 4x A100 320GB 48 512GB $9.52 Source
A100-40 Extra Large 8x A100 320GB 64 512GB $14.29 Source
L40S Extra Large 8x L40S 384GB 64 512GB $14.29 Source
A100-80 Extra Large 8x A100 640GB 96 1024GB $19.05 Source
H100 1x H100 80GB -- -- On Request Source
GH200 1x GH200 96GB -- -- On Request Source

Contabo

Contabo is based in Germany 🇩🇪 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
L40S 4x L40S 192GB 64 512GB $4.29 Source
H100 4x H100 320GB 64 512GB $10.51 Source

Crusoe

Crusoe is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
A40 1x A40 48GB 6 60GB $1.10 Launch
A100 1x A100 40GB 12 120GB $1.45 Launch
L40S 1x L40S 48GB 8 147GB $1.45 Launch
A100 80GB 1x A100 80GB 12 120GB $1.65 Launch

Crusoe instances can be scaled up to 10x the base configuration.

DataCrunch

DataCrunch is based in Finland 🇫🇮 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
Tesla V100 16GB 1x V100 16GB 6 23GB $0.39 Launch
RTX A6000 48GB 1x A6000 48GB 10 60GB $1.01 Launch
L40S 1x L40S 48GB 20 60GB $1.10 Source
RTX 6000 Ada 48GB 1x RTX 6000 48GB 10 60GB $1.19 Source
A100 SXM4 40GB 1x A100 40GB 22 120GB $1.29 Launch
A100 SXM4 80GB 1x A100 80GB 22 120GB $1.89 Launch
H100 SXM5 80GB 1x H100 80GB 30 120GB $2.65 Launch
H200 SXM5 141GB 1x H200 141GB 44 185GB $3.03 Launch
A100 SXM4 80GB 2x A100 160GB 44 240GB $3.78 Source
A100 SXM4 40GB 4x A100 160GB 88 480GB $5.16 Source
H100 SXM5 80GB 2x H100 160GB 80 370GB $5.30 Source
H200 SXM5 141GB 2x H200 282GB 88 370GB $6.06 Source
A100 SXM4 80GB 4x A100 320GB 88 480GB $7.56 Source
H100 SXM5 80GB 4x H100 320GB 176 740GB $10.60 Source
H200 SXM5 141GB 4x H200 564GB 176 740GB $12.12 Source
A100 SXM4 80GB 8x A100 640GB 176 960GB $15.12 Source
H200 SXM5 141GB 8x H200 1128GB 176 1450GB $24.24 Source

DataCrunch instances can be scaled to 2x, 4x, or 8x the base configuration.

DigitalOcean

DigitalOcean is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
Nvidia H100 SXM 1x H100 80GB 20 240GB $3.39 Launch
Nvidia H100 SXM 8x H100 640GB 160 1920GB $23.92 Launch

Exoscale

Exoscale is based in Switzerland 🇨🇭 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
3080ti Small 1x 3080ti 12GB 12 56GB $0.99 Source
P100 Small 1x P100 16GB 12 56GB $1.17 Source
V100 Small 1x V100 16GB 12 56GB $1.38 Source
A5000 Small 1x A5000 24GB 12 56GB $1.45 Source
P100 Medium 2x P100 32GB 16 90GB $1.71 Source
3080ti Medium 2x 3080ti 24GB 24 112GB $1.84 Source
V100 Medium 2x V100 32GB 16 90GB $2.01 Source
A40 Small 1x A40 48GB 12 56GB $2.14 Source
P100 Large 3x P100 48GB 24 120GB $2.25 Source
V100 Large 3x V100 48GB 24 120GB $2.65 Source
A5000 Medium 2x A5000 48GB 24 112GB $2.68 Source
P100 Huge 4x P100 64GB 48 225GB $2.82 Source
V100 Huge 4x V100 64GB 48 225GB $3.32 Source
3080ti Large 4x 3080ti 48GB 48 224GB $3.40 Source
A40 Medium 2x A40 96GB 24 120GB $4.27 Source
A5000 Large 4x A5000 96GB 48 224GB $4.93 Source
A40 Large 4x A40 192GB 48 224GB $8.54 Source
A40 Huge 8x A40 384GB 96 448GB $17.06 Source

FluidStack

FluidStack is based in the UK 🇬🇧 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
Nvidia A4000 1x A4000 16GB 36 128GB $0.40 Source
Nvidia A5000 1x A5000 24GB 36 128GB $0.55 Source
Nvidia A40 1x A40 48GB 32 128GB $0.60 Source
Nvidia A6000 1x A6000 48GB 48 128GB $0.80 Source
Nvidia L40 1x L40S 48GB 32 48GB $1.25 Source
Nvidia A100 PCIe 40GB 1x A100 40GB 32 128GB $1.65 Source
Nvidia A100 PCIe 80GB 1x A100 80GB 48 256GB $1.80 Source
Nvidia H100 PCIe 1x H100 80GB 48 256GB $2.89 Source

Fly.io

Fly.io is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
L40S 1x L40S 48GB -- -- $1.25 Source
A10 1x A10 24GB -- -- $1.50 Source
A100 40G PCIe 1x A100 40GB -- -- $2.50 Source
A100 80G SXM 1x A100 80GB -- -- $3.50 Source

Google Cloud

Google Cloud is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
g2-standard-4 1x L4 24GB 4 16GB $0.71 Source
g2-standard-8 1x L4 24GB 8 32GB $0.85 Source
g2-standard-12 1x L4 24GB 12 48GB $1.00 Source
g2-standard-16 1x L4 24GB 16 64GB $1.15 Source
g2-standard-32 1x L4 24GB 32 128GB $1.73 Source
g2-standard-24 2x L4 48GB 24 96GB $2.00 Source
a2-highgpu-1g 1x A100 40GB 12 85GB $3.67 Source
g2-standard-48 4x L4 96GB 48 192GB $4.00 Source
a2-highgpu-2g 2x A100 80GB 24 170GB $7.35 Source
g2-standard-96 8x L4 192GB 96 384GB $8.00 Source
a2-highgpu-4g 4x A100 160GB 48 340GB $14.69 Source
a2-highgpu-8g 8x A100 320GB 96 680GB $29.39 Source
a2-megagpu-16g 16x A100 640GB 96 1360GB $55.74 Source

Green AI Cloud

Green AI Cloud is based in Sweden 🇸🇪 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
H200 1x H200 141GB 112 2048GB $2.75 Source
H200 8x H200 1128GB 112 2048GB $22.00 Source
H100 8x H100 640GB -- -- On Request Source
A100 8x A100 640GB -- -- On Request Source
B200 SXM6 8x B200 1536GB 96 2048GB On Request Source

Hetzner

Hetzner is based in Germany 🇩🇪 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
GEX44 1x RTX 4000 20GB 14 64GB $0.31 Source
GEX130 1x RTX 6000 48GB 24 128GB $1.45 Source

While Hetzner does not currently offer on-demand GPU instances, they do offer GPUs in some of their dedicated servers.

For these, there is a one-time €94.01 setup fee in addition to the hourly rate.
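
To see how much that setup fee affects the effective hourly rate, here's a quick amortization sketch. The euro-to-dollar conversion is an assumption for illustration (€94.01 ≈ $100); use the current exchange rate for real numbers.

```python
# Amortize the one-time setup fee over the rental period.
hourly_rate = 0.31      # GEX44, $/h
setup_fee_usd = 100.0   # assumed ~€94.01 converted to USD, one-time

for months in (1, 3, 12):
    hours = months * 30 * 24
    effective = hourly_rate + setup_fee_usd / hours
    print(f"{months:>2} month(s): ~${effective:.2f}/h effective")
```

The fee adds roughly $0.14/h over a single month but fades to a few cents per hour on longer rentals, so it matters mostly for short-lived experiments.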

Hyperstack

Hyperstack is based in the UK 🇬🇧 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
NVIDIA A4000 1x A4000 16GB 6 24GB $0.15 Launch
NVIDIA A5000 1x A5000 24GB 8 24GB $0.25 Launch
NVIDIA A6000 1x A6000 48GB 28 58GB $0.50 Launch
NVIDIA A40 1x A40 48GB 28 58GB $0.50 Source
NVIDIA L40 1x L40S 48GB 28 58GB $1.00 Launch
NVIDIA L40 x2 2x L40S 96GB 28 58GB $1.00 Source
NVIDIA A6000 x2 2x A6000 96GB 28 58GB $1.00 Source
NVIDIA A100 PCIe 80GB 1x A100 80GB 28 120GB $1.35 Launch
NVIDIA H100 PCIe 80GB 1x H100 80GB 28 180GB $1.90 Launch
NVIDIA A6000 x4 4x A6000 192GB 28 58GB $2.00 Source
NVIDIA A100 PCIe 80GB x2 2x A100 160GB 28 120GB $2.70 Source
NVIDIA H100 PCIe 80GB x2 2x H100 160GB 28 180GB $3.80 Source
NVIDIA L40 x4 4x L40S 192GB 28 58GB $4.00 Source
NVIDIA A6000 x8 8x A6000 384GB 28 58GB $4.00 Source
NVIDIA A100 PCIe 80GB x4 4x A100 320GB 28 120GB $5.40 Source
NVIDIA H100 PCIe 80GB x4 4x H100 320GB 28 180GB $7.60 Source
NVIDIA L40 x8 8x L40S 384GB 28 58GB $8.00 Source
NVIDIA A100 PCIe 80GB x8 8x A100 640GB 28 120GB $10.80 Source
NVIDIA A100 80GB PCIe NVLink 8x A100 640GB 31 240GB $11.20 Source
NVIDIA H100 PCIe 80GB x8 8x H100 640GB 28 180GB $15.20 Source
NVIDIA H100 PCIe NVLink 80GB x8 8x H100 640GB 31 180GB $15.60 Source
NVIDIA H100 SXM 80GB x8 8x H100 640GB 24 240GB $19.20 Source

Koyeb

Koyeb is based in France 🇫🇷 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
RTX4000-SFF-ADA 1x RTX 4000 20GB 6 44GB $0.50 Source
V100-SXM2 1x V100 16GB 8 44GB $0.85 Source
L4 1x L4 24GB 15 44GB $1.00 Source
L40S 1x L40S 48GB 30 92GB $2.00 Source
A100 1x A100 80GB 15 180GB $2.70 Source
H100 1x H100 80GB 15 180GB $3.30 Source

Lambda Labs

Lambda Labs is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
1x RTX6000 1x RTX 6000 24GB 14 46GB $0.50 Launch
1x A10 1x A10G 24GB 30 200GB $0.75 Launch
1x A6000 1x A6000 48GB 14 100GB $0.80 Launch
1x A100 SXM 1x A100 40GB 30 200GB $1.29 Launch
2x A6000 2x A6000 96GB 28 200GB $1.60 Launch
1x H100 PCIe 1x H100 80GB 26 200GB $2.49 Launch
2x A100 2x A100 80GB 60 400GB $2.58 Launch
Nvidia GH200 1x GH200 96GB 64 432GB $3.19 Source
4x A6000 4x A6000 192GB 56 400GB $3.20 Launch
8x V100 8x V100 128GB 92 448GB $4.40 Launch
4x A100 4x A100 160GB 120 800GB $5.16 Launch
8x A100 8x A100 320GB 124 1800GB $10.32 Launch
8x H100 SXM 8x H100 640GB 208 1800GB $23.92 Launch
Nvidia H200 SXM 8x H200 1128GB 224 -- On Request Source
B200 8x B200 1440GB 224 -- On Request Source

Linode

Linode is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
GPU1 1x RTX 6000 24GB 8 32GB $1.50 Source
GPU2 2x RTX 6000 48GB 16 64GB $3.00 Source
GPU3 3x RTX 6000 72GB 20 96GB $4.50 Source
GPU4 4x RTX 6000 96GB 24 128GB $6.00 Source

Massed Compute

Massed Compute is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
A30 1x A30 24GB 16 48GB $0.33 Launch
A5000 1x A5000 24GB 10 48GB $0.41 Launch
A6000 1x A6000 48GB 6 48GB $0.57 Launch
RTX6000 Ada 1x RTX 6000 48GB 12 64GB $0.97 Launch
L40S 1x L40S 48GB 22 128GB $1.10 Launch
A100 80GB 1x A100 80GB 12 64GB $1.72 Launch
H100 1x H100 80GB 20 128GB $2.98 Launch

Their instances can be scaled up to 8x the base configuration.

Microsoft Azure

Microsoft Azure is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
NC4as T4 v3 1x T4 16GB 4 28GB $0.53 Source
NC8as T4 v3 1x T4 16GB 8 56GB $0.75 Source
NC6 1x K80 12GB 6 56GB $0.90 Source
NV6 1x M60 8GB 6 56GB $1.14 Source
NV12s v3 1x M60 8GB 12 112GB $1.14 Source
NC16as T4 v3 1x T4 16GB 16 110GB $1.20 Source
NC12 2x K80 24GB 12 112GB $1.80 Source
ND6s 1x P40 24GB 6 112GB $2.07 Source
NC6s v2 1x P100 16GB 6 112GB $2.07 Source
NV12 2x M60 16GB 12 112GB $2.28 Source
NV24s v3 2x M60 16GB 24 224GB $2.28 Source
NC6s v3 1x V100 16GB 6 112GB $3.06 Source
NC24 4x K80 48GB 24 224GB $3.60 Source
NC24r 4x K80 48GB 24 224GB $3.96 Source
ND12s 2x P40 48GB 12 224GB $4.14 Source
NC12s v2 2x P100 32GB 12 224GB $4.14 Source
NC64as T4 v3 4x T4 64GB 64 440GB $4.35 Source
NV48s v3 4x M60 32GB 48 448GB $4.56 Source
NV24 4x M60 32GB 24 224GB $4.56 Source
NC24ads A100 v4 1x A100 80GB 24 220GB $4.78 Source
NC12s v3 2x V100 32GB 12 224GB $6.12 Source
ND24s 4x P40 96GB 24 448GB $8.28 Source
NC24s v2 4x P100 64GB 24 448GB $8.28 Source
NC40ads H100 v5 1x H100 80GB 40 320GB $8.82 Source
ND24rs 4x P40 96GB 24 448GB $9.11 Source
NC24rs v2 4x P100 64GB 24 448GB $9.11 Source
NC24s v3 4x V100 64GB 24 448GB $12.24 Source
NC24rs v3 4x V100 64GB 24 448GB $13.46 Source
ND40rs v2 8x V100 256GB 40 672GB $22.03 Source
ND96asr A100 v4 8x A100 320GB 96 900GB $27.20 Source

Nebius

Nebius is based in the Netherlands 🇳🇱 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
L40S PCIe 1x L40S 48GB 8 32GB $1.58 Source
H100 SXM 1x H100 80GB 20 160GB $3.55 Launch
H100 SXM 2x H100 160GB 40 320GB $7.10 Source
H100 SXM 4x H100 320GB 80 640GB $14.20 Source
H200 8x H200 1128GB -- -- $20.72 Source
H100 SXM 8x H100 640GB 160 1280GB $28.39 Launch

OVHcloud

OVHcloud is based in France 🇫🇷 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
t1-le-45 1x V100 16GB 8 45GB $0.77 Source
t2-le-45 1x V100S 32GB 15 45GB $0.88 Source
l4-90 1x L4 24GB 22 90GB $1.00 Source
t1-le-90 2x V100 32GB 16 90GB $1.55 Source
t2-le-90 2x V100S 64GB 30 90GB $1.76 Source
l40s-90 1x L40S 48GB 15 90GB $1.80 Source
t1-45 1x V100 16GB 8 45GB $1.97 Source
l4-180 2x L4 48GB 45 180GB $2.00 Source
t2-45 1x V100S 32GB 15 45GB $2.19 Source
h100-380 1x H100 80GB 30 380GB $2.99 Source
a100-180 1x A100 80GB 15 180GB $3.07 Source
t1-le-180 4x V100 64GB 32 180GB $3.10 Source
t2-le-180 4x V100S 128GB 60 180GB $3.53 Source
l40s-180 2x L40S 96GB 30 180GB $3.60 Source
t1-90 2x V100 32GB 18 90GB $3.94 Source
l4-360 4x L4 96GB 90 360GB $4.00 Source
t2-90 2x V100S 64GB 30 90GB $4.38 Source
h100-760 2x H100 160GB 60 760GB $5.98 Source
a100-360 2x A100 160GB 30 360GB $6.15 Source
l40s-360 4x L40S 192GB 60 360GB $7.20 Source
t1-180 4x V100 64GB 36 180GB $7.89 Source
t2-180 4x V100S 128GB 60 180GB $8.76 Source
h100-1520 4x H100 320GB 120 1520GB $11.97 Source
a100-720 4x A100 320GB 60 720GB $12.29 Source

Oblivus

Oblivus is based in the UK 🇬🇧 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
A4000 1x A4000 16GB 6 24GB $0.20
A5000 1x A5000 24GB 8 30GB $0.50
A6000 1x A6000 48GB 16 60GB $0.55
A100 80GB PCIe 1x A100 80GB 28 120GB $1.47
H100 PCIe 1x H100 80GB 28 180GB $1.98

Oblivus instances can be scaled up to 8 GPUs per instance.

Oracle Cloud

Oracle Cloud is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
BM.GPU.MI300X.8 8x MI300X 1536GB 56 2000GB $6.00 Source

Paperspace

Paperspace is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
GPU+ (M4000) 1x M4000 8GB 8 30GB $0.45 Source
P4000 1x P4000 8GB 8 30GB $0.51 Source
RTX4000 1x RTX 4000 8GB 8 30GB $0.56 Source
P5000 1x P5000 16GB 8 30GB $0.78 Source
A4000 1x A4000 16GB 8 45GB $0.80 Launch
RTX5000 1x RTX 5000 16GB 8 30GB $0.82 Source
P6000 1x P6000 24GB 8 30GB $1.10 Source
A5000 1x A5000 24GB 8 45GB $1.42 Launch
A6000 1x A6000 48GB 8 45GB $1.93 Launch
V100-32G 1x V100 32GB 8 30GB $2.34 Launch
V100 1x V100 16GB 8 30GB $2.34 Launch
A100 1x A100 40GB 8 90GB $3.19 Launch
A100-80G 1x A100 80GB 8 90GB $3.28 Launch
H100 PCIe 1x H100 80GB 16 268GB $5.99 Launch

Replicate

Replicate is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
Nvidia T4 1x T4 16GB 4 16GB $0.81 Source
Nvidia A40 (Small) 1x A40 48GB 4 16GB $2.07 Source
Nvidia A40 (Large) 1x A40 48GB 10 72GB $2.61 Source
Nvidia A100 (40GB) 1x A100 40GB 10 72GB $4.14 Source
Nvidia A100 (80GB) 1x A100 80GB 10 144GB $5.04 Source
2x Nvidia A40 (Large) 2x A40 96GB 20 144GB $5.22 Source
2x Nvidia A100 (40GB) 2x A100 80GB 20 144GB $8.28 Source
2x Nvidia A100 (80GB) 2x A100 160GB 20 288GB $10.08 Source
4x Nvidia A40 (Large) 4x A40 192GB 40 288GB $10.44 Source
4x Nvidia A100 (40GB) 4x A100 160GB 40 288GB $16.56 Source
4x Nvidia A100 (80GB) 4x A100 320GB 40 576GB $20.16 Source
8x Nvidia A40 (Large) 8x A40 384GB 48 680GB $20.88 Source
8x Nvidia A100 (80GB) 8x A100 640GB 80 960GB $40.32 Source

Scaleway

Scaleway is based in France 🇫🇷 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
L4-1-24G 1x L4 24GB 8 48GB $0.81 Launch
GPU-3070 1x RTX 3070 8GB 8 16GB $1.06 Source
Render-S 1x P100 16GB 10 42GB $1.34 Source
L40S-1-48G 1x L40S 48GB 8 96GB $1.51 Launch
L4-2-24G 2x L4 48GB 16 96GB $1.62 Source
H100-1-80G 1x H100 80GB 24 240GB $2.73 Launch
L40S-2-48G 2x L40S 96GB 16 192GB $3.03 Source
L4-4-24G 4x L4 96GB 32 192GB $3.24 Source
H100-2-80G 2x H100 160GB 48 480GB $5.45 Source
L40S-4-48G 4x L40S 192GB 32 384GB $6.06 Source
L4-8-24G 8x L4 192GB 64 384GB $6.49 Source
L40S-8-48G 8x L40S 384GB 64 768GB $12.11 Source

TensorWave

TensorWave is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
AMD MI300X 8x MI300X 1536GB -- -- On Request

While TensorWave's pricing is not yet publicly available, you can request a quote on their website.

The Cloud Minders

The Cloud Minders is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
V100 1x V100 16GB 6 32GB $0.24 Source
A4000 1x A4000 16GB 5 32GB $0.40 Source
A4000 Ada 1x A4000 20GB 5 64GB $0.55 Source
A5000 1x A5000 24GB 5 64GB $0.55 Source
H100 PCIe 1x H100 80GB 32 192GB $3.53 Source
H100 NVL 1x H100 94GB 32 192GB $4.05 Source
H100 SXM 1x H100 80GB 24 256GB $4.52 Source
H200 SXM 8x H200 1128GB 384 2048GB On Request Source
B200 1x B200 180GB -- -- On Request Source

These examples are based on The Cloud Minders' on-demand pricing; preferred rates are available for committed reservations.

Vultr

Vultr is based in the USA 🇺🇸 and offers GPUs in the following example configurations:

Name GPUs VRAM vCPUs RAM Price/h
A16 1x A16 16GB 6 64GB $0.51 Launch
A16 2x A16 96GB 12 128GB $1.02 Launch
A40 1x A40 48GB 24 120GB $1.86 Launch
A16 4x A16 192GB 24 256GB $2.05 Launch
L40S 1x L40S 48GB 16 240GB $2.23 Source
A100 80GB 1x A100 80GB 12 120GB $2.60 Launch
GH200 1x GH200 96GB 72 480GB $3.32 Source
A16 8x A16 384GB 48 486GB $4.09 Launch
A40 4x A40 192GB 96 480GB $7.44 Launch
A16 16x A16 768GB 96 960GB $9.19 Launch
HGX A100 8x A100 640GB 112 2048GB $12.90 Source
H100 8x H100 640GB 112 2048GB $18.40 Source