Nvidia L4

Compare prices for Nvidia L4 across cloud providers

April 12, 2025 (updated)

The Nvidia L4, introduced in 2023, is optimized for inference and media processing. Nvidia markets it as a solution for applications like video analytics and AI-powered media generation, where energy efficiency and cost savings are priorities.

| Provider | GPUs | VRAM | vCPUs | RAM | Price/h | Link |
|----------|------|------|-------|-----|---------|------|
| GCP | 1x L4 | 24GB | 4 | 16GB | $0.71 | Source |
| GCP | 1x L4 | 24GB | 8 | 32GB | $0.85 | Source |
| Scaleway | 1x L4 | 24GB | 8 | 48GB | $0.90 | Launch |
| GCP | 1x L4 | 24GB | 12 | 48GB | $1.00 | Source |
| OVHcloud | 1x L4 | 24GB | 22 | 90GB | $1.00 | Source |
| Koyeb | 1x L4 | 24GB | 15 | 44GB | $1.00 | Source |
| GCP | 1x L4 | 24GB | 16 | 64GB | $1.15 | Source |
| Scaleway | 2x L4 | 48GB | 16 | 96GB | $1.72 | Launch |
| GCP | 1x L4 | 24GB | 32 | 128GB | $1.73 | Source |
| GCP | 2x L4 | 48GB | 24 | 96GB | $2.00 | Source |
| OVHcloud | 2x L4 | 48GB | 45 | 180GB | $2.00 | Source |
| Scaleway | 4x L4 | 96GB | 32 | 192GB | $3.37 | Source |
| GCP | 4x L4 | 96GB | 48 | 192GB | $4.00 | Source |
| OVHcloud | 4x L4 | 96GB | 90 | 360GB | $4.00 | Source |
| Scaleway | 8x L4 | 192GB | 64 | 384GB | $6.67 | Launch |
| GCP | 8x L4 | 192GB | 96 | 384GB | $8.00 | Source |

Note: Prices are subject to change and may vary by region and other factors not listed here. For some GPUs, I include links to Shadeform (the sponsor) so you can check if they're available right now. I don’t earn a commission when you click on these links, but their monthly sponsorship helps me keep the site running.
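When comparing these offerings, the two numbers I find most useful are the price per GPU-hour and the rough monthly total for running an instance around the clock; dividing by GPU count also makes the multi-GPU shapes easier to compare against the single-GPU ones. Here is a minimal Python sketch using a handful of the prices listed above (hourly figures copied from the table, roughly 730 hours per month assumed; actual bills vary by region and usage):

```python
# Rough cost comparison for a few of the L4 offerings listed above.
# Prices are the hourly figures from the table and are subject to change.

HOURS_PER_MONTH = 730  # average hours in a month (24 * 365 / 12)

offers = [
    # (provider / shape, number of L4 GPUs, price per hour in USD)
    ("GCP, 4 vCPU / 16GB", 1, 0.71),
    ("Scaleway, 8 vCPU / 48GB", 1, 0.90),
    ("OVHcloud, 22 vCPU / 90GB", 1, 1.00),
    ("GCP, 48 vCPU / 192GB", 4, 4.00),
    ("Scaleway, 64 vCPU / 384GB", 8, 6.67),
]

for name, gpus, price in offers:
    per_gpu_hour = price / gpus
    monthly = price * HOURS_PER_MONTH
    print(f"{name:28s} ${per_gpu_hour:.2f}/GPU-hour  ~${monthly:,.0f}/month 24/7")
```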

Nvidia L4 specs

FP32: 30.3 TFLOPS
TF32 Tensor Core: 120 TFLOPS (with sparsity)
FP16 Tensor Core: 242 TFLOPS (with sparsity)
BFLOAT16 Tensor Core: 242 TFLOPS (with sparsity)
FP8 Tensor Core: 485 TFLOPS (with sparsity)
INT8 Tensor Core: 485 TOPS (with sparsity)
GPU Memory: 24GB
GPU Memory Bandwidth: 300 GB/s
NVENC | NVDEC | JPEG Decoders: 2 | 4 | 4
Max Thermal Design Power (TDP): 72W
Form Factor: 1-slot low-profile, PCIe
Interconnect: PCIe Gen4 x16, 64GB/s
Server Options: Partner and NVIDIA-Certified Systems with 1–8 GPUs

Source: official Nvidia L4 datasheet.
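Given the 72W TDP and the prices above, it's easy to get a back-of-the-envelope feel for the L4's efficiency. A small sketch, using peak datasheet numbers (with sparsity) and the cheapest 1x L4 price from the table; real workloads won't sustain peak throughput:

```python
# Back-of-the-envelope efficiency figures for a single L4.
# Peak values are datasheet numbers (with sparsity); sustained throughput
# on real inference workloads will be noticeably lower.

FP8_TFLOPS = 485        # peak FP8 Tensor Core throughput
TDP_WATTS = 72          # max thermal design power
PRICE_PER_HOUR = 0.71   # cheapest 1x L4 offering in the table above

print(f"{FP8_TFLOPS / TDP_WATTS:.1f} peak FP8 TFLOPS per watt")
print(f"{FP8_TFLOPS / PRICE_PER_HOUR:.0f} peak FP8 TFLOPS per $/hour")
```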