H100 vs A4000
Explore a head-to-head comparison of specifications, performance, and pricing.
H100
The NVIDIA H100 is a Hopper-architecture GPU that delivers exceptional performance, scalability, and cost efficiency for AI, deep learning, and HPC workloads.
Manufacturer: NVIDIA
GPU Architecture: Hopper
Average Price: $10.09/hr
GPU VRAM: 80 GB
Cloud Availability: 13 clouds
System Memory: 1920 GB
CPU Cores: 252
Storage: 31.3 TB
A4000
The NVIDIA A4000 delivers high-performance computing capabilities for AI, machine learning, and data science applications.
Manufacturer: NVIDIA
GPU Architecture: Ampere
Average Price: $1.17/hr
GPU VRAM: 16 GB
Cloud Availability: 2 clouds
System Memory: 215 GB
CPU Cores: 56
Storage: 1.3 TB
See how the H100 & A4000 compare
Compare detailed hardware specifications and average pricing for the H100 and A4000.
Compare Hardware Specifications
| Specification | H100 | A4000 |
|---|---|---|
| GPU Type | H100 | A4000 |
| VRAM per GPU | 80 GB | 16 GB |
| Manufacturer | NVIDIA | NVIDIA |
| Architecture | Hopper | Ampere |
| Interconnect | PCIe Gen5 or SXM5 | PCIe Gen4 |
| Memory Bandwidth | 3.35 TB/s | 448 GB/s |
| FP16 TFLOPS (FP16:FP32 ratio) | 267.6 TFLOPS (4:1) | 19.17 TFLOPS (1:1) |
| CUDA Cores | 16896 | 6144 |
| Tensor Cores | 528 (4th Gen) | 192 (3rd Gen) |
| RT Cores | N/A | 48 (2nd Gen) |
| Base Clock | 1365 MHz | 735 MHz |
| Boost Clock | 1785 MHz | 1695 MHz |
| TDP | 350-700W | 140W |
| Process Node | TSMC 4N | TSMC 8nm |
| Data Formats | FP8, INT8, BF16, FP16, TF32, FP32, FP64 | INT8, BF16, FP16, TF32, FP32 |
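The VRAM gap in the table above is often the deciding factor for inference workloads. As a rough sketch (a common rule of thumb, not a benchmark: FP16 weights take about 2 bytes per parameter, and real usage adds activations, KV cache, and framework overhead), you can estimate whether a model's weights alone fit in each GPU's memory:

```python
def fits_in_vram(params_billion: float, vram_gb: float, bytes_per_param: int = 2) -> bool:
    """Return True if the model weights alone fit in the given VRAM.

    Assumes FP16/BF16 weights at ~2 bytes per parameter; this is an
    illustrative lower bound that ignores activations and overhead.
    """
    weight_gb = params_billion * 1e9 * bytes_per_param / 1e9
    return weight_gb <= vram_gb

H100_VRAM_GB = 80    # from the specification table above
A4000_VRAM_GB = 16   # from the specification table above

# A 7B-parameter model needs ~14 GB for FP16 weights:
print(fits_in_vram(7, H100_VRAM_GB))    # True
print(fits_in_vram(7, A4000_VRAM_GB))   # True, but with almost no headroom
print(fits_in_vram(13, A4000_VRAM_GB))  # False: ~26 GB of weights exceed 16 GB
```

On this estimate, a 13B-parameter FP16 model needs multiple A4000s (or quantization), while a single H100 holds it comfortably.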
Compare Average On-Demand Pricing
| GPU Count | H100 | A4000 |
|---|---|---|
| 1 GPU | $2.85 /hr | $0.47 /hr |
| 2 GPUs | $5.19 /hr | $0.95 /hr |
| 4 GPUs | $9.79 /hr | $1.90 /hr |
| 8 GPUs | $19.23 /hr | $1.20 /hr |
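Raw hourly price alone can be misleading: dividing the 1-GPU on-demand prices above by each card's FP16 throughput from the specification table gives a rough cost-per-compute comparison. This is an illustrative calculation using the averages on this page, not a measured benchmark:

```python
# Average 1-GPU on-demand prices and FP16 TFLOPS from the tables above.
h100 = {"price_per_hr": 2.85, "fp16_tflops": 267.6}
a4000 = {"price_per_hr": 0.47, "fp16_tflops": 19.17}

def dollars_per_tflop_hour(gpu: dict) -> float:
    """Hourly price divided by peak FP16 throughput."""
    return gpu["price_per_hr"] / gpu["fp16_tflops"]

print(f"H100:  ${dollars_per_tflop_hour(h100):.4f} per FP16 TFLOP-hour")
print(f"A4000: ${dollars_per_tflop_hour(a4000):.4f} per FP16 TFLOP-hour")
```

On these figures the H100, despite costing roughly 6x more per hour, works out to roughly half the price per unit of peak FP16 compute; the A4000 remains attractive when a workload simply doesn't need that much throughput or VRAM.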