A100 vs H100
Explore a head-to-head comparison of specifications, performance, and pricing.
A100
The NVIDIA A100 is a powerful Ampere-based GPU designed for AI training, inference, and high-performance computing workloads.
- Manufacturer: NVIDIA
- GPU Architecture: Ampere
- Average Price: $6.63/hr
- GPU VRAM: 40 GB
- Cloud Availability: 5 clouds
- System Memory: 1800 GB
- CPU Cores: 176
- Storage: 13.6 TB
H100
The NVIDIA H100 is a Hopper-based GPU that provides exceptional performance, scalability, and economics for AI, deep learning, and HPC workloads.
- Manufacturer: NVIDIA
- GPU Architecture: Hopper
- Average Price: $10.09/hr
- GPU VRAM: 80 GB
- Cloud Availability: 13 clouds
- System Memory: 1920 GB
- CPU Cores: 252
- Storage: 31.3 TB
See how the A100 & H100 compare
Compare detailed hardware specifications and average pricing for the A100 and H100.
Compare Hardware Specifications
| Specification | A100 | H100 |
|---|---|---|
| GPU Type | A100 | H100 |
| VRAM per GPU | 40 GB | 80 GB |
| Manufacturer | NVIDIA | NVIDIA |
| Architecture | Ampere | Hopper |
| Interconnect | PCIe Gen4 or SXM4 | PCIe Gen5 or SXM5 |
| Memory Bandwidth | 1.55 TB/s | 3.35 TB/s |
| FP16 TFLOPS | 77.97 TFLOPS (4:1 vs FP32) | 267.6 TFLOPS (4:1 vs FP32) |
| CUDA Cores | 6912 | 16896 |
| Tensor Cores | 432 (3rd Gen) | 528 (4th Gen) |
| Base Clock | 765 MHz | 1365 MHz |
| Boost Clock | 1410 MHz | 1785 MHz |
| TDP | 250W-400W | 350W-700W |
| Process Node | TSMC 7nm | TSMC 4N |
| Data Formats | INT8, BF16, FP16, TF32, FP32, FP64 | FP8, INT8, BF16, FP16, TF32, FP32, FP64 |
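The raw specs can be normalized against price for a rough price-performance comparison. The sketch below uses the FP16 throughput and memory bandwidth from the table and the multi-cloud average hourly prices quoted above ($6.63/hr for the A100, $10.09/hr for the H100); it is a back-of-the-envelope estimate, not a benchmark, and real workload performance depends heavily on the model and software stack.

```python
# Rough price-performance sketch using figures from the tables above.
# Prices are multi-cloud averages; actual provider rates vary.
specs = {
    "A100": {"fp16_tflops": 77.97, "mem_bw_tbs": 1.55, "price_hr": 6.63},
    "H100": {"fp16_tflops": 267.6, "mem_bw_tbs": 3.35, "price_hr": 10.09},
}

for name, s in specs.items():
    # Compute per dollar-hour: higher is better value on paper.
    tflops_per_dollar = s["fp16_tflops"] / s["price_hr"]
    bw_per_dollar = s["mem_bw_tbs"] / s["price_hr"]
    print(f"{name}: {tflops_per_dollar:.1f} FP16 TFLOPS per $/hr, "
          f"{bw_per_dollar:.2f} TB/s per $/hr")
```

On these averages the H100 delivers more than twice the paper FP16 throughput per dollar-hour, which is why it often wins on cost despite the higher sticker price.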
Compare Average On-Demand Pricing
| GPU Count | A100 | H100 |
|---|---|---|
| 1 GPU | $1.71/hr | $2.85/hr |
| 2 GPUs | $3.91/hr | $5.19/hr |
| 4 GPUs | $7.80/hr | $9.79/hr |
| 8 GPUs | $13.54/hr | $19.23/hr |
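Note that the per-GPU rate does not scale linearly with instance size, since these are averages across different providers and instance types. A quick sketch computing the effective per-GPU hourly rate from the table's figures:

```python
# Effective per-GPU hourly rate, derived from the on-demand table above.
# Figures are averages across providers; individual offers vary.
pricing = {  # GPU count -> (A100 $/hr, H100 $/hr)
    1: (1.71, 2.85),
    2: (3.91, 5.19),
    4: (7.80, 9.79),
    8: (13.54, 19.23),
}

for n, (a100, h100) in pricing.items():
    print(f"{n} GPU(s): A100 ${a100 / n:.2f}/GPU-hr, H100 ${h100 / n:.2f}/GPU-hr")
```

For example, the 2-GPU A100 average works out to about $1.96 per GPU-hour versus $1.71 for a single GPU, so the cheapest per-GPU rate is not always the largest or smallest instance.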