A100 vs A16
Explore a head-to-head comparison of specifications, performance, and pricing.
A100
The NVIDIA A100 is a powerful Ampere-based GPU designed for AI training, inference, and high-performance computing workloads.
Manufacturer: NVIDIA
GPU Architecture: Ampere
Average Price: $6.63/hr
GPU VRAM: 40 GB
Cloud Availability: 5 clouds
System Memory: 1800 GB
CPU Cores: 176
Storage: 13.6 TB
A16
The NVIDIA A16 is an Ampere-based board that combines four GPUs on a single card, designed primarily for virtual desktop infrastructure (VDI) and graphics-dense virtualized workloads.
Manufacturer: NVIDIA
GPU Architecture: Ampere
Average Price: $3.37/hr
GPU VRAM: 64 GB
Cloud Availability: 1 cloud
System Memory: 960 GB
CPU Cores: 96
Storage: 1.7 TB
See how the A100 & A16 compare
Compare detailed hardware specifications and average pricing for the A100 and A16.
Compare Hardware Specifications
| Specification | A100 | A16 |
|---|---|---|
| GPU Type | A100 | A16 |
| VRAM per GPU | 40 GB | 64 GB (4x 16 GB) |
| Manufacturer | NVIDIA | NVIDIA |
| Architecture | Ampere | Ampere |
| Interconnect | PCIe Gen4 or SXM4 | PCIe Gen4 |
| Memory Bandwidth | 1.55 TB/s | 4x 200 GB/s |
| FP16 TFLOPS | 77.97 TFLOPS (4:1 vs FP32) | 4x 4.493 TFLOPS (1:1 vs FP32) |
| CUDA Cores | 6912 | 4x 1,280 |
| Tensor Cores | 432 (3rd Gen) | 4x 40 (3rd Gen) |
| RT Cores | N/A | 4x 10 (2nd Gen) |
| Base Clock | 765 MHz | 1312 MHz |
| Boost Clock | 1410 MHz | 1755 MHz |
| TDP | 250W-400W | 250W |
| Process Node | TSMC 7nm | TSMC 8nm |
| Data Formats | INT8, BF16, FP16, TF32, FP32, FP64 | INT8, BF16, FP16, TF32, FP32 |
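One way to read the specifications above is to normalize peak FP16 throughput by hourly cost. The sketch below is illustrative only: it assumes the single-GPU on-demand rates listed on this page ($1.71/hr for the A100, $0.51/hr for the A16), treats the A16's four dies as one board (4 x 4.493 TFLOPS), and says nothing about real-world utilization.

```python
# Illustrative price-performance calculation using this page's figures.
# Peak FP16 TFLOPS and single-GPU on-demand $/hr; not a benchmark.
specs = {
    "A100": {"fp16_tflops": 77.97, "price_per_hr": 1.71},      # single GPU
    "A16":  {"fp16_tflops": 4 * 4.493, "price_per_hr": 0.51},  # 4 dies per board
}

def tflops_per_dollar(name):
    """Peak FP16 TFLOPS delivered per $/hr of on-demand rental."""
    s = specs[name]
    return s["fp16_tflops"] / s["price_per_hr"]

for name in specs:
    print(f"{name}: {tflops_per_dollar(name):.1f} FP16 TFLOPS per $/hr")
```

By this crude metric the A100 still edges out the A16 on paper, even at roughly a third of the hourly rate, because its FP16 tensor throughput is so much higher.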
Compare Average On-Demand Pricing
| GPU Count | A100 | A16 |
|---|---|---|
| 1 GPU | $1.71 /hr | $0.51 /hr |
| 2 GPUs | $3.91 /hr | $1.02 /hr |
| 4 GPUs | $7.80 /hr | $2.05 /hr |
| 8 GPUs | $13.54 /hr | $4.09 /hr |
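To turn the hourly rates above into a job budget, multiply the rate for a given GPU count by expected wall-clock hours. A minimal sketch, assuming straightforward linear on-demand billing (no spot or reserved discounts):

```python
# On-demand $/hr by GPU count, taken from the pricing table above.
rates = {
    "A100": {1: 1.71, 2: 3.91, 4: 7.80, 8: 13.54},
    "A16":  {1: 0.51, 2: 1.02, 4: 2.05, 8: 4.09},
}

def run_cost(gpu, count, hours):
    """Total on-demand cost for a job running `hours` wall-clock hours."""
    return rates[gpu][count] * hours

# Example: a 24-hour job on an 8-GPU instance of each type.
print(f"8x A100, 24h: ${run_cost('A100', 8, 24):.2f}")
print(f"8x A16,  24h: ${run_cost('A16', 8, 24):.2f}")
```

Note that the listed A100 rates do not scale perfectly linearly with GPU count (they are averages across providers), so always price the exact instance size you plan to use.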
Explore A100 & A16 Instances
Browse available instances with A100 and A16 GPUs. Filter by provider, availability, and more to find the perfect instance for your needs.
Explore more GPU comparisons
Select any two GPUs to compare their specifications and explore pricing across providers.