GPU Types
Complete catalog of available GPUs with specifications and recommended use cases.
High Performance
Best for large model training and inference workloads.
Mid-Range
Great balance of performance and cost for most workloads.
Entry Level
Cost-effective options for development and small-scale inference.
Last updated: 2026-03-16T08:33:07Z
Choosing a GPU
- Model size: VRAM must exceed the model's weight memory — roughly 2 bytes per parameter in fp16, so a 7B-parameter model needs ~14 GB for weights alone, before activations or KV cache
- Training vs inference: Training needs far more VRAM than inference, since gradients and optimizer states are stored alongside the weights
- Batch size: Larger batches require more VRAM but improve throughput
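The sizing rules above can be sketched as a quick back-of-the-envelope calculation. This is a rough heuristic, not a provider-specific tool: the 2-bytes-per-parameter figure assumes fp16/bf16 weights, and the 16-bytes-per-parameter training figure assumes mixed-precision Adam (fp16 weights and gradients plus fp32 master weights, momentum, and variance); activations and KV cache are deliberately ignored.

```python
def estimate_vram_gb(params_billions: float, training: bool = False) -> float:
    """Rough lower-bound VRAM estimate in GB (weights/optimizer only).

    Ignores activations, KV cache, and framework overhead, so treat the
    result as a floor, not a fit guarantee.
    """
    if training:
        # Mixed-precision Adam: 2 B (fp16 weights) + 2 B (fp16 grads)
        # + 4 B x 3 (fp32 master weights, momentum, variance) = 16 B/param
        bytes_per_param = 16
    else:
        # fp16/bf16 inference: weights only
        bytes_per_param = 2
    # params_billions * bytes_per_param conveniently yields gigabytes,
    # since 1e9 params * N bytes = N GB
    return params_billions * bytes_per_param

print(estimate_vram_gb(7))                  # 7B inference: 14.0 GB
print(estimate_vram_gb(7, training=True))   # 7B training: 112.0 GB
```

In practice, add headroom on top of these numbers for activations (which grow with batch size and sequence length) and for the inference KV cache before picking a tier from the catalog above.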