GPU Accelerator Specs: AMD_Instinct_MI300X

GPU and AI accelerator specifications for datacenter and HPC planning — compute performance (FP64/FP32/TF32/FP16/BF16/FP8/FP4/INT8 TFLOPS/TOPS), memory capacity and bandwidth, thermal design power, interconnect bandwidth (NVLink, PCIe, Infinity Fabric), and physical/process node data. Covers NVIDIA A100, H100, H200, B200, B100 and AMD Instinct MI300X.

gpu_model: AMD_Instinct_MI300X (3 rows)

| spec category | field | value |
| --- | --- | --- |
| compute_performance | FP64 (vector) | 81.7 TFLOPS |
| compute_performance | FP64 (matrix core) | 163.4 TFLOPS |
| compute_performance | FP32 | 163.4 TFLOPS |
| compute_performance | TF32 (matrix core) | 653.7 TFLOPS |
| compute_performance | FP16 (matrix core) | 1,307.4 TFLOPS |
| compute_performance | BF16 (matrix core) | 1,307.4 TFLOPS |
| compute_performance | FP8 (matrix core) | 2,614.9 TFLOPS |
| compute_performance | INT8 (matrix core) | 2,614.9 TOPS |
| interconnect | interconnect type | Infinity Fabric + PCIe |
| interconnect | NVLink/Infinity Fabric bandwidth | 896 GB/s |
| interconnect | PCIe bandwidth | 128 GB/s |
| interconnect | PCIe generation | 5 |
| memory | memory capacity | 192 GB |
| memory | memory type | HBM3 |
| memory | memory bandwidth | 5,300 GB/s |
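One common planning calculation these fields enable is a simple roofline estimate: dividing peak compute throughput by memory bandwidth gives the arithmetic intensity (FLOP per byte) at which a kernel transitions from memory-bound to compute-bound. The sketch below uses the MI300X sample values above; the dict keys and function name are illustrative, not the dataset's actual schema.

```python
# Illustrative record built from the MI300X sample rows above;
# field names are assumptions, not the dataset's real schema.
mi300x = {
    "fp16_tc_tflops": 1307.4,       # FP16 matrix-core peak, TFLOPS
    "memory_bandwidth_gb_s": 5300,  # HBM3 bandwidth, GB/s
}

def ridge_point_flop_per_byte(tflops: float, bandwidth_gb_s: float) -> float:
    """Arithmetic intensity (FLOP/byte) at which a kernel shifts from
    memory-bound to compute-bound under a simple roofline model."""
    return (tflops * 1e12) / (bandwidth_gb_s * 1e9)

fp16_ridge = ridge_point_flop_per_byte(
    mi300x["fp16_tc_tflops"], mi300x["memory_bandwidth_gb_s"]
)
print(f"FP16 ridge point: {fp16_ridge:.1f} FLOP/byte")  # ≈ 246.7
```

Kernels whose arithmetic intensity falls below this ridge point (roughly 247 FLOP/byte for FP16 on these sample numbers) are limited by the 5,300 GB/s memory system rather than the matrix cores.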

This is a sample; the full GPU accelerator specs dataset (and 350+ others) is available in the database with a free account.

