
AMD Instinct MI100

Instinct MI100 · CDNA Architecture

The AMD Instinct MI100 was AMD's first data center GPU built on the CDNA architecture, separating compute and graphics into dedicated designs. With 32GB HBM2 and matrix core acceleration, it marked AMD's serious entry into the HPC accelerator market.

Key Features

First CDNA architecture
32 GB HBM2
1.2 TB/s memory bandwidth
Matrix core acceleration
PCIe 4.0

Full Specifications

Compute

Architecture CDNA
Process Node 7nm TSMC
Compute Units 120
Base Clock 1000 MHz
Boost Clock 1502 MHz
FP32 Performance 23.07 TFLOPS
FP16 Performance 184.6 TFLOPS
BF16 Performance 92.3 TFLOPS
INT8 Performance 184.6 TOPS
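The peak throughput figures above follow directly from the shader configuration: each of the 120 compute units carries 64 stream processors, each issuing one fused multiply-add (2 FLOPs) per clock, and the matrix cores run FP16 at 8x the FP32 vector rate. A quick sanity check (the 64-lane and 8x figures are the published CDNA values, not stated in the table above):

```python
# Peak-throughput sanity check for the MI100 compute figures.
compute_units = 120
lanes_per_cu = 64          # stream processors per CU (CDNA)
flops_per_lane = 2         # one fused multiply-add per clock
boost_clock_hz = 1502e6    # 1502 MHz boost clock

fp32_tflops = compute_units * lanes_per_cu * flops_per_lane * boost_clock_hz / 1e12
print(round(fp32_tflops, 2))   # 23.07 TFLOPS

# Matrix cores double-pump FP16 to 8x the FP32 vector rate
fp16_tflops = fp32_tflops * 8
print(round(fp16_tflops, 1))   # 184.6 TFLOPS
```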

Memory

Memory Size 32 GB
Memory Type HBM2
Memory Bus 4096-bit
Memory Bandwidth 1228.8 GB/s
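The bandwidth figure is likewise derivable from the bus width and the HBM2 per-pin data rate (2.4 Gbps, i.e. 1.2 GHz double data rate, which is the published MI100 memory clock rather than a value from the table above):

```python
# Memory-bandwidth sanity check: 4096-bit bus at 2.4 Gbps per pin.
bus_width_bits = 4096
data_rate_gbps = 2.4       # per-pin effective rate (1.2 GHz DDR)

bandwidth_gbs = bus_width_bits * data_rate_gbps / 8  # bits -> bytes
print(bandwidth_gbs)       # 1228.8 GB/s
```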

Power & Physical

TDP 300W
Form Factor PCIe
Slot Width 2-slot
Card Length 267 mm
Power Connectors 2x 8-pin

Features & Connectivity

PCIe Version PCIe 4.0 x16
NVLink Support No
Multi-GPU Support Yes

Availability

MSRP (USD) Contact for pricing
Release Date Nov 2020
Status Available


Use Cases

HPC Simulation
AI Training
Scientific Computing
Weather Forecasting


Related GPUs

AMD Instinct MI300X

Memory 192GB HBM3
FP32 163.4 TFLOPS
FP16 1307.4 TFLOPS
TDP 750W

AMD Instinct MI300A

Memory 128GB HBM3
FP32 122.6 TFLOPS
FP16 981 TFLOPS
TDP 760W

AMD Instinct MI250X

Memory 128GB HBM2e
FP32 47.87 TFLOPS
FP16 383 TFLOPS
TDP 560W

AMD Instinct MI250

Memory 128GB HBM2e
FP32 45.26 TFLOPS
FP16 362.1 TFLOPS
TDP 500W
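The FP16 and TDP figures listed above can be normalized into throughput per watt, which shows how far the CDNA line has moved since the MI100:

```python
# FP16 TFLOPS per watt, using the spec figures listed on this page.
gpus = {
    "MI100":  (184.6, 300),    # (FP16 TFLOPS, TDP in watts)
    "MI250":  (362.1, 500),
    "MI250X": (383.0, 560),
    "MI300A": (981.0, 760),
    "MI300X": (1307.4, 750),
}
for name, (fp16_tflops, tdp_w) in gpus.items():
    print(f"{name}: {fp16_tflops / tdp_w:.2f} TFLOPS/W")
```

On these numbers the MI300X delivers roughly 2.8x the FP16 efficiency of the MI100.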