AMD Instinct MI200 to Enhance Data Center AI and HPC Performance

November 9, 2021 | By Reyjon Oregas

Like other avenues of technology, data centers are evolving. AI and HPC are now at the forefront of data center operations, and advanced hardware is needed to harness the capabilities and maximize the performance of these new technologies.

AMD, one of the leaders in the data center industry, recently unveiled its newest offering, the AMD Instinct MI200, which delivers ground-breaking performance in both HPC (High Performance Computing) and AI (Artificial Intelligence).

AMD Instinct MI200: Exascale-Class Accelerator

AMD Instinct MI200

The new AMD Instinct MI200 series accelerators are the first exascale-class GPU accelerators. Built on the CDNA 2 architecture and equipped with 2nd Gen Matrix Cores, the series delivers 4X the peak theoretical FP64 performance of its predecessors. With its industry-first multi-die GPU design, dubbed 2.5D Elevated Fanout Bridge (EFB), the Instinct MI200 delivers 1.8X more cores and 2.7X higher memory bandwidth than previous AMD GPU accelerators. To push performance further, the AMD Instinct MI200 series has eight Infinity Fabric links that let it connect to optimized 3rd Generation EPYC processors and to other GPUs in the same node, delivering unified CPU and GPU memory coherency to maximize system throughput.
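In practice, software discovers this multi-GPU node topology through the ROCm/HIP runtime. As a rough, hypothetical sketch (not code from AMD's materials), the following HIP C++ program enumerates the accelerators in a node and checks which pairs can access each other's memory directly; on MI200-class systems those peer links are carried over Infinity Fabric:

```cpp
// Hypothetical sketch: enumerate the GPUs in a node and check peer access.
// Build with ROCm's compiler, e.g.: hipcc device_query.cpp -o device_query
#include <hip/hip_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    hipGetDeviceCount(&count);                        // number of visible GPU devices
    printf("Found %d GPU device(s)\n", count);

    for (int i = 0; i < count; ++i) {
        hipDeviceProp_t prop;
        hipGetDeviceProperties(&prop, i);             // name, memory size, compute units, ...
        printf("Device %d: %s, %zu GB, %d compute units\n",
               i, prop.name, prop.totalGlobalMem >> 30, prop.multiProcessorCount);
    }

    // Check which device pairs can address each other's memory directly.
    for (int i = 0; i < count; ++i)
        for (int j = 0; j < count; ++j)
            if (i != j) {
                int canAccess = 0;
                hipDeviceCanAccessPeer(&canAccess, i, j);
                printf("Device %d -> %d peer access: %s\n", i, j, canAccess ? "yes" : "no");
            }
    return 0;
}
```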

Exascale with AMD

AMD Instinct MI200 partnered with 3rd Generation EPYC CPU

“The Frontier supercomputer is the culmination of a strong collaboration between AMD, HPE and the U.S. Department of Energy, to provide an exascale-capable system that pushes the boundaries of scientific discovery by dramatically enhancing performance of artificial intelligence (AI), analytics, and simulation at scale.”

Thomas Zacharia, director, Oak Ridge National Laboratory

Powered by 3rd Generation AMD EPYC CPUs and the AMD Instinct MI250X, the Frontier supercomputer is expected to deliver more than 1.5 exaflops of peak computing power. Together, the 3rd Generation EPYC CPUs and Instinct MI250X accelerators will dramatically enhance AI, analytics, and simulation performance at scale, helping scientists pack in more data calculations and identify new patterns in their data, accelerating the pace of scientific discovery.

Enabling Exascale Science through Software

AMD Instinct MI200 on AMD ROCm 5.0

To further enhance the capability of exascale systems, AMD also announced the availability of its open-source platform, AMD ROCm, which allows researchers to tap into the power of AMD Instinct accelerators and drive scientific discoveries. Built on the foundation of portability, the ROCm platform is able to support environments across multiple accelerator vendors and architectures.
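To give a sense of what programming against ROCm looks like in practice, here is a minimal, hypothetical HIP vector-add sketch (it is not taken from AMD's announcement). The same HIP C++ source compiles with ROCm's hipcc for AMD GPUs and can also be retargeted to other vendors' hardware through HIP's portability layer, which is the portability claim the platform is built around:

```cpp
// Minimal HIP vector-add sketch (illustrative only).
// Build with: hipcc vector_add.cpp -o vector_add
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

__global__ void vector_add(const float* a, const float* b, float* c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    // Allocate device buffers and copy the inputs to the GPU.
    float *da, *db, *dc;
    hipMalloc((void**)&da, n * sizeof(float));
    hipMalloc((void**)&db, n * sizeof(float));
    hipMalloc((void**)&dc, n * sizeof(float));
    hipMemcpy(da, a.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(db, b.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // Launch one thread per element (hipcc supports the triple-chevron syntax).
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(da, db, dc, n);

    // Copy the result back; hipMemcpy on the default stream synchronizes first.
    hipMemcpy(c.data(), dc, n * sizeof(float), hipMemcpyDeviceToHost);
    printf("c[0] = %f (expected 3.0)\n", c[0]);

    hipFree(da); hipFree(db); hipFree(dc);
    return 0;
}
```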

And with ROCm 5.0, AMD extends its open platform powering top HPC and AI applications with AMD Instinct MI200 series accelerators, increasing the accessibility of ROCm for developers and delivering leadership performance across key workloads. Through the AMD Infinity Hub, researchers, data scientists, and end users can easily find, download, and install containerized HPC apps and ML frameworks that are optimized and supported on AMD Instinct and ROCm.

The hub currently offers a range of containers supporting either Radeon Instinct™ MI50, AMD Instinct™ MI100, or AMD Instinct MI200 accelerators, including applications such as Chroma, CP2K, LAMMPS, NAMD, OpenMM, and more, along with the popular ML frameworks TensorFlow and PyTorch. New containers are continually being added to the hub.

Available Server Solutions

The AMD Instinct MI250X and AMD Instinct MI250 are available in the open-hardware compute accelerator module, or OCP Accelerator Module (OAM), form factor. The AMD Instinct MI210 will be available in a PCIe® card form factor in OEM servers.

The AMD MI250X accelerator is currently available from HPE in the HPE Cray EX Supercomputer, and additional AMD Instinct MI200 series accelerators are expected in systems from major OEM and ODM partners in enterprise markets in Q1 2022, including ASUS, ATOS, Dell Technologies, Gigabyte, Hewlett Packard Enterprise (HPE), Lenovo and Supermicro.

AMD Instinct MI200 Series Specifications

| Model | Compute Units | Stream Processors | FP64 / FP32 Vector (Peak) | FP64 / FP32 Matrix (Peak) | FP16 / bf16 (Peak) | INT4 / INT8 (Peak) | HBM2e ECC Memory | Memory Bandwidth | Form Factor |
|---|---|---|---|---|---|---|---|---|---|
| AMD Instinct MI250X | 220 | 14,080 | Up to 47.9 TF | Up to 95.7 TF | Up to 383.0 TF | Up to 383.0 TOPS | 128 GB | 3.2 TB/sec | OCP Accelerator Module |
| AMD Instinct MI250 | 208 | 13,312 | Up to 45.3 TF | Up to 90.5 TF | Up to 362.1 TF | Up to 362.1 TOPS | 128 GB | 3.2 TB/sec | OCP Accelerator Module |
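As a rough sanity check on these figures (the clock speed below is our assumption, not a number stated in this announcement): taking a peak engine clock of about 1.7 GHz, the MI250X's 14,080 stream processors × 2 FLOPs per clock (one fused multiply-add) × 1.7 GHz ≈ 47.9 TFLOPS, matching the FP64 / FP32 vector entry above; the matrix peak is double that, and the FP16 / bf16 peak is 8X the vector rate. The same arithmetic with 13,312 stream processors gives the MI250's 45.3 TF figure.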