News
With the “Antares” Instinct MI300 series, the third time is the charm for AMD’s datacenter GPU business, and the company is gearing up for a significant GPU presence, which it finally ...
In comparison, NVIDIA's upcoming H200 AI GPU has 141GB of HBM3e memory with up to 4.8TB/sec of memory bandwidth, so AMD's new Instinct MI300X is definitely sitting very pretty against the H100 ...
AMD will have 50% more HBM capacity than its current Instinct MI250X accelerator, which packs up to 128GB of HBM2e memory in total. The new Instinct MI300X accelerator bumps that up ...
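As a quick sanity check of that capacity claim, the arithmetic can be sketched as below; note the 192GB result is derived from the snippet's "50% more than 128GB," not stated in the truncated text itself.

```python
# Back-of-the-envelope check of the HBM capacity bump described above.
mi250x_hbm_gb = 128                    # Instinct MI250X total HBM, per the text
mi300x_hbm_gb = mi250x_hbm_gb * 1.5    # "50% more HBM capacity" (derived figure)

print(f"MI300X HBM capacity: {mi300x_hbm_gb:.0f}GB")  # → MI300X HBM capacity: 192GB
```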
From a performance point of view, AMD's flagship Instinct MI400-series AI GPU (we will refer to it as Instinct MI400X, though this is not the official name, and we will also call the CDNA Next ...
In Q3, AMD's data center revenue grew by 122% year-over-year, driven by the strong ramp of Instinct GPU and EPYC CPU shipments. I think AMD's data center business will continue to grow by 45% in FY25.
AMD also said that the MI300A, an APU accelerator for HPC and AI, has begun sampling to customers. It has 128GB of HBM3 memory, 24 Zen 4 CPU cores, and more than 146 billion transistors.
AMD Instinct MI325X AI Accelerator Ships This Quarter To Battle NVIDIA's H200 If you recall, AMD revealed the Instinct MI325X, the follow-on product to the highly successful MI300X, earlier this year.