AMD Unveils New AI Products and Solutions


AMD launched its latest high-performance computing solutions, including 5th Gen EPYC server CPUs, Instinct MI325X accelerators, Pensando Salina DPUs, Pensando Pollara 400 NICs, and Ryzen AI PRO 300 series processors for enterprise AI PCs. AMD and its partners also showcased how they are deploying AI solutions at scale, highlighted the continued ecosystem growth of the ROCm open-source AI software stack, and presented a broad portfolio of new solutions based on Instinct accelerators, EPYC CPUs, and Ryzen PRO CPUs.

“The data center and AI represent significant growth opportunities for AMD, and we are building strong momentum for our EPYC and Instinct processors across a growing set of customers,” said AMD Chair and CEO Lisa Su. “With our new EPYC CPUs, Instinct GPUs, and Pensando DPUs, we are delivering leadership computing to power our customers’ most important and demanding workloads. Looking ahead, we see the data center AI accelerator market growing to $500 billion by 2028. We are committed to delivering open innovation at scale through our expanded silicon, software, network, and cluster-level solutions.”

At its Advancing AI 2024 event, AMD said it continues to invest in the open AI ecosystem and to expand the ROCm open-source software stack with new features, tools, optimizations, and support, helping developers extract maximum performance from Instinct accelerators and delivering out-of-the-box support for today’s leading AI models. Leaders from Essential AI, Fireworks AI, Luma AI, and Reka AI discussed how they are optimizing models across AMD hardware and software.

The chip giant also hosted a developer event joined by technical leaders from across the AI developer ecosystem, including Microsoft, OpenAI, Meta, Cohere, xAI, and more. Presentations by the creators of popular AI programming languages, models, and frameworks central to the ongoing AI transformation, such as Triton, TensorFlow, vLLM and PagedAttention, FastChat, and others, covered how developers unlock AI performance optimizations through vendor-agnostic programming languages, accelerate models on Instinct accelerators, and benefit from the ease of porting to ROCm software and from an open-source approach.
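
To give a sense of what “vendor-agnostic” means in practice, here is a minimal, illustrative Triton kernel (a simple vector addition, not taken from any of the presentations). Triton code like this is written once in Python and can be compiled for AMD Instinct GPUs through the ROCm backend as well as for other vendors’ hardware; the tensor sizes, names, and block size below are arbitrary choices for the sketch.

```python
# Illustrative sketch only: a vector-add kernel in Triton, which targets
# AMD GPUs via ROCm or NVIDIA GPUs via CUDA from the same source code.
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    pid = tl.program_id(axis=0)                            # one program per block of elements
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)  # element indices handled by this program
    mask = offsets < n_elements                            # guard against the tail of the array
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = x.numel()
    # Launch enough programs to cover all elements at the chosen block size.
    grid = lambda meta: (triton.cdiv(n, meta["BLOCK_SIZE"]),)
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out

if __name__ == "__main__":
    # Note: PyTorch's ROCm builds expose AMD GPUs under the "cuda" device name.
    x = torch.rand(4096, device="cuda")
    y = torch.rand(4096, device="cuda")
    assert torch.allclose(add(x, y), x + y)
```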