Summary Bullets:
• AMD’s “Advancing AI 2025” event, held in San Jose, California (US) in June 2025, gave analysts a deeper look at the company’s strategy for the next few years.
• The chip designer aims to build a fully open ecosystem and stack, supported by a string of acquisitions, including Silo AI and Brium.
AMD has maintained its annual roadmap cadence since launching the AMD Instinct MI300 GPUs in late 2023. The launch of the AMD Instinct MI350 series, with a fourfold performance increase over the previous generation, was a highlight of the conference. As AI agents proliferate, compute requirements will grow, driving exponential demand for infrastructure. AMD also focused on its software roadmap and highlighted the importance of an open ecosystem, something the company has invested in through acquisitions.
The chip designer announced the launch of the AMD Instinct MI350 series GPUs, the fourth generation within the AMD Instinct family, and the forthcoming rack servers based on these chips, slated for availability in late 2025. The company will also unveil the AMD Instinct MI400 processors in 2026, which will run on AMD’s Helios rack, positioned against Nvidia’s Vera Rubin platform.
AI is moving beyond the data center to intelligent devices at the edge and PCs. AMD expects to see AI deployed in every single device, running on different architectures. From a portfolio standpoint, the company offers a suite of computing elements spanning GPUs, DPUs, CPUs, NICs, FPGAs, and adaptive SoCs. Its strategy is based on delivering a broad portfolio of compute engines so customers can match the right compute to the right use case, and on investing in an open, developer-first ecosystem that supports every major framework, library, and model. The chip designer believes that an open ecosystem is central to the future of AI and claims to be the only company committed to openness across hardware, software, and solutions.
Openness is more than a buzzword: it will be critical to scaling AI adoption over the coming years. AMD has invested heavily, both organically and through acquisitions, to promote its open software stack; in the last year, it made 25 strategic investments in this area, including the Finnish company Silo AI and, more recently, Brium. Other acquisitions across the AI value chain include ZT Systems, Pensando, Lamini, Enosemi, and Xilinx. However, inorganic growth always carries risks that the company will need to actively manage.
However powerful AMD’s hardware may be, a common industry criticism is that its software cannot match Nvidia’s CUDA platform. AMD has pinpointed software as a key AI enabler and therefore a crucial focus, one that is shaping its M&A plans. The ROCm 7 software stack is designed to broaden AI model coverage by accelerating the pace of updates and to foster a developer-first mentality, with integration with open-source frameworks top of mind. This extends the reach of AMD’s hardware and makes it easier to scale.
The company highlighted that demand for inference compute will soon equal that for model training, although training will remain the foundation for developing AI systems. As AI takes on complex tasks such as reasoning, driving demand for more compute, inference will soon account for the majority of the market. AMD is positioning inference as a crucial differentiator, emphasizing “tokens-per-dollar” as a key metric.
Looking ahead, the chip designer sees further opportunity among customers that have underinvested in the refresh cycle of the last couple of years. However, with the industry still relatively immature in AI, it is difficult to predict how successful the agentic AI experiment will be. Many enterprises remain in the proof-of-concept (PoC) phase with many projects still in their infancy, making it difficult to gauge the real size of this market opportunity. For a deeper analysis of the event, please read GlobalData’s report Advancing AI 2025: AMD Announces MI350 GPUs and Targets the Inference Opportunity, June 30, 2025.