Advanced Micro Devices (AMD) is expanding its data center portfolio with the MI300 series accelerators, designed for generative AI workloads. The MI300X features 192 GB of HBM3 memory, surpassing the 188 GB of NVIDIA's H100 NVL and allowing it to run large language models of up to 80 billion parameters on a single accelerator. In Q1 2025, AMD's data center revenue reached $3.674 billion, up 57.2% year over year and accounting for 49.4% of total revenue, with second-quarter revenue expected at $3.31 billion, a 16.7% annual rise.
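To see why 192 GB matters for a model of that size, a rough back-of-the-envelope memory estimate helps. The sketch below is illustrative, not AMD's published sizing; the 16-bit precision assumption is ours.

```python
# Rough check: memory needed just for the weights of an 80B-parameter model.
# Assumption: 16-bit (FP16/BF16) weights; KV cache and activations need extra room.
params = 80e9          # 80 billion parameters
bytes_per_param = 2    # 2 bytes per parameter at 16-bit precision
weights_gb = params * bytes_per_param / 1e9
print(f"Weights alone: ~{weights_gb:.0f} GB")  # ~160 GB, within 192 GB of HBM3
```

At 16-bit precision the weights alone occupy roughly 160 GB, leaving about 32 GB of the MI300X's 192 GB for the KV cache and activations, which is why a model of that size can fit on a single accelerator rather than being split across GPUs.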
In June 2025, Meta Platforms announced that it is deploying AMD's MI300X for AI inference and signaled interest in the upcoming MI350 Series and in collaborating on AMD's AI roadmap. Competition remains fierce, however: Intel continues to push its own AI chips, and NVIDIA's data center revenue climbed to $39.1 billion in its fiscal first quarter of 2026, a 73.3% year-over-year increase.
AMD's share price has risen 28.8% year to date, significantly outperforming the broader tech sector, while its forward 12-month Price/Sales ratio stands at 7.29X versus the industry average of 3.92X. Current estimates put Q2 2025 earnings at 47 cents per share, a 31.88% year-over-year decline, with the 2025 consensus at $3.82 per share, indicating 15.41% growth.