AMD's A.I. Chips Want to Take on Nvidia
My semiconductor review of AMD's efforts to catch Nvidia, the latest acquisition, and some deeper analysis.
Hey Everyone,
We are going to cover the State of A.I. Report tomorrow, but today I wanted to think more deeply about AMD.
To understand the Generative A.I. wave of 2023, it's helpful to understand Nvidia's dominance in A.I. chips and its early investment in A.I. circa 2017, roughly six years earlier. In 2023, Nvidia has also become an active funder of Generative A.I. startups.
AI Chip Market Set to 10x in just 7 Years
Driven by adoption of artificial intelligence (AI) across multiple industries, the global AI chip market generated nearly $29 billion in 2022, according to a report from Next Move Strategy Consulting. The market is expected to generate nearly $305 billion by the end of the decade, boasting a compound annual growth rate (CAGR) of 29% from 2023 to 2030.
The stronger Nvidia has grown in this area, the harder it has become for companies attempting to build competing chips. I am very interested in AMD's attempt to catch up with Nvidia on LLM training. Back in July 2023, MosaicML, which was later acquired by Databricks for $1.3 billion, published some interesting benchmarks for training LLMs on the AMD MI250 GPU and found it to be roughly 80% as fast as an NVIDIA A100.
In addition to developing its own AI chips under the Athena project, Microsoft, it emerged in May 2023, is also helping to fund AMD's expansion into AI chips. About two weeks ago, machine learning startup Lamini revealed that its large language model (LLM) refining platform was running "exclusively" on The House of Zen's silicon. Lamini claims (via The Register) that its platform, which has attracted interest from Amazon, Walmart, eBay, GitLab, and Adobe, to name a few, has been running on "more than 100 AMD GPUs in production all year" and could be scaled up to "thousands of MI GPUs."