Published: 16 August 2023
Summary
High-bandwidth memory (HBM) bit demand will grow eightfold, from 123 million GB in 2022 to 972 million GB in 2027, driven by the need for high-performance memory in general AI and generative AI models. By 2027, HBM bit consumption will grow to 1.6% of total DRAM consumption, up from 0.5% in 2022.
Included in Full Research
Overview
Forecast Assumptions
By 2027, 30% of data center workload accelerator chips that integrate high-bandwidth memory (HBM) will be used for training AI models, down from 65% in 2022.
By 2027, HBM prices will decline by 40% compared to 2022.
By 2027, HBM stack density will increase from 16GB in 2022 to 48GB, driven by technology innovation and vendor prioritization.
Market Impacts
HBM revenue will grow from $1.1 billion in 2022 to $5.2 billion in 2027.
Between 2022 and 2027, HBM bit demand will grow eightfold, compared with fivefold growth in revenue (see the consistency check after this list).
By 2027, bit demand for HBM will increase to 972 million GB.
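As a rough illustration only (not part of the published research), the headline figures above can be cross-checked against one another: dividing forecast revenue by forecast bit demand implies a decline of roughly 40% in price per GB between 2022 and 2027, which lines up with the pricing assumption listed earlier. A minimal Python sketch of that arithmetic, using only the figures quoted in this summary:

# Back-of-the-envelope check on the forecast figures quoted above.
# Inputs come from this summary; the "implied" outputs are derived here
# for illustration and are not part of the original research.
bits_2022_gb = 123e6      # HBM bit demand, 2022 (123 million GB)
bits_2027_gb = 972e6      # HBM bit demand, 2027 (972 million GB)
rev_2022_usd = 1.1e9      # HBM revenue, 2022 ($1.1 billion)
rev_2027_usd = 5.2e9      # HBM revenue, 2027 ($5.2 billion)

bit_growth = bits_2027_gb / bits_2022_gb   # ~7.9x ("eightfold")
rev_growth = rev_2027_usd / rev_2022_usd   # ~4.7x ("fivefold")

# Implied average price per GB each year and the resulting decline,
# which should roughly match the ~40% price-decline assumption.
price_2022 = rev_2022_usd / bits_2022_gb
price_2027 = rev_2027_usd / bits_2027_gb
price_decline = 1 - price_2027 / price_2022

print(f"Bit growth: {bit_growth:.1f}x, revenue growth: {rev_growth:.1f}x")
print(f"Implied price/GB: ${price_2022:.2f} (2022) -> ${price_2027:.2f} (2027), "
      f"a {price_decline:.0%} decline")

Running this prints a bit-growth multiple of about 7.9x, a revenue multiple of about 4.7x, and an implied price-per-GB decline of about 40%, which is why revenue grows fivefold while bits grow eightfold.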