NVIDIA To Ship Millions Of Blackwell GPUs, Propelling TSMC CoWoS & HBM DRAM Demand To New Levels

Muhammad Zuhair

NVIDIA's Blackwell AI GPUs are expected to lift every other associated segment, including CoWoS (TSMC) & HBM DRAM, as the market expects millions of chips to be shipped by 2025.

HBM & Chip Packaging Industries To Witness Tremendous Growth Over The Next Year Thanks To NVIDIA's Blackwell AI GPUs, Huge Volumes To Ship In 2025

TrendForce reports that NVIDIA's latest Blackwell GPUs for AI are poised to be the industry's next "holy grail," as the performance they bring to market has attracted the attention of several major clients. This includes the GB200 Superchip, which is projected to account for 40% to 50% of NVIDIA's Blackwell supply going into 2025. Hence, millions of Blackwell GPUs are expected to be produced, replicating the success of NVIDIA's Hopper lineup.

Image Source: TrendForce

This significant increase in demand means that supply-chain firms associated with NVIDIA are set for a tremendous year; hence, companies like TSMC and others will have to scale up their existing facilities.

The supply chain has high expectations for the GB200, with projections suggesting that its shipments could exceed millions of units by 2025, potentially making up nearly 40% to 50% of NVIDIA's high-end GPU market.

– TrendForce

It is reported that TSMC's total CoWoS (Chip-on-Wafer-on-Substrate) capacity is expected to reach up to 40,000 units by the end of 2024, a whopping 150% YoY increase, thanks to the gigantic demand faced by the Taiwanese giant. CoWoS technology also plays a crucial role in other AI products, which means the packaging segment as a whole will see a remarkable rise.

Apart from the CoWoS market, NVIDIA's Blackwell GPUs are expected to elevate the HBM segment to new heights as well, especially with the pending generational switch from HBM3 to HBM3e DRAM, which has yet to occur in mainstream products from NVIDIA and its competitors.

Image Source: NVIDIA

Moreover, with the debut of NVIDIA's Blackwell AI GPUs such as the GB200, B200, and B100, the adoption rate of HBM3e will increase significantly, not to mention the capacity upgrades, with onboard memory expected to reach 192 GB and 288 GB by the end of 2024.

The upcoming AI markets will differ significantly from the existing ones, and not just because of the market hype: the revenue stream will be much larger this time, as generative AI and AGI have seen massive adoption recently, propelling both the computing and client segments.
