Word on the street? Samsung's sitting down with Nvidia to hash out HBM4 pricing. Here's the kicker—their next-generation high-bandwidth memory might land around the mid-$500s per unit, matching what SK Hynix is charging.
That's a hefty jump. We're talking more than 50% above SK Hynix's current HBM3e, which runs in the mid-$300 range. Why does this matter? HBM chips power AI accelerators and high-performance computing rigs—the backbone of data centers training large language models and running complex simulations.
If pricing climbs this aggressively, expect ripple effects across AI infrastructure costs. Could this squeeze margins for cloud providers or push hardware refresh cycles further out? Worth watching how Micron positions itself in this shifting landscape.
RektRecorder
· 13h ago
Damn, HBM4 costs over $500 each? That price increase is insane. Cloud providers are probably going to complain about being broke again.
AltcoinMarathoner
· 13h ago
ngl, another 50% jump on HBM4? that's the wall at mile 20 right there. everyone's sprinting early, but the real marathon runners know—these infrastructure costs eventually get priced in, adoption keeps grinding forward regardless. zoom out on the macro perspective, this is just ecosystem friction.
FloorPriceNightmare
· 13h ago
Here comes another round of fleecing retail investors. HBM4 prices have doubled, cloud service providers must be crying their eyes out in the restroom.