Micron Unveils HBM3E and SOCAMM Memory Solutions with NVIDIA
At GTC 2025, Micron Technology announced its latest memory solutions, HBM3E and SOCAMM, developed in collaboration with NVIDIA. In a press release, Micron said it is the first memory company to ship both HBM3E and SOCAMM products for AI servers in data centers.
The SOCAMM solution, based on LPDDR5X technology, is designed to support NVIDIA's GB300 Grace Blackwell Ultra Superchip, offering enhanced data processing, performance, and power efficiency. The HBM3E 12H 36GB and 8H 24GB memory modules are integrated into NVIDIA's HGX B300 and B200 platforms, respectively, underscoring Micron's role in accelerating AI workloads.
Micron's SOCAMM is noted for its compact size, high bandwidth, and low power consumption, making it ideal for AI servers and data-intensive applications. The company also highlighted its comprehensive AI memory and storage portfolio, including high-capacity DDR5 RDIMMs and NVMe SSDs, aimed at supporting AI from the data center to the edge.