Tech Blog — Dec 17, 2025
New LPDDR Memory Module Empowering Next-generation AI Infrastructure
In line with the industry's shift toward low-power, high-bandwidth memory for AI infrastructure, Samsung has developed SOCAMM2 (Small Outline Compression Attached Memory Module) — an LPDDR-based server memory module designed for AI data centers, with customer samples already being supplied. By combining the strengths of LPDDR technology with a modular, detachable design, SOCAMM2 delivers higher bandwidth, improved power efficiency, and flexible system integration, enabling AI servers to achieve greater efficiency and scalability.
Why SOCAMM2 matters beyond conventional memory
Based on Samsung's latest LPDDR5X DRAM, SOCAMM2 expands the scope of data-center memory by combining the strengths of LPDDR and modular architectures.
While DDR-based server modules such as the RDIMM (Registered Dual Inline Memory Module) continue to serve as the backbone of high-capacity, general-purpose servers, SOCAMM2 offers a complementary alternative optimized for next-generation AI-accelerated servers that demand both high responsiveness and power efficiency. It delivers more than twice the bandwidth of a traditional RDIMM while consuming over 55% less power, and it sustains stable, high-throughput performance under intensive AI workloads. This makes SOCAMM2 an ideal solution for energy-efficient, performance-driven AI servers.
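As a rough illustration of what those headline figures imply, the sketch below computes per-module bandwidth from pin data rate and bus width, then compares power draw. All of the numbers (the DDR5-6400 RDIMM baseline, the LPDDR5X data rate, the bus widths, and the wattages) are illustrative assumptions chosen to match the article's "more than twice the bandwidth, over 55% less power" claim — they are not published SOCAMM2 specifications.

```python
# Back-of-the-envelope comparison of module bandwidth and power.
# All figures below are illustrative assumptions, not official specs.

def module_bandwidth_gbps(data_rate_mtps: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: transfers/s x bits per transfer / 8 bits per byte."""
    return data_rate_mtps * bus_width_bits / 8 / 1000

# Assumed baseline: a DDR5-6400 RDIMM with a 64-bit data bus.
rdimm_bw = module_bandwidth_gbps(6400, 64)       # 51.2 GB/s
# Assumed SOCAMM2-style module: LPDDR5X at 8533 MT/s on a 128-bit bus.
socamm2_bw = module_bandwidth_gbps(8533, 128)    # ~136.5 GB/s

# Assumed per-module power draw (illustrative only).
rdimm_w, socamm2_w = 10.0, 4.5

print(f"Bandwidth: {socamm2_bw:.1f} vs {rdimm_bw:.1f} GB/s "
      f"({socamm2_bw / rdimm_bw:.1f}x)")
print(f"Power: {socamm2_w:.1f} vs {rdimm_w:.1f} W "
      f"({(1 - socamm2_w / rdimm_w) * 100:.0f}% less)")
```

Under these assumed figures the ratio works out to roughly 2.7x the bandwidth at 55% lower power, consistent with the comparison described above.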
By inheriting the low-power characteristics of LPDDR technology and combining them with the scalability of a modular form factor, SOCAMM2 gives designers greater flexibility across diverse AI system configurations, improving system versatility for next-generation AI infrastructure.
User benefits of SOCAMM2
SOCAMM2's architectural innovations enable customers to operate AI servers with greater efficiency, flexibility, and reliability.
Its detachable design streamlines system maintenance and lifecycle management. Unlike traditional soldered LPDDR solutions, SOCAMM2 enables easy memory upgrades or replacements without any mainboard modification, helping system administrators minimize downtime and dramatically reduce the total cost of ownership (TCO).
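To make the TCO argument concrete, here is a minimal sketch of the serviceability difference: with soldered LPDDR, a memory fault typically means replacing the whole board, whereas a detachable module can be swapped on its own. Every cost and downtime figure below is a hypothetical placeholder chosen only to show the arithmetic, not real repair data.

```python
# Hypothetical repair-cost comparison: soldered LPDDR vs. detachable module.
# All costs and durations are illustrative placeholders, not real data.

def repair_cost(parts_usd: float, downtime_hours: float,
                downtime_cost_per_hour_usd: float) -> float:
    """Total cost of one memory failure: parts plus lost service time."""
    return parts_usd + downtime_hours * downtime_cost_per_hour_usd

DOWNTIME_RATE = 500.0  # assumed cost of one server-hour of downtime (USD)

# Soldered LPDDR: the mainboard is replaced along with its memory.
soldered = repair_cost(parts_usd=3000.0, downtime_hours=8.0,
                       downtime_cost_per_hour_usd=DOWNTIME_RATE)
# Detachable SOCAMM2-style module: only the failed module is swapped.
modular = repair_cost(parts_usd=400.0, downtime_hours=1.0,
                      downtime_cost_per_hour_usd=DOWNTIME_RATE)

print(f"Soldered repair: ${soldered:,.0f}, modular repair: ${modular:,.0f}")
print(f"Saving per incident: ${soldered - modular:,.0f}")
```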
In addition, SOCAMM2’s enhanced power efficiency makes heat management easier and more effective in AI server deployments. This helps data centers maintain thermal stability and reduce cooling requirements — a critical factor for high-density AI environments.
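The thermal knock-on effect can also be sketched with simple arithmetic: nearly every watt a memory module draws becomes heat the cooling system must remove, so per-module savings compound across a dense rack. The module counts and wattages below are illustrative assumptions, reusing the figures from the earlier sketch.

```python
# Rack-level estimate of memory heat that cooling must remove.
# Module counts and wattages are illustrative assumptions.

modules_per_server = 8
servers_per_rack = 16

# Assumed per-module draw, reusing the earlier illustrative figures.
rdimm_w, socamm2_w = 10.0, 4.5

def rack_memory_heat_w(per_module_w: float) -> float:
    """Heat load (W) from memory alone, assuming power draw ~ dissipated heat."""
    return per_module_w * modules_per_server * servers_per_rack

saved = rack_memory_heat_w(rdimm_w) - rack_memory_heat_w(socamm2_w)
print(f"Memory heat per rack: {saved:.0f} W less to cool")
# 8 * 16 * (10.0 - 4.5) = 704 W less heat per rack in this toy example.
```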
Lastly, the transition from RDIMM’s vertical layout to SOCAMM2’s horizontal orientation further improves system-level space utilization. It enables more flexible heat-sink placement and airflow design, allowing smoother integration with CPUs and accelerators, while remaining compatible with both air and liquid cooling systems.
Close collaboration with NVIDIA and JEDEC standardization
Samsung is expanding its collaboration across the AI ecosystem to accelerate adoption of LPDDR-based server solutions. In particular, the company is working closely with NVIDIA to optimize SOCAMM2 for NVIDIA accelerated infrastructure through ongoing technical cooperation — ensuring it delivers the responsiveness and efficiency required for next-generation inference platforms. This partnership is underscored by NVIDIA’s remarks:
“As AI workloads shift from training to rapid inference for complex reasoning and physical AI applications, next-generation data centers demand memory solutions that deliver both high performance and exceptional power efficiency,” said Dion Harris, senior director, HPC and AI Infrastructure Solutions, NVIDIA. “Our ongoing technical cooperation with Samsung is focused on optimizing memory solutions like SOCAMM2 to deliver the high responsiveness and efficiency essential for AI infrastructure.”
As SOCAMM2 gains traction as a low-power, high-bandwidth solution for next-generation AI systems, the industry has initiated formal standardization efforts for LPDDR-based server modules. Samsung has been contributing to this work alongside key partners, helping to shape consistent design guidelines and enable smoother integration across future AI platforms.
Through continued alignment with the broader AI ecosystem, Samsung is helping to guide the shift toward low-power, high-bandwidth memory for next-generation AI infrastructure. SOCAMM2 represents a major milestone for the industry — bringing LPDDR technology into mainstream server environments and powering the transition to the emerging superchip era. By combining LPDDR with a modular architecture, it provides a practical path toward more compact and power-efficient AI systems.
As AI workloads continue to grow in scale and complexity, Samsung will further advance its LPDDR-based server memory portfolio, reinforcing its commitment to enabling the next generation of AI data centers.