Micron introduces a 256GB SOCAMM2 LPDDR5X memory module designed for AI servers, enabling server memory configurations of up to 2TB while reducing power consumption.
Micron launches a 256GB SOCAMM2 memory module using 64 32Gb LPDDR5X dies — and yes, hyperscalers can shove 8 in an AI server to reach 2TB capacity: mere mortals need not apply
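The capacity figures in the headline can be sanity-checked with simple arithmetic — a minimal sketch, assuming the die count and density stated in the article (64 dies at 32Gb each per module, 8 modules per server):

```python
# Capacity math behind the headline, using figures from the article.
dies_per_module = 64
die_density_gbit = 32                                # 32 Gb (gigabits) per die

module_gb = dies_per_module * die_density_gbit / 8   # convert bits to bytes
modules_per_server = 8
server_tb = module_gb * modules_per_server / 1024    # GB to TB

print(f"{module_gb:.0f} GB per module")              # 256 GB per module
print(f"{server_tb:.0f} TB per server")              # 2 TB per server
```

Note the units: the 32Gb per-die density is gigabits, so dividing the total by 8 yields the 256GB (gigabyte) module capacity.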