The path to high-capacity RDIMMs for servers has primarily been through 3D stacking (3DS) of DRAM dice using Through-Silicon Vias (TSVs). However, this approach presents significant packaging challenges that drive up cost, and it is also relatively inefficient in terms of energy consumption. Demand for large-capacity RDIMMs is being driven primarily by the sudden emergence of large language models (LLMs) for generative AI and by increasing CPU core counts, both of which require significant amounts of DRAM to keep pace with performance requirements. With this in mind, Micron is introducing 128 GB DDR5 RDIMMs capable of operating at up to 8000 MT/s today, with mass production slated for 2024.
Micron has recently started fabricating 32 Gb monolithic DDR5 dice using its proven and mature 1β technology. The new dice offer a 45%+ increase in bit density and are capable of reaching up to 8000 MT/s while also operating at tighter timing latencies than the standard JEDEC specifications. The company claims energy efficiency improvements of as much as 24% compared to the competition's 3DS TSV offerings, and that the faster operation can also reduce AI training times. Avoiding 3DS TSV allows Micron to better optimize the data input buffers and critical I/O circuits, while also reducing pin capacitance on the data lines. Both contribute to the lower power consumption and improved speeds.
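As a rough sanity check (not from Micron's materials), the sketch below shows how 32 Gb monolithic dice add up to a 128 GB RDIMM. The module organization assumed here, a dual-rank layout with x4 devices and a 64-bit usable data width, is a common DDR5 RDIMM configuration used purely for illustration, not a confirmed Micron design.

```python
# Back-of-the-envelope capacity math for a 128 GB DDR5 RDIMM built from
# 32 Gb monolithic dice. The organization below (dual-rank, x4 devices,
# 64-bit usable data width, ECC devices excluded) is an assumption for
# illustration, not a disclosed Micron layout.

DIE_DENSITY_GBIT = 32          # 1-beta monolithic die density (Gb)
DATA_WIDTH_BITS = 64           # usable data width of a DDR5 DIMM (excl. ECC)
DEVICE_WIDTH_BITS = 4          # assumed x4 DRAM devices
RANKS = 2                      # assumed dual-rank module

devices_per_rank = DATA_WIDTH_BITS // DEVICE_WIDTH_BITS   # 16 data devices
total_data_devices = devices_per_rank * RANKS             # 32 devices
capacity_gbit = total_data_devices * DIE_DENSITY_GBIT     # 1024 Gb
capacity_gbyte = capacity_gbit // 8                       # 128 GB

print(f"{total_data_devices} x {DIE_DENSITY_GBIT} Gb dice "
      f"-> {capacity_gbyte} GB usable capacity")
```

With this assumed layout, a single 32 Gb die per device position is enough to hit 128 GB without any die stacking, which is the point of moving to the higher-density monolithic die.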
Micron has been doubling its monolithic die density every three years or so, thanks to advancements in its CMOS process as well as improvements in array efficiency. The company sees a clear path to 48 Gb and 64 Gb monolithic dice with continued technological progress. Micron also claims that its 1β node reached mass production ahead of the competition, and that it has had the fastest yield maturity in the company's history. Dual-die packages and tall form-factor (TFF) modules using 1β DRAM are expected to enable 1 TB modules in the near future.
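To put that cadence in perspective, a small sketch of when the 48 Gb and 64 Gb dice mentioned above would land if the roughly three-year doubling trend held smoothly. The 2023/32 Gb starting point comes from the announcement; treating the cadence as a continuous exponential and the resulting years are illustrative assumptions, not a Micron roadmap commitment.

```python
import math

# Projected arrival of higher-density monolithic dice under a smooth
# three-year density-doubling cadence. Starting point (32 Gb in 2023) is
# from the announcement; everything else is an illustrative assumption.

base_year, base_density_gbit = 2023, 32
doubling_period_years = 3

for target_gbit in (48, 64):
    years_needed = doubling_period_years * math.log2(target_gbit / base_density_gbit)
    print(f"{target_gbit} Gb die: roughly {base_year + years_needed:.1f}")
```

Under these assumptions a 64 Gb monolithic die would be about one doubling period out, which is consistent with the company describing both 48 Gb and 64 Gb as near-term steps rather than distant goals.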
Along with the announcement of the 128 GB RDIMMs using 1β technology, the company also laid out its roadmap for upcoming products. HBM and GDDR7 are expected to dominate bandwidth-hungry applications, while RDIMMs, MCRDIMMs, and CXL solutions are in the pipeline for systems requiring massive capacity. LPDDR5X and LPCAMM2 solutions going up to 192 GB are expected to make an appearance in power-sensitive systems as early as 2026.
from AnandTech https://ift.tt/LCmVfqj