HBM, or High Bandwidth Memory, may be familiar to some readers as the memory technology used in AMD's Radeon Vega graphics cards a number of years ago. In short, the advantages of HBM include its compact size, lower power consumption, and high data transfer rate.
Recently, Micron introduced its latest memory offering, HBM3 Gen2, the company's first 8-high stacked memory chip with a capacity of 24GB, developed for now specifically for use in the artificial intelligence sector.
HBM memory since HBM2 has used 8-high stacks, but this is the first time Micron has delivered a capacity as high as 24GB in that form, with a data transfer speed of 1.2TB/s over a 1024-bit memory interface. 24GB HBM3 itself is not new, but until now it has usually been achieved with 12-high stacked memory chips.
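As a rough sanity check of the quoted figures, the per-pin data rate implied by a 1.2TB/s transfer speed over a 1024-bit interface can be worked out directly (a sketch only; the 1.2TB/s figure is a rounded headline number, so the result is approximate):

```python
# Illustrative arithmetic from the figures quoted in the article
interface_width_bits = 1024        # HBM3 Gen2 interface width
bandwidth_gb_per_s = 1.2 * 1000    # 1.2 TB/s expressed in GB/s

# Per-pin data rate in Gb/s: total bits per second divided across the pins
pin_rate_gbps = bandwidth_gb_per_s * 8 / interface_width_bits
print(f"~{pin_rate_gbps:.1f} Gb/s per pin")
```

This lands at roughly 9.4 Gb/s per pin, which is in the range expected for this generation of HBM.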
In addition, Micron says HBM3 Gen2 is 2.5 times more power-efficient than its previous HBM3 offering, allowing data center operators to reduce power supply costs and making it easier for them to scale up data center processing.
In the meantime, Micron also said it is developing a 12-high version of HBM3 Gen2 with a capacity as high as 36GB which, according to the company, would allow a server machine with eight memory slots to carry as much as 288GB of HBM3 Gen2 memory.
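The 288GB figure follows directly from the numbers above (the eight-slot server configuration is the one stated by Micron, not a general requirement):

```python
# Quick arithmetic behind the 288GB total quoted in the article
stack_capacity_gb = 36   # 12-high HBM3 Gen2 stack
slots_per_server = 8     # eight memory slots, per Micron's example server
total_gb = stack_capacity_gb * slots_per_server
print(total_gb)  # 288
```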
Micron says it has begun sampling this HBM3 Gen2 memory to a number of customers for pre-testing, and confirmed that large-scale production of the memory will begin next month.