CRAM Memory Technology Could Dramatically Reduce AI Electricity Consumption



Many reports have recently highlighted how much electricity artificial intelligence systems consume, both to train on data and to process prompts and user queries.


Recently, a group of researchers from the University of Minnesota Twin Cities demonstrated a memory technology called Computational Random-Access Memory (CRAM) that can reduce electricity consumption by up to 1,000 times compared with existing fast-memory approaches.


In a nutshell, one reason AI data processing consumes so much electricity is that the AI system must constantly move data from memory or storage to the logic components (CPU, GPU) for processing, and then write the results back to those memory or storage components.


With CRAM technology, in brief, a computing layer is built directly into the memory cells themselves, allowing the memory component to process data in place without first transferring it to a separate processor.
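To make the contrast concrete, here is a minimal conceptual sketch in Python. It is not the CRAM hardware design; the class names, the `transfers` counter, and the choice of a bitwise AND operation are all hypothetical illustrations of the idea that in-memory computing avoids shuttling data between memory and a processor.

```python
# Conceptual sketch only: contrasts the conventional pattern, where data
# travels memory -> processor -> memory for every operation, with an
# in-memory (CRAM-style) pattern, where logic happens inside the array.
# Class names and the transfer counter are illustrative, not real hardware.

class ConventionalSystem:
    """Every operation moves operands to the processor and results back."""
    def __init__(self, memory):
        self.memory = list(memory)
        self.transfers = 0  # count of memory <-> processor data moves

    def bitwise_and(self, i, j, dest):
        a = self.memory[i]           # read: memory -> processor
        b = self.memory[j]           # read: memory -> processor
        self.transfers += 2
        result = a & b               # compute inside the processor
        self.memory[dest] = result   # write: processor -> memory
        self.transfers += 1


class InMemorySystem:
    """Logic is evaluated inside the memory array itself; nothing moves."""
    def __init__(self, memory):
        self.memory = list(memory)
        self.transfers = 0  # no data ever leaves the array

    def bitwise_and(self, i, j, dest):
        # The cells themselves perform the logic in place.
        self.memory[dest] = self.memory[i] & self.memory[j]


conv = ConventionalSystem([1, 1, 0, 0])
cram = InMemorySystem([1, 1, 0, 0])
conv.bitwise_and(0, 1, 3)
cram.bitwise_and(0, 1, 3)
print(conv.transfers, cram.transfers)  # 3 0
print(conv.memory[3], cram.memory[3])  # 1 1
```

Both systems compute the same result, but the conventional one incurs three data transfers per operation; in real hardware, it is exactly this movement that dominates the energy cost the researchers aim to eliminate.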


One example of the electricity savings is the MNIST (Modified National Institute of Standards and Technology) handwritten-digit recognition test, in which CRAM showed a 1,000-fold reduction in energy use compared with an existing AI computing setup in their laboratory.


The university's researchers have reportedly begun filing patents for this in-memory AI processing technology and plan to work with major memory manufacturers to speed up its commercialization.


AI technology is now regarded as very powerful for generating content quickly and accurately, but its electricity consumption remains a serious concern. If this technology can be deployed soon, electricity use could fall significantly, freeing that power for other purposes.
