The concept of processing in memory, or PIM, is not new, but we did not expect to see it as a variant of the GDDR6 used in graphics cards. SK Hynix has just presented its GDDR6-AIM, which integrates a processor within its VRAM memory chips. Will we see it in our gaming GPUs?
In-memory processing stems from the idea that it is often better to execute certain instructions, or entire algorithms, close to the memory that stores the data rather than on the processor. The reason is the high latency between the CPU or GPU and memory. Although GPUs have mechanisms to hide that latency, there are workloads, such as artificial intelligence, in which latency and bandwidth are just as decisive for performance, as the sketch below illustrates.
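To make the data-movement argument concrete, here is a minimal sketch of the idea. The array, the two functions and the byte counting are purely our own toy model, not anything SK Hynix has published: the point is simply that a reduction done next to the memory only sends the result over the bus, while the conventional path ships every element to the processor.

    # Toy model of why processing-in-memory (PIM) helps: the arithmetic is the
    # same, but far less data has to cross the memory bus.
    # This is an illustration only, not SK Hynix's actual design.
    import numpy as np

    data = np.random.rand(1_000_000).astype(np.float32)  # data sitting in "VRAM"

    def conventional_sum(memory: np.ndarray) -> tuple[float, int]:
        """The processor reads every element over the bus, then reduces it."""
        bytes_moved = memory.nbytes              # the whole 4 MB crosses the bus
        return float(memory.sum()), bytes_moved

    def pim_sum(memory: np.ndarray) -> tuple[float, int]:
        """A compute unit next to the memory reduces locally; only the result travels."""
        result = float(memory.sum())             # done "inside" the memory chip
        bytes_moved = np.float32(result).nbytes  # only 4 bytes cross the bus
        return result, bytes_moved

    if __name__ == "__main__":
        _, conv_bytes = conventional_sum(data)
        _, pim_bytes = pim_sum(data)
        print(f"conventional: {conv_bytes:,} bytes over the bus")
        print(f"PIM-style   : {pim_bytes:,} bytes over the bus")

In this toy case the conventional path moves a million times more data than the PIM path for the same result, which is exactly the asymmetry that latency- and bandwidth-bound workloads such as AI inference care about.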
A few months ago Samsung presented its HBM-PIM, which integrated a deep learning accelerator inside the memory; now it is the turn of its GDDR6 counterpart, developed by SK Hynix and named GDDR6-AIM.
This is the SK Hynix GDDR6-AIM
The South Korean multinational has previewed its GDDR6-AIM, which it will show at ISSCC 2022, held from February 20 to 24. It is a 16 Gbps GDDR6 memory with a processor inside the chip that operates on data without it ever leaving the memory. The advantage is that certain types of operations become up to 16 times faster than when they are carried out with traditional RAM.
The AIM name comes from Accelerator in Memory: SK Hynix has added an AI processor inside each memory chip. For now we do not know its capacity or compute power, but considering that one of the disadvantages of GDDR6 compared to HBM2 is latency, it is an interesting development to say the least. Furthermore, they have managed to trim the supply voltage from 1.35 V to 1.25 V with respect to the conventional variant of the memory, which puts consumption at roughly 80% of that of the VRAM used in current graphics cards.
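As a rough sanity check (our own back-of-the-envelope estimate, not a figure from SK Hynix), CMOS dynamic power scales approximately with the square of the supply voltage, so the voltage drop alone does not quite explain the whole saving; the rest would have to come from the reduced data movement that in-memory processing allows:

    % Dynamic power in CMOS scales roughly as P ∝ C V^2 f.
    % Holding capacitance and frequency constant, the voltage cut alone gives:
    \[
    \frac{P_{1.25\,\mathrm{V}}}{P_{1.35\,\mathrm{V}}}
      \approx \left(\frac{1.25}{1.35}\right)^{2} \approx 0.86
    \]
    % i.e. about 86% of the original dynamic power, so reaching ~80% overall
    % would also rely on shuffling less data across the memory bus.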
Not for your gaming graphics card
At the moment, the market for high-performance computing, big data and artificial intelligence accelerators is dominated by NVIDIA and AMD cards that use HBM2 memory. The reason? It is the type of memory with the lowest latency of all, as well as a much lower consumption per transmitted bit than GDDR6, which has been “relegated” to professional and gaming graphics cards.
You will not see GDDR6-AIM in future NVIDIA and AMD graphics cards such as the RTX 40 and RX 7000 series, and we would go as far as to say that neither Hopper nor the Instinct MI300 will use this type of memory, nor whatever Intel ends up launching on the market. Still, every memory needs a processor that takes advantage of it; otherwise its development makes no sense.
The GDDR6-AIM will be used in the SAPEON X220 cards created by sister company SK Telecom, so for now we are talking about a design for internal use, although SK Hynix intends to license this type of PIM to third parties. Will it be adopted by other companies, or will it remain an in-house design? For the moment the big graphics card manufacturers do not seem interested in this memory, but who knows.