High-bandwidth memory, or HBM, is already fast. But Samsung wants to make it even faster. The South Korea-based technology giant has announced its HBM-PIM architecture, which it says can double the performance of high-bandwidth memory by building artificial intelligence processing directly into the memory.

PIM, which stands for processing-in-memory, puts A.I. processing capability inside the memory itself to speed up data-heavy workloads, and Samsung hopes that its HBM-PIM tech will be used in applications such as data centers and high-performance computing (HPC) machines in the future.

“Our groundbreaking HBM-PIM is the industry’s first programmable PIM solution tailored for diverse A.I.-driven workloads such as HPC, training, and inference,” Kwangil Park, Samsung Electronics senior vice president of memory product planning, said in a statement. “We plan to build upon this breakthrough by further collaborating with A.I. solution providers for even more advanced PIM-powered applications.”

A potential client for Samsung’s HBM-PIM is Argonne National Laboratory, which hopes to use the technology to solve “problems of interest.” The lab said HBM-PIM addresses the memory bandwidth and performance challenges of HPC and A.I. computing by delivering significant performance and power gains.

According to Samsung, the HBM-PIM works by placing a DRAM-optimized A.I. engine inside each memory bank (a storage subunit), enabling parallel processing and minimizing data movement.
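To make the idea concrete, here is a rough, hypothetical sketch in Python. It is not Samsung’s programming model; it simply mimics the pattern of giving each memory bank its own small compute engine so that only reduced results, rather than raw data, travel back to the host processor.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative only: each "bank" holds a slice of the data plus a tiny local
# engine that reduces its own slice, so the host receives a handful of partial
# results instead of the raw data itself.
BANKS = 8
data = list(range(1_000_000))
slices = [data[i::BANKS] for i in range(BANKS)]  # one slice per simulated bank

def in_bank_engine(bank_slice):
    # Stand-in for the DRAM-side engine: a local reduction over one bank's slice.
    return sum(x * x for x in bank_slice)

with ThreadPoolExecutor(max_workers=BANKS) as pool:
    partial_results = list(pool.map(in_bank_engine, slices))

total = sum(partial_results)  # the host only combines BANKS small values
print(total)
```

In the real hardware, the per-bank engines operate on data that already sits in DRAM, so the saving in data movement is far greater than this software analogy can show.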

“When applied to Samsung’s existing HBM2 Aquabolt solution, the new architecture is able to deliver over twice the system performance while reducing energy consumption by more than 70%,” the company stated. “The HBM-PIM also does not require any hardware or software changes, allowing faster integration into existing systems.”

This is different from existing solutions, which are based on the von Neumann architecture. In that design, a separate processor and separate memory units carry out data-processing tasks sequentially, with data constantly traveling back and forth between them, which often creates a bottleneck when handling large volumes of data.
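For contrast, here is a minimal sketch of that sequential pattern, again purely illustrative rather than a model of any specific system: every element is shipped to one central processor before anything can be computed.

```python
# Conventional von Neumann-style pattern: all raw data crosses the memory bus
# to a single central processor, which works through it sequentially. At large
# data volumes, that round trip becomes the bottleneck.
data = list(range(1_000_000))

def central_processor(all_data):
    result = 0
    for x in all_data:  # every element travels from memory to the processor
        result += x * x
    return result

print(central_processor(data))
```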

By removing that bottleneck, Samsung’s HBM-PIM could be a useful tool in a data scientist’s arsenal. Samsung says the HBM-PIM is now being tested inside A.I. accelerators by leading A.I. solution partners, with validation expected to be completed in the first half of 2021.
