Looking at raw performance and power requirements alone, HBM (High Bandwidth Memory) is the superior choice over GDDR for graphics RAM. However, Nvidia and AMD still do not offer HBM on their mainstream gaming GPUs, reserving it for high-end compute cards used mostly in AI applications and HPC environments. This is mainly because GDDR is more than 50% cheaper to implement at current market prices. Samsung, one of the major HBM producers, could help lower the price of “regular” HBM chips now that it is preparing to launch the more advanced HBM-PIM (processing-in-memory) architecture.
Each new HBM-PIM chip integrates an AI processing component that essentially doubles the performance of regular HBM2 Aquabolt solutions while lowering energy consumption by more than 70%. The on-chip AI processor can be programmed and tailored for diverse AI-driven workloads including training and inference. Kwangil Park, Samsung’s senior vice president of Memory Product Planning, stated that the company is willing to collaborate with AI solution providers for the development of even more advanced PIM-powered applications.
In its press release, Samsung explains how the on-chip AI processors can help double the performance. “Most of today’s computing systems are based on the von Neumann architecture, which uses separate processor and memory units to carry out millions of intricate data processing tasks. This sequential processing approach requires data to constantly move back and forth, resulting in a system-slowing bottleneck especially when handling ever-increasing volumes of data. Instead, the HBM-PIM brings processing power directly to where the data is stored by placing a DRAM-optimized AI engine inside each memory bank — a storage sub-unit — enabling parallel processing and minimizing data movement.”
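To illustrate the principle Samsung describes (this is not its actual programming model), the toy Python sketch below contrasts a von Neumann-style loop, where every value is shuttled to one central processor and back, with a PIM-style loop, where each memory bank applies the same simple operation to its own data in place. All names here (MemoryBank, von_neumann_process, pim_process) are hypothetical and exist only for illustration.

```python
# Conceptual illustration of processing-in-memory (PIM) vs. the classic
# von Neumann data path. All class and function names are hypothetical;
# this is not Samsung's HBM-PIM programming interface.

class MemoryBank:
    """A storage sub-unit; in the PIM case it also holds a tiny compute engine."""
    def __init__(self, data):
        self.data = data

    def compute_in_place(self, op):
        # PIM: the operation runs where the data lives; nothing crosses the memory bus.
        self.data = [op(x) for x in self.data]


def von_neumann_process(banks, op):
    # Classic path: every value is moved to the central processor and back,
    # the "system-slowing bottleneck" the press release refers to.
    for bank in banks:
        fetched = list(bank.data)            # data moves memory -> processor
        results = [op(x) for x in fetched]   # processed in one central place
        bank.data = results                  # data moves processor -> memory


def pim_process(banks, op):
    # PIM path: each bank's local engine works on its own slice
    # (in hardware these banks would run in parallel; simulated sequentially here),
    # so almost no data crosses the memory bus.
    for bank in banks:
        bank.compute_in_place(op)


if __name__ == "__main__":
    relu = lambda x: max(0, x)  # a simple inference-style operation
    banks = [MemoryBank([-2, -1, 0, 1, 2]) for _ in range(4)]
    pim_process(banks, relu)
    print([bank.data for bank in banks])
```

The sketch only models the data-movement difference; in real HBM-PIM, the per-bank AI engines execute concurrently in hardware, which is where the claimed performance and energy gains come from.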
Another big plus for the HBM-PIM architecture is that it does not require hardware or software changes, so there are no additional integration or deployment costs for existing systems. Samsung states that the HBM-PIM chips are currently being tested inside AI accelerators by leading AI solution partners, with all validations expected to be completed within the first half of 2021.