
Baidu announces a new processor for AI

Baidu Kunlun processor (Source: GlobeNewswire)
Baidu's new Kunlun processor is advertised as "China's first cloud-to-edge AI chip" and will be used in both cloud and edge scenarios, including data centers, autonomous vehicles, and public clouds. The chip is built on Samsung's 14 nm process and delivers 512 GB/s of memory bandwidth and 260 TOPS of compute performance at a power consumption of 100 W.

The AI processor market is quickly moving from a small niche to a global scale, and the announcement of the Kunlun chip by Chinese search giant Baidu is a clear sign that the coming years will be very interesting for this industry, as long as things don't turn out as badly as they did in the Matrix or Terminator series, of course.

Baidu advertises Kunlun as "China’s first cloud-to-edge AI chip, built to accommodate high performance requirements of a wide variety of AI scenarios." The two chips mentioned in the official press release are the 818-300 training chip and the 818-100 inference chip. Kunlun was designed for cloud and edge scenarios such as public clouds, autonomous vehicles, and data centers.

Baidu began developing a field-programmable gate array (FPGA)-based AI accelerator back in 2011, when it also started using GPUs in its data centers. The new Kunlun solution is made up of thousands of small cores and is about 30 times faster than the original FPGA-based accelerator. The Kunlun chips are built on Samsung's 14 nm process and offer a memory bandwidth of 512 GB/s and a computational power of 260 TOPS, all at a power consumption of 100 W.
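For a rough sense of scale, those figures work out to about 2.6 TOPS per watt. The short Python sketch below is our own back-of-the-envelope arithmetic based on the numbers in the press release, not anything published by Baidu:

    # Back-of-the-envelope estimate from the figures Baidu cites
    compute_tops = 260     # peak compute, tera-operations per second
    power_watts = 100      # stated power consumption
    bandwidth_gb_s = 512   # memory bandwidth, GB/s

    print(f"{compute_tops / power_watts:.1f} TOPS/W")                      # 2.6 TOPS per watt
    print(f"{compute_tops * 1e12 / (bandwidth_gb_s * 1e9):.0f} ops/byte")  # ~508 peak operations per byte of bandwidth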

Baidu's new chips will not end up in anyone's smartphone for a while, mainly due to the power consumption mentioned above. However, intelligent vehicles and various industrial and business applications should begin taking advantage of these solutions in the coming months.

Codrut Nistor, 2018-07-05 (Update: 2018-07-05)