Processors

Baidu completes AI Kunlun chip development with up to 260 TOPS

Baidu is entering the crowded space of AI (artificial intelligence) accelerators. The company announced that it has completed development of its Kunlun chip, which offers up to 260 TOPS at 150 W. The chip will enter production early next year on Samsung's 14nm process and uses 2.5D packaging to integrate HBM memory.

Baidu Kunlun offers up to 260 TOPS at 150 W for AI workloads

Kunlun is based on Baidu's XPU architecture for neural network processors. The chip is capable of 260 TOPS at 150 W and offers 512 GB/s of memory bandwidth. Baidu says the chip is intended for cloud computing and scientific workloads; the company did not provide specifications for an edge variant, which would typically have a much lower TDP.
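For context, those two figures work out to roughly 260 / 150 ≈ 1.7 TOPS per watt. That is a derived number rather than one Baidu quotes, and TOPS ratings are only loosely comparable across vendors since they depend on the numeric precision being measured.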

Baidu claims that Kunlun delivers 3 times the inference performance of conventional (unspecified) FPGA/GPU systems on a natural language processing model, and says it also supports a wide variety of other AI workloads, although the company does not say whether the chip is also capable of, or intended for, training.

Samsung will manufacture the chip for Baidu. It is slated for production early next year on Samsung's 14nm process, with I-Cube interposer-based 2.5D packaging used to integrate 32GB of HBM2 memory.

For Samsung, the chip helps the company expand its foundry business into data center applications, according to Ryan Lee, vice president of foundry marketing at Samsung Electronics.

With Kunlun, Baidu joins a number of companies offering data center AI inference chips, a list that includes Google's TPUs, Qualcomm's Cloud AI 100, Nvidia's T4, and Intel's Nervana NNP-I along with its recently acquired Habana Goya. On the edge, companies such as Huawei, Intel, Nvidia, Apple, Qualcomm, and Samsung have discrete or integrated neural network accelerators.

Source: Tom's Hardware
