Smells Like AI Spirit: Baidu will help develop Intel’s Nervana neural processor



Intel announced at Baidu’s Create conference this week that the Chinese tech giant will help develop Intel’s Nervana Neural Network Processor.

Speaking on stage at the conference in Beijing, Intel corporate vice president Naveen Rao said:

“The next few years will see an explosion in the complexity of AI models and the need for massive deep learning compute at scale. Intel and Baidu are focusing their decade-long collaboration on building radical new hardware, codesigned with enabling software, that will evolve with this new reality – something we call ‘AI 2.0.’”

Intel’s Neural Network Processor for Training, codenamed NNP-T 1000, is built for fast training of deep learning models. It places 32GB of high-bandwidth memory (HBM) and local SRAM close to where computation happens, so more model parameters can be kept on-die, which saves significant power while improving performance.
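To put that 32GB figure in context, here is a rough back-of-envelope sketch in Python of how much of a model’s weights such on-package memory could hold. The model sizes and numeric precisions below are illustrative assumptions, not figures from Intel or Baidu.

```python
# Back-of-envelope sketch: how much parameter storage fits in 32GB of
# on-package memory. Model sizes and precisions are illustrative
# assumptions, not Intel or Baidu figures.

HBM_BYTES = 32 * 1024**3  # 32GB of on-package HBM

models = {
    "ResNet-50 (~25M params)": 25_000_000,
    "BERT-Large (~340M params)": 340_000_000,
    "1B-parameter model": 1_000_000_000,
}

for name, params in models.items():
    for label, bytes_per_param in (("FP32", 4), ("FP16/bfloat16", 2)):
        weight_bytes = params * bytes_per_param
        share = weight_bytes / HBM_BYTES * 100
        print(f"{name}, {label}: {weight_bytes / 1024**3:.2f} GB "
              f"({share:.1f}% of 32GB HBM)")
```

Even a billion-parameter model’s weights occupy only a few gigabytes at reduced precision, which is why keeping them on-package rather than fetching them from off-chip memory can cut power and latency during training.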

The NNP-T 1000 is set to ship later this year alongside the Neural Network Processor for Inference (NNP-I 1000). As the name suggests, the NNP-I 1000 is designed for AI inference and features general-purpose processor cores based on Intel’s Ice Lake architecture.

Baidu and Intel have a history of collaborating on AI. Intel has helped optimise Baidu’s PaddlePaddle deep learning framework for its Xeon Scalable processors since 2016. More recently, the two companies developed the BIE-AI-Box, a hardware kit for frame-by-frame analysis of footage captured by cockpit cameras.
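For readers who have not used PaddlePaddle, the sketch below shows a minimal training loop written against the current PaddlePaddle 2.x Python API, with the device pinned to the CPU. It is purely illustrative: the toy model and random data are assumptions, and it does not reflect the Xeon-specific optimisations Intel contributed.

```python
# Minimal PaddlePaddle sketch (2.x API), pinned to CPU.
# Illustrative only: toy model and random data, no Xeon-specific tuning.
import paddle
import paddle.nn as nn
import paddle.nn.functional as F

paddle.set_device("cpu")  # run on the CPU, e.g. a Xeon Scalable part

# A toy two-layer regression model.
model = nn.Sequential(
    nn.Linear(64, 128),
    nn.ReLU(),
    nn.Linear(128, 1),
)
optimizer = paddle.optimizer.SGD(learning_rate=0.01,
                                 parameters=model.parameters())

# Random stand-in data: 256 samples with 64 features each.
x = paddle.randn([256, 64])
y = paddle.randn([256, 1])

for step in range(100):
    pred = model(x)
    loss = F.mse_loss(pred, y)
    loss.backward()
    optimizer.step()
    optimizer.clear_grad()

print("final loss:", loss.item())
```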

Intel sees AI as a major driver of its future growth. The company’s AI chips generated $1 billion in revenue last year, and Intel expects growth of around 30 percent annually, up to $10 billion by 2022.

