ai chip Archives - AI News
https://www.artificialintelligence-news.com/tag/ai-chip/

OpenAI considers in-house chip manufacturing amid global shortage
AI News | 6 October 2023
https://www.artificialintelligence-news.com/2023/10/06/openai-considers-in-house-chip-manufacturing-amid-global-shortage/

OpenAI, the company behind the renowned ChatGPT, is reportedly delving into the prospect of manufacturing processing chips in-house amidst a worldwide shortage of these in-demand components.

Sources familiar with the matter disclosed to Reuters that OpenAI is actively exploring options, including evaluating an undisclosed company for potential acquisition to bolster its AI chip-making ambitions.

The shortage of chips, a fundamental component in AI technology, has prompted OpenAI to consider various strategies. These options include internal chip production, forging closer ties with its primary chip supplier NVIDIA, and diversifying its chip providers.

Earlier this year, OpenAI CEO Sam Altman voiced concerns that the chip scarcity was delaying the company’s projects.

In a since-deleted blog post by Humanloop CEO Raza Habib, the AI expert wrote about his experience sitting down with Altman:

“A common theme that came up throughout the discussion was that currently OpenAI is extremely GPU-limited and this is delaying a lot of their short-term plans. The biggest customer complaint was about the reliability and speed of the API.

Sam acknowledged their concern and explained that most of the issue was a result of GPU shortages. The longer 32k context can’t yet be rolled out to more people. OpenAI haven’t overcome the O(n^2) scaling of attention and so whilst it seemed plausible they would have 100k – 1M token context windows soon (this year) anything bigger would require a research breakthrough.

The finetuning API is also currently bottlenecked by GPU availability. They don’t yet use efficient finetuning methods like Adapters or LoRA and so finetuning is very compute-intensive to run and manage.

Better support for finetuning will come in the future. They may even host a marketplace of community contributed models. Dedicated capacity offering is limited by GPU availability.”
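Habib’s reference to the O(n^2) scaling of attention reflects that standard self-attention compares every token with every other token, so compute grows quadratically with context length. A rough sketch of why bigger windows get expensive fast (illustrative arithmetic only, not OpenAI’s actual costs):

```python
def attention_score_count(context_len: int) -> int:
    """Pairwise token comparisons in standard (dense) self-attention."""
    return context_len * context_len

# Doubling the context from 32k to 64k tokens quadruples the work:
assert attention_score_count(64_000) / attention_score_count(32_000) == 4.0

# A 1M-token window costs ~977x more score computations than a 32k one,
# which is why anything bigger is said to need a research breakthrough.
ratio = attention_score_count(1_000_000) / attention_score_count(32_000)
print(round(ratio))  # prints 977
```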

If OpenAI proceeds with its plan to manufacture its own chips, it will join the ranks of industry giants like Google and Amazon who have already transitioned to in-house chip production. This move could potentially alleviate OpenAI’s dependency on external suppliers, empowering the company to meet the escalating demand for specialised AI chips.

Since the public launch of ChatGPT in November last year, the demand for specialised AI chips has skyrocketed—causing a surge in NVIDIA’s share prices as companies rush to procure the desirable hardware.

OpenAI has not made a final decision regarding the acquisition or in-house chip production, and discussions are ongoing to address the pressing chip shortage and sustain the company’s AI initiatives.

(Photo by Andrew Neel on Unsplash)

Want to learn more about AI and big data from industry leaders? Check out AI & Big Data Expo taking place in Amsterdam, California, and London. The comprehensive event is co-located with Digital Transformation Week.

Explore other upcoming enterprise technology events and webinars powered by TechForge here.

Omdia: AI chip startups to have a tough year
AI News | 21 February 2023
https://www.artificialintelligence-news.com/2023/02/21/omdia-ai-chip-startups-to-have-tough-year/

Analysts from Omdia expect AI chip startups to have a difficult year.

Omdia’s Top AI Hardware Startups Market Radar finds that over 100 venture capitalists invested over $6 billion into the top 25 AI chip startups since 2018. However, it seems the good times weren’t to last.

The global chip shortage is becoming an inventory crisis. Meanwhile, the economic downturn and tighter monetary policy have made it harder to raise funding.

“The best-funded AI chip startups are under pressure to deliver the kind of software support developers are used to from the market leader, NVIDIA,” says Alexander Harrowell, Principal Analyst for Advanced Computing at Omdia.

“This is the key barrier to getting new AI chip technology into the market.”

Omdia predicts that at least one major startup will exit the market this year, likely through a sale to a major chipmaker or a hyperscale cloud provider.

“The most likely exit route is probably via trade sales to major vendors,” adds Harrowell.

“Apple has $23 billion in cash on its balance sheet and Amazon $35 billion, while Intel, NVIDIA, and AMD have some $10 billion between them. The hyperscalers have been very keen to adopt custom AI silicon and they can afford to maintain the skills involved.”

Over half of the $6 billion invested in AI chip startups has gone to large-die, coarse-grained reconfigurable array (CGRA) accelerators designed to load entire AI models on-chip. That approach is now being questioned due to the continuing growth of AI models.

“In 2018 and 2019, the idea of bringing the entire model into on-chip memory made sense, as this approach offers extremely low latency and answers the input/output problems of large AI models,” explains Harrowell.

“However, the models have continued to grow dramatically ever since, making scalability a critical issue. More structured and internally complex models mean AI processors must offer more general-purpose programmability. As such, the future of AI processors may lie in a different direction.”
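Harrowell’s scalability concern can be made concrete with a back-of-envelope check. The 40 GiB on-chip memory budget below is an assumed, illustrative figure (not an Omdia number); the model sizes are the published parameter counts of GPT-2 and GPT-3:

```python
def fits_on_chip(num_params: int, bytes_per_param: int, sram_bytes: int) -> bool:
    """Do a model's weights fit entirely in on-chip memory?"""
    return num_params * bytes_per_param <= sram_bytes

SRAM_BUDGET = 40 * 2**30  # assumed 40 GiB of on-chip SRAM, for illustration

# A 2019-era model (GPT-2: 1.5B params, 2 bytes each at FP16) fits easily...
assert fits_on_chip(1_500_000_000, 2, SRAM_BUDGET)

# ...but a GPT-3-scale model (175B params) does not, even at FP16.
assert not fits_on_chip(175_000_000_000, 2, SRAM_BUDGET)
```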

(Photo by Fabrizio Conti on Unsplash)

Tencent Cloud unveils three world-class AI chips
AI News | 8 November 2021
https://www.artificialintelligence-news.com/2021/11/08/tencent-cloud-unveils-three-world-class-ai-chips/

Tencent Cloud claims to have developed three world-class AI chips that substantially outperform rivals, although details at this point are scarce.

The third largest cloud services company in China, following Alibaba and Huawei, Tencent recently revealed the three chips at its 2021 Digital Ecology Conference.

Current information on the three chips can be summarised as follows:

  • Zixiao – an “AI reasoning” chip that supposedly offers 100 percent better performance than rival products. It combines image and video processing with natural language processing, search recommendations, and other features
  • Xuangling – a SmartNIC or Data Processing Unit (DPU) that runs virtualisation of storage and networking for a cloud host’s CPU so that it doesn’t have to. Tencent claims this comes with zero cost to the host CPU and that it performs four times faster than similar industry products
  • Canghai – a video transcoding chip that supposedly delivers a 30 percent improved compression rate over other on-market products. It achieves this through multi-core expansion architecture, a high-performance coding pipeline, and a hierarchical memory layout

Whilst these claimed improvements are substantial, the figures have not yet been independently verified.

Their development comes on the back of Tencent establishing a chip research and development lab in Penglai in 2020. Its goal of achieving full end-to-end coverage of Tencent’s chip design and verification appears to have been realised with the announcement.

Tang Daosheng, senior executive VP of Tencent, said at the conference: “Facing strong business needs, Tencent developed a long-term chip research and development investment plan. Currently, it has already implemented three directions with substantial progress.”

Tencent currently operates outside of Asia in the USA, Brazil, Germany, and Russia, with keen plans to expand further into Europe, the Americas, and Africa.

Huawei unveils high-end AI chip for servers alongside MindSpore framework
AI News | 23 August 2019
https://www.artificialintelligence-news.com/2019/08/23/huawei-ai-chip-mindspore-framework/

Huawei has unveiled a high-end artificial intelligence chip for servers along with an AI computing framework called MindSpore.

The Huawei Ascend 910 is the “world’s most powerful AI processor,” according to a press release on Friday. The chip’s specs were first announced during last year’s Huawei Connect event in Shanghai.

Eric Xu, Rotating Chairman of Huawei, said:

“We have been making steady progress since we announced our AI strategy in October last year. Everything is moving forward according to plan, from R&D to product launch.

We promised a full-stack, all-scenario AI portfolio and today we delivered, with the release of Ascend 910 and MindSpore. This also marks a new stage in Huawei’s AI strategy.”

Huawei claims the final version of the Ascend 910 not only performs as promised, but it does so with much lower power consumption.

For half-precision floating point (FP16) operations, Ascend 910 delivers 256 TeraFLOPS performance. For integer precision calculations (INT8), it delivers 512 TeraOPS.

Huawei initially expected the Ascend 910’s max power consumption to be 350W but the company has managed to deliver the promised performance with a max consumption of just 310W.
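Taking Huawei’s stated figures at face value, the INT8 number is exactly double the FP16 one (as expected when 8-bit operations pack two per 16-bit lane), and the implied efficiency works out as follows (simple arithmetic on the published numbers, not an official Huawei metric):

```python
fp16_tflops = 256  # stated FP16 throughput, teraFLOPS
int8_tops = 512    # stated INT8 throughput, teraOPS
power_w = 310      # stated max power consumption, watts

assert int8_tops / fp16_tflops == 2.0  # INT8 is exactly 2x FP16

fp16_flops_per_watt = fp16_tflops * 1e12 / power_w
print(f"{fp16_flops_per_watt / 1e9:.0f} GFLOPS/W at FP16")  # prints "826 GFLOPS/W at FP16"
```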

“Ascend 910 performs much better than we expected,” said Xu. “Without a doubt, it has more computing power than any other AI processor in the world.”

Alongside the Ascend 910, Huawei has launched an AI computing framework called MindSpore.

Last year, Huawei announced three goals for MindSpore:

  • Easy development: Reduce training time and costs.
  • Efficient execution: Use the least amount of resources with the highest possible OPS/W.
  • Adaptable to all scenarios: Including device, edge, and cloud applications.

Huawei claims that MindSpore requires 20 percent fewer lines of code than other leading frameworks when used for a typical neural network for natural language processing.

“MindSpore will go open source in the first quarter of 2020,” said Xu. “We want to drive broader AI adoption and help developers do what they do best.”

The Chinese tech behemoth continues to expand its presence despite battling a US trade ban. The US has been pressuring its allies to ban Huawei over concerns it poses a national security threat.

While security must always be prioritised, few can dispute the innovation which Huawei brings across its business. Today’s announcements show the kind of innovations which US companies may miss out on if a deal cannot be reached, putting them at a disadvantage to Chinese rivals.

Intel unwraps its first chip for AI and calls it Spring Hill
AI News | 21 August 2019
https://www.artificialintelligence-news.com/2019/08/21/intel-ai-powered-chip-spring-hill/

Intel has unwrapped its first processor that is designed for artificial intelligence and is planned for use in data centres.

The new Nervana Neural Network Processor for Inference (NNP-I) processor has a more approachable codename of Spring Hill.

Spring Hill is a modified 10nm Ice Lake processor which sits on a PCB and slots into an M.2 port typically used for storage.

According to Intel, the use of a modified Ice Lake processor allows Spring Hill to handle large workloads and consume minimal power. Two compute cores and the graphics engine have been removed from the standard Ice Lake design to accommodate 12 Inference Compute Engines (ICE).

In a summary, Intel detailed the six main benefits it expects from Spring Hill:

  1. Best in class perf/power efficiency for major data inference workloads.
  2. Scalable performance at wide power range.
  3. High degree of programmability w/o compromising perf/power efficiency.
  4. Data centre at scale.
  5. Spring Hill solution – Silicon and SW stack – sampling with definitional partners/customers on multiple real-life topologies.
  6. Next two generations in planning/design.

Intel’s first chip for AI comes after the company invested in several Israeli artificial intelligence startups, including Habana Labs and NeuroBlade. The investments formed part of Intel’s ‘AI Everywhere’ strategy, which aims to increase the firm’s presence in the market.

Naveen Rao, Intel vice president and general manager, Artificial Intelligence Products Group, said:

“To get to a future state of ‘AI everywhere,’ we’ll need to address the crush of data being generated and ensure enterprises are empowered to make efficient use of their data, processing it where it’s collected when it makes sense and making smarter use of their upstream resources.

Data centers and the cloud need to have access to performant and scalable general purpose computing and specialized acceleration for complex AI applications. In this future vision of AI everywhere, a holistic approach is needed—from hardware to software to applications.”

Facebook has said it will be using Intel’s new Spring Hill processor. Intel already has two more generations of the NNP-I in development.

Nvidia CEO is ‘happy to help’ if Tesla’s AI chip ambitions fail
AI News | 17 August 2018
https://www.artificialintelligence-news.com/2018/08/17/nvidia-ceo-help-tesla-ai-chip/

Nvidia CEO Jensen Huang has teased that his company is ‘happy to help’ if Tesla fails in its goal of launching a rival AI chip.

Tesla currently uses Nvidia’s silicon for its vehicles. The company’s CEO, Elon Musk, said earlier this month that he’s a “big fan” of Nvidia but that an in-house AI chip would be able to outperform those of the leading processor manufacturer.

During a conference call on Thursday, Huang said customers are “super excited” about Nvidia’s Xavier technology for autonomous machines. He also noted that it is currently in production, whereas Tesla’s rival chip is yet to be seen.

Here’s what Huang had to say during the call:

“With respect to the next generation, it is the case that when we first started working on autonomous vehicles, they needed our help. We used the 3-year-old Pascal GPU for the current generation of Autopilot computers.

It’s very clear now that in order to have a safe Autopilot system, we need a lot more computing horsepower. In order to have safe computing, in order to have safe driving, the algorithms have to be rich. It has to be able to handle corner conditions in a lot of diverse situations.

Every time there are more and more corner conditions or more subtle things that you have to do, or you have to drive more smoothly or be able to take turns more quickly, all of those requirements require greater computing capability. And that’s exactly the reason why we built Xavier. Xavier is in production now. We’re seeing great success and customers are super excited about Xavier.

That’s exactly the reason why we built it. It’s super hard to build Xavier and all the software stack on top of it. If it doesn’t turn out for whatever reason for them [Tesla] you can give me a call and I’d be more than happy to help.”

The conference call was carried out following the release of Nvidia’s fiscal earnings report where the company reported better-than-expected earnings.

“Growth across every platform – AI, Gaming, Professional Visualization, self-driving cars – drove another great quarter,” said Huang. “Fueling our growth is the widening gap between demand for computing across every industry and the limits reached by traditional computing.”

However, due to lower-than-expected revenue guidance, Nvidia stock fell by six percent on Thursday following the earnings report.

Intel’s AI chip business is now worth $1bn per year, $10bn by 2022
AI News | 9 August 2018
https://www.artificialintelligence-news.com/2018/08/09/intel-ai-business-worth/

The size of Intel’s AI chip business today is huge, but it’s nothing compared to where it expects to be in just four years’ time.

Speaking during the company’s Innovation Summit in Santa Clara, Intel Executive VP Navin Shenoy revealed a new focus on AI development.

The company’s AI-focused Xeon processors generated $1 billion in revenues during 2017. By 2022, it expects to be generating around $10 billion per year.
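Growing from $1 billion in 2017 to $10 billion in 2022 implies a compound annual growth rate of roughly 58 percent, a back-of-envelope figure rather than one Intel stated:

```python
start_revenue = 1e9     # 2017 AI-focused Xeon revenue, USD
target_revenue = 10e9   # 2022 target, USD
years = 5               # 2017 -> 2022

# Implied compound annual growth rate (CAGR)
cagr = (target_revenue / start_revenue) ** (1 / years) - 1
print(f"{cagr:.1%}")  # prints 58.5%
```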

AI is set to be implemented in many areas of our lives in the coming years, across a variety of devices.

Shenoy claims recent breakthroughs have increased the company’s AI performance by 200x since 2014. He teases further improvements are on their way in upcoming releases.

The company will be launching its ‘Cascade Lake’ Xeon processor later this year with 11 times better performance for AI image recognition.

Arriving in 2019 will be ‘Cooper Lake’ which uses 14-nanometer manufacturing and will feature even better performance. In 2020, however, the company is targeting ‘Ice Lake’ with 10-nanometer manufacturing technology.

“After 50 years, this is the biggest opportunity for the company,” says Shenoy. “We have 20 percent of this market today.”

The admission that it currently holds only 20 percent of the market is bold and shows the company is confident about significantly increasing that share in the coming years. It faces significant competition from Nvidia in particular.

Five years ago, around a third of Intel’s revenue was data-centric. Now, data-centric products account for around half of Intel’s business.

Shenoy’s comments today show how seriously Intel is taking its AI business and the firm’s confidence it will be a major player.

Facebook is helping Intel with AI for the first Neural Network Processor
AI News | 18 October 2017
https://www.artificialintelligence-news.com/2017/10/18/facebook-helping-intel-ai-first-neural-network-processor/

The CEO of Intel has revealed Facebook is providing its AI knowledge ahead of the launch of the world’s first Neural Network Processor.

Brian Krzanich made the comment during an on-stage interview at the WSJD Live conference in Laguna Beach, California. The news Intel is working on its own AI chips is no surprise, but the choice of partner may be.

“This is the first piece of silicon,” Krzanich said. “We have a whole family planned for this, (Facebook) is helping us, along with others, as to where this is going.”

Many consumers are wary of Facebook because, like Google, the company relies on collecting vast amounts of data about its users. While it’s unlikely Intel would allow Facebook to perform any data collection of its users, there will doubtless be some concerns.

Facebook was the only named company but Intel is also collaborating with others for its AI chips. The extent of the partnerships, or what benefits the partners receive for providing their resources, is currently unknown. We’ve reached out to Facebook and Intel for clarification.

Intel is aiming to build the first Neural Network Processor (NNP) before the end of this year. The company is calling this ambition Nervana, following the company of the same name Intel acquired in August last year, and it promises to “revolutionise AI computing” across a myriad of industries.

In a blog post, Krzanich provided the following examples:

  • Healthcare: AI will allow for earlier diagnosis and greater accuracy, helping make the impossible possible by advancing research on cancer, Parkinson’s disease, and other brain disorders.
  • Social media: Providers will be able to deliver a more personalized experience to their customers and offer more targeted reach to their advertisers.
  • Automotive: The accelerated learning delivered in this new platform brings us another step closer to putting autonomous vehicles on the road.
  • Weather: Consider the immense data required to understand the movement, wind speeds, water temperatures and other factors that decide a hurricane’s path. Having a processor that takes better advantage of data inputs could improve predictions on how subtle climate shifts may increase hurricanes in different geographies.

Krzanich says multiple generations of Nervana products are in the pipeline. Last year, the company set the goal of achieving 100 times greater AI performance by 2020. Intel believes these NNPs will help them achieve this lofty goal.

Nervana, even prior to its acquisition by Intel, had been working on such chips for years and developed its own, called ‘Lake Crest’, after finding traditional GPUs unsuitable for neural networking. These chips are designed to mimic the human brain, making decisions based on patterns and associations. Intel announced its own self-learning neuromorphic chip, ‘Loihi’, back in September.

According to Naveen Rao, co-founder of Nervana, the first member of the NNP family will begin shipping “soon”. We’ll keep you informed of all developments.
