Intel Corp said on Monday at the Consumer Electronics Show in Las Vegas that it is working with Facebook to complete a new artificial intelligence chip in the second half of this year.
The chip is the Nervana Neural Network Processor for Inference (NNP-I), an AI chip for inference-based workloads that fits into a GPU-like form factor. It wasn't a startling reveal; Intel announced as far back as 2017 that it was developing a new generation of inference chips. Still, its appearance at the press conference today underscored the company's ambition to capture a large slice of the budding AI chip market.
The new "inference" AI chip could help Facebook and others deploy machine learning efficiently and cheaply. The social network uses AI for a wide range of tasks, including tagging people in images, translating posts from one language to another, and catching prohibited content. These tasks are far more costly, in terms of both time and energy, when run on more generic hardware.
“This new class of chip is dedicated to accelerating inference for companies with high workload demands and is expected to go into production this year,” Intel said in a statement.
Intel will make the chip available to other companies later in 2019. The company currently trails well behind the market leader in AI hardware, Nvidia, and faces rivalry from a large group of chipmaking startups.
Cover Image: Intel