Intel wants to make artificial intelligence 100 times faster with new class of processors

Machine learning and artificial intelligence are considered by many to be radical tools that will revolutionize entire industries. Everything from self-driving cars to photo apps, Netflix recommendations, and digital assistants is driven by this technology, and we'll become even more reliant upon it in the future. That's why Intel is looking to capitalize on the trend and has created its own AI-optimized chip.

Most neural networks, machine-learning algorithms, and pretty much everything we'd describe as artificial intelligence currently rely on graphics cards. Both the training and a big part of the deployment are driven by GPUs, which have proven remarkably adept at processing such data. However, GPUs were never designed to handle these types of tasks, so they're far from ideal processing units when it comes to AI. Just as a general-purpose CPU is orders of magnitude worse at processing graphics than a dedicated GPU, a graphics card may be significantly inferior to silicon created specifically for AI.

Intel believes it can achieve that quantum leap in performance with a new platform and chip, called Nervana. Designed from the start for AI applications, Nervana is, according to Intel, targeting a 100x reduction in the time needed to train a deep learning model over the next three years, compared with GPU-based solutions.

The new class of chips, codenamed Lake Crest, will be tested in the first half of 2017, with commercial availability coming some time after that. Diane Bryant, Intel VP, explained the company’s vision for Nervana:

We expect the Intel Nervana platform to produce breakthrough performance and dramatic reductions in the time to train complex neural networks. Before the end of the decade, Intel will deliver a 100-fold increase in performance that will turbocharge the pace of innovation in the emerging deep learning space.

Though Intel claims Nervana will be the first chip specifically designed for AI, that's not exactly true. Google has already created its own silicon, the Tensor Processing Unit, while IBM is also working on a commercial product due for release in the future.

For now, details about the actual capabilities of Nervana and upcoming chips from Intel remain scarce. However, given the relentless pace of innovation in this field, and the huge potential of AI, we'll no doubt learn a lot more in the near future.
