
A look at the new breed of AI chips

A $30m injection of cash into the UK’s Bristol-based AI chip startup Graphcore in July this year was another clear indication of how AI chip development is disrupting the traditional CPU players. Earlier in the same month, leading AI chipmaker NVIDIA struck a deal with Chinese web services firm Baidu with a promise to “accelerate AI”, while Harry Shum, Microsoft’s executive vice president for AI and Research, recently talked about how Microsoft is developing its own AI chip for HoloLens. Throw in Google’s TPU, Apple’s hints at an AI chip for its smartphones and a host of startups, including Austin, Texas-based Mythic, and you have a processor bunfight brewing.

It may prove to be a defining period for the technology industry, and for traditional processing giant Intel in particular. Moore’s Law has stood Intel in good stead over the years, multiplying transistor counts to boost performance and making it the dominant player in datacentre servers. There is, of course, still plenty of life in that particular beast, but the rapidly increasing drive towards a data-driven world is piling pressure on existing technologies and chip designs. As a result, Intel’s dominance is under threat and, although the playing field may not have been entirely levelled, there’s enough of a shift for new entrants to take a shot.

As you would expect, Intel has not been idle. It bought AI chip company Nervana last year and in March it acquired Israeli chipmaker Mobileye for $15bn, but it’s having to play catch-up. Other businesses such as NVIDIA have stolen a march, recognising as early as 2006 that a new approach was needed. According to Marc Hamilton, vice president of solutions architecture and engineering at NVIDIA, the company’s work 11 years ago in developing CUDA, a parallel computing platform and application programming interface, has been the rock on which the business is now building its AI-targeted products.
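
CUDA’s core idea is simple enough to show in a few lines. The sketch below is purely illustrative (the names and sizes are arbitrary, not NVIDIA sample code): a kernel adds two large vectors, with each GPU thread handling one element in parallel, and the same data-parallel pattern scales up to the matrix arithmetic that dominates deep learning workloads.

    #include <cstdio>
    #include <cuda_runtime.h>

    // Kernel: runs once per GPU thread, each thread adding one element.
    __global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;  // unique thread index
        if (i < n)
            c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;                 // one million elements
        size_t bytes = n * sizeof(float);
        float *a, *b, *c;
        cudaMallocManaged(&a, bytes);          // unified memory, visible to CPU and GPU
        cudaMallocManaged(&b, bytes);
        cudaMallocManaged(&c, bytes);
        for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

        int threads = 256;
        int blocks = (n + threads - 1) / threads;  // enough blocks to cover all n elements
        vectorAdd<<<blocks, threads>>>(a, b, c, n);
        cudaDeviceSynchronize();               // wait for the GPU to finish

        printf("c[0] = %f\n", c[0]);           // expect 3.0
        cudaFree(a); cudaFree(b); cudaFree(c);
        return 0;
    }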

“We spend around $3bn a year on R&D and the vast majority of that benefits AI,” says Hamilton. “To be a success in AI you need to do things differently. You can’t rely on existing commodity processors.”

So, what does a $3bn R&D fund buy you? How many AI-related patents does the business hold? Hamilton didn’t talk numbers but pointed to NVIDIA’s Pascal architecture products and its work to improve ‘inferencing’ speeds. Its TensorRT inference software, he says, is ten times faster than anything else at the inference step of a deep learning process, something which is already attracting early users of machine learning technologies.
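
Inference, for readers new to the term, is the deployment step: a network’s weights are already trained and frozen, and the job is to push fresh inputs through them as quickly as possible. As a rough, hedged illustration only (every name and dimension here is invented for the example and is not TensorRT’s actual API), the forward pass of a single dense layer on a GPU looks like this:

    #include <cstdio>
    #include <cuda_runtime.h>

    // Kernel: one thread per output neuron computes y = relu(W.x + b),
    // with the trained weights W and bias b held fixed.
    __global__ void denseForward(const float *W, const float *x, const float *b,
                                 float *y, int in_dim, int out_dim) {
        int row = blockIdx.x * blockDim.x + threadIdx.x;
        if (row < out_dim) {
            float acc = b[row];
            for (int j = 0; j < in_dim; ++j)
                acc += W[row * in_dim + j] * x[j];  // dot product with the input
            y[row] = acc > 0.0f ? acc : 0.0f;       // ReLU activation
        }
    }

    int main() {
        const int in_dim = 4, out_dim = 3;          // toy sizes, for illustration
        float *W, *x, *b, *y;
        cudaMallocManaged(&W, in_dim * out_dim * sizeof(float));
        cudaMallocManaged(&x, in_dim * sizeof(float));
        cudaMallocManaged(&b, out_dim * sizeof(float));
        cudaMallocManaged(&y, out_dim * sizeof(float));
        for (int i = 0; i < in_dim * out_dim; ++i) W[i] = 0.1f;  // stand-in trained weights
        for (int j = 0; j < in_dim; ++j) x[j] = 1.0f;            // stand-in input
        for (int r = 0; r < out_dim; ++r) b[r] = 0.5f;

        denseForward<<<1, 32>>>(W, x, b, y, in_dim, out_dim);
        cudaDeviceSynchronize();
        for (int r = 0; r < out_dim; ++r)
            printf("y[%d] = %f\n", r, y[r]);  // expect 0.9 for each output
        cudaFree(W); cudaFree(x); cudaFree(b); cudaFree(y);
        return 0;
    }

Speeding up exactly this kind of fixed-weight arithmetic, at vastly larger scale and often at lower precision, is where inference-focused products stake their performance claims.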

Marc Ambasna-Jones

Marc Ambasna-Jones is a UK-based freelance writer and media consultant and has been writing about business and technology since 1989.
