A look at the new breed of AI chips

A $30m injection of cash into the UK’s Bristol-based AI chip startup Graphcore in July this year was another clear indication of how AI chip development is disrupting the traditional CPU players. Earlier in the same month, leading AI chipmaker NVIDIA struck a deal with Chinese web services firm Baidu with a promise to “accelerate AI”, while Harry Shum, Microsoft’s executive vice president for AI and Research, recently talked about how the company is developing its own AI chip for HoloLens. Throw in Google’s TPU, Apple’s hints at an AI chip for its smartphones and a host of startups, including Austin, Texas-based Mythic, and you have a processor bunfight brewing.

It may prove to be a defining period for the technology industry, and for traditional processing giant Intel in particular. Moore’s Law has stood Intel in good stead over the years, multiplying transistor counts to boost performance and making it the dominant player in datacentre servers. There is, of course, still plenty of life in that particular beast, but the accelerating drive towards a data-driven world is piling pressure on existing technologies and chip designs. As a result, Intel’s dominance is under threat and, although the playing field may not have been entirely levelled, there has been enough of a shift for new entrants to take a shot.

As you would expect, Intel has not been idle. It acquired AI chip startup Nervana last year and in March it bought Israeli chipmaker Mobileye for $15.3bn, but it is having to play catch-up. Other businesses such as NVIDIA have stolen a march, recognising as early as 2006 that a new approach was needed. According to Marc Hamilton, vice president of solutions architecture and engineering at NVIDIA, the company’s work 11 years ago in developing CUDA, a parallel computing platform and application programming interface, has been the rock on which the business is now building its AI-targeted products.
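
To see what that platform actually asks of a programmer, here is a minimal sketch written against the public CUDA toolkit; it is illustrative, not NVIDIA sample code. The key idea is that a kernel function is launched across thousands of GPU threads at once, each handling one element of the data, and it is exactly this parallelism that deep learning workloads exploit.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// A CUDA kernel: the same function body runs on thousands of GPU threads
// in parallel, each thread picking out one element of the arrays.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;  // one million elements
    float *a, *b, *c;
    // Unified memory keeps the sketch short; production code often manages
    // host and device buffers explicitly with cudaMalloc/cudaMemcpy.
    cudaMallocManaged(&a, n * sizeof(float));
    cudaMallocManaged(&b, n * sizeof(float));
    cudaMallocManaged(&c, n * sizeof(float));
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    int threads = 256;
    int blocks = (n + threads - 1) / threads;  // enough blocks to cover n
    vectorAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();

    printf("c[0] = %f\n", c[0]);  // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

The <<<blocks, threads>>> launch syntax is the point: the programmer expresses work as a grid of threads and the hardware decides how to schedule them, which is why the same code scales across GPU generations.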

“We spend around $3bn a year on R&D and the vast majority of that benefits AI,” says Hamilton. “To be a success in AI you need to do things differently. You can’t rely on existing commodity processors.”

So, what does a $3bn R&D fund buy you? How many AI-related patents does the business hold? Hamilton didn’t talk numbers, but pointed to NVIDIA’s Pascal architecture products and its work to improve ‘inferencing’ speeds. Its TensorRT inference software, he says, is ten times faster than anything else at the inference step of a deep learning process, something that is already attracting early users of machine learning technologies.
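
For readers unfamiliar with the term, ‘inference’ is the forward pass of an already-trained network, as opposed to the training itself. The hedged sketch below shows the core computation for a single fully connected layer in raw CUDA; an engine like TensorRT takes whole networks of such layers and fuses and optimises them automatically, which is where the claimed speed-up comes from. All names and numbers here are illustrative.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// One fully connected layer, y = ReLU(W*x + b): the basic unit of work an
// inference engine schedules and fuses. One thread per output neuron.
__global__ void denseForward(const float *W, const float *x, const float *b,
                             float *y, int rows, int cols) {
    int r = blockIdx.x * blockDim.x + threadIdx.x;
    if (r < rows) {
        float acc = b[r];
        for (int c = 0; c < cols; ++c)
            acc += W[r * cols + c] * x[c];  // dot product of one weight row
        y[r] = fmaxf(acc, 0.0f);            // ReLU activation
    }
}

int main() {
    const int rows = 512, cols = 1024;  // illustrative layer dimensions
    float *W, *x, *b, *y;
    cudaMallocManaged(&W, rows * cols * sizeof(float));
    cudaMallocManaged(&x, cols * sizeof(float));
    cudaMallocManaged(&b, rows * sizeof(float));
    cudaMallocManaged(&y, rows * sizeof(float));
    for (int i = 0; i < rows * cols; ++i) W[i] = 0.001f;
    for (int i = 0; i < cols; ++i) x[i] = 1.0f;
    for (int i = 0; i < rows; ++i) b[i] = 0.5f;

    denseForward<<<(rows + 255) / 256, 256>>>(W, x, b, y, rows, cols);
    cudaDeviceSynchronize();
    printf("y[0] = %f\n", y[0]);  // expect 0.001 * 1024 + 0.5 = 1.524
    return 0;
}
```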

Marc Ambasna-Jones

Marc Ambasna-Jones is a UK-based freelance writer and media consultant who has been writing about business and technology since 1989.
