What will AI mean to the traditional datacentre?

The recent decision by US-Norwegian business Kolos to build a giant, 600,000m² datacentre in the town of Ballangen, Norway, is, on the surface at least, a wise one. Ballangen, which incidentally is the birthplace of singer Frida from 70s pop group Abba, lies inside the Arctic Circle, which sort of solves the cooling issue. Maybe.

As demand to store and manage data accelerates, driven by increased machine connectivity and data analytics, so the pressure on traditional datacentres increases, certainly in terms of capacity and cooling. The Kolos site, chosen primarily for its access to clean, renewable energy, is expected to house 70MW of IT equipment, eventually scaling up to 1,000MW within 10 years of construction.

The quest for bigger and better is understandable, but when does size become an issue? Surely we can’t keep building huge datacentres in the Arctic to keep up with the insatiable demand for data? There is a school of thought that the demand for AI-enabled machines will alter the course of datacentre development. Machine learning is, after all, capacity- and power-hungry.

“A combination of deep machine learning and analytics on growing data sets is driving adoption of IT systems that are more similar to high-performance computing clusters,” says Daniel Bizo, a senior analyst for datacentre technologies at 451 Research. “Such systems are high-powered and require much more power and cooling capacity than the average rack. Many traditional facilities will be stretched to meet such requirements at cost.”

The pressure is understandable and Kolos is in many ways a reaction to it, but as we know, things move fast in the datacentre world. No sooner have you built a state-of-the-art complex than the technology shifts up a gear. This can also drive innovation, and the demand for machine learning capability is already having an impact. So-called AI chips are being deployed in datacentres to give businesses that cannot afford their own high-powered servers access to AI capabilities. This is the thinking behind Nvidia’s recent deal with Chinese firm Baidu.

For Aaditya Sood, senior director of engineering and products at Nutanix, machine learning can influence datacentre development significantly. Sood, who sold his DevOps automation firm Calm.io to Nutanix 12 months ago, believes that giving businesses access to machine learning GPUs through the datacentre has to be a good thing.

“This shouldn’t be limited to the big seven – the Googles and Facebooks and so on,” he says, adding that while machine learning in datacentres is opening doors to innovation for business users, it can also have its own impact on how datacentres are designed and managed.

“Where AI – I prefer machine learning – is going to help is in two things. Firstly, the modelling system – discovering how my datacentre applications start better, how they are connected and, secondly, the more important part is how do I take corrective actions when things go wrong.”
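
What might that corrective-action loop look like in practice? Below is a minimal sketch, assuming a scikit-learn-style anomaly detector running over rack telemetry; the metric names, thresholds and remediation stub are illustrative assumptions rather than a description of Nutanix's actual tooling.

```python
# Illustrative sketch only: flag anomalous rack telemetry so an operator or an
# automated runbook can take corrective action. Metrics, thresholds and the
# remediation stub are assumptions, not Nutanix's actual system.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Hypothetical per-rack telemetry: [inlet temp (C), power draw (kW), fan speed (%)]
baseline = rng.normal(loc=[24.0, 8.0, 55.0], scale=[1.0, 0.5, 5.0], size=(500, 3))
model = IsolationForest(contamination=0.01, random_state=0).fit(baseline)

def check_rack(reading):
    """Return True and trigger a (stubbed) corrective action if the reading is anomalous."""
    if model.predict([reading])[0] == -1:
        print(f"Anomaly at {reading}: paging on-call / migrating workload off the rack")
        return True
    return False

check_rack([24.5, 8.2, 57.0])   # typical reading, no action
check_rack([38.0, 14.0, 95.0])  # overheating rack, flagged for remediation
```

As Sood suggests, the detection itself is the easy half; the more important part is wiring its output into automated remediation.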

Sood admits that the make-up of any future datacentre will remain heterogeneous – “it will never be 90 percent one vendor; there will always be different vendors at different layers fighting it out,” he says – and that this mix inevitably breeds complexity. Managing that complexity is a constant challenge, and it is here that machine learning algorithms could come in and help.

If anything, it’s an opportunity. Bizo at 451 Research agrees that demand for increased machine learning will reinvigorate the datacentre space and lead to further startups and acquisitions.

“Technology shifts always invite new entrants and datacentres are no different,” says Bizo. “In the foreseeable future, datacentre capacity demand is already forecast to grow. Advancements in deep learning and big data analytics will create appetite for an increased use of these techniques and in turn the systems on which they run. This will represent net new capacity requirements over the coming ten years. Even though there is a lot of technology and clever design involved, the most valuable component may still be location: access to and ownership of strategically positioned real estate.”

Bizo suggests that high-performance deep learning will naturally gravitate towards power, settling where its cost is lower and its availability is higher. This could influence location, but it may also lead to some innovation.

“We expect some novel approaches, such as location next to electrical utility distribution substations and remote sites close to hydro and thermal power plants, to gain currency with some major customers training their AI or performing data mining on vast data archives,” says Bizo. “Bandwidth and its cost remain an issue, however.”

So where will this innovation be? Are there any signs of companies trying to tackle datacentre issues at the moment?

Ray Chohan, senior vice-president of corporate strategy at global intellectual property analytics company PatSnap, says AI can have a significant impact on how the infrastructure is run and supported, everywhere from energy usage and cooling technology to robotic handling of day-to-day maintenance and security.

“For example, this patent published by EMC in 2014 describes the use of AI to improve the cost/performance of storage and the energy savings achieved by datacentre owners,” says Chohan. “Another interesting application of AI in the datacentre comes from Oracle, in a patent that was published in May 2017. The patent describes a system and method for Distributed Denial of Service (DDoS) identification and prevention. Among other things, the author sees this invention as a step up from traditional intrusion detection methods that use machine learning, as they “typically require human interaction to improve rule-based binary classifications.” As IoT botnets become more prevalent, Cisco expects DDoS attacks to reach approximately 17 million per year in 2020. With so much IT infrastructure moving toward the cloud, the ability to leverage AI and minimise the damage from these attacks will become extremely valuable.”
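
To make the contrast with rule-based detection concrete, here is a minimal sketch of the machine-learning approach: a classifier learns what normal traffic looks like from labelled flow features rather than relying on hand-written rules. The features and synthetic data are illustrative assumptions and this is not the method described in the Oracle patent.

```python
# Illustrative sketch only: learn to separate benign traffic from flood-like
# traffic instead of hand-writing rules. Features and data are invented; this
# is not the method described in the Oracle patent.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical per-source features: [requests/sec, mean payload bytes, distinct URLs hit]
benign = rng.normal([20, 800, 15], [5, 200, 4], size=(1000, 3))
attack = rng.normal([900, 60, 2], [150, 20, 1], size=(1000, 3))

X = np.vstack([benign, attack])
y = np.array([0] * 1000 + [1] * 1000)  # 0 = benign, 1 = DDoS-like

clf = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new source; anything above a chosen threshold could be rate-limited or dropped.
print("P(attack) =", clf.predict_proba([[850, 70, 1]])[0, 1])
```

The appeal of this kind of approach is that the model can be retrained as traffic patterns shift, rather than waiting for a human to update the rules.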

Chohan adds that the companies innovating most in this area include Microsoft, Numenta, Amazon, Harris Corporation and IBM. The US is by far the most dominant in this field, accounting for 84 percent of the patents filed, but if we have learned one thing from history it’s that a shift in technology – and machine learning/AI is that shift – levels the playing field somewhat, giving new entrants a chance to stake their claim. One thing is certain: the datacentre is already changing and will no doubt be the source of a new wave of innovation and acquisition targets over the next 10 years.

Marc Ambasna-Jones

Marc Ambasna-Jones is a UK-based freelance writer and media consultant and has been writing about business and technology since 1989.
