How enterprises can embrace Deep Learning

Artificial Intelligence (AI) is quickly becoming a hot topic in the world of tech. No longer just an ambitious research project, Machine and Deep Learning are now something companies both large and small can introduce into their business.

Speaking at the Re: Work Deep Learning Summit in London, Arjun Bansal, Co-founder and VP of Algorithms at Nervana Systems, outlined how businesses can approach their first foray into the world of Deep Learning.

Step 1: The framework

“You don’t need to reinvent the wheel,” said Bansal, whose company provides cloud-based AI platforms (and was recently acquired by Intel). “You can use an existing model.

“The world of AI and Deep Learning has been unusually open, so you get a lot of publicly available models, and through the process of transfer learning you can take one of these publicly available models and apply it to your data.”

He warned that simply using pre-trained models or APIs would not be enough, and any system you do use would have to be fine-tuned to your own datasets in order to achieve the best results.

“If you’re trying to detect different kinds of tumours you don’t want an API that’s just been trained on classifying cats versus dogs, and with a micro-transcription system you don’t want something that was just trained on voicemail datasets.”
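In practice, that fine-tuning is usually only a small amount of code. Below is a minimal sketch of the idea using PyTorch and torchvision (an illustration, not a tool Bansal named): a publicly available ImageNet model has its classification head replaced and is then trained on your own labelled images. The dataset path and class count are hypothetical placeholders.

```python
# A minimal transfer-learning sketch: take a model pre-trained on ImageNet,
# replace its classification head, and fine-tune it on your own labelled data.
# The dataset directory and class count are hypothetical placeholders.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

NUM_CLASSES = 3  # e.g. tumour types in your own data, not cats vs. dogs

# Load a publicly available pre-trained model.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)

# Freeze the pre-trained feature extractor so only the new head is trained at first.
for param in model.parameters():
    param.requires_grad = False

# Swap the ImageNet classifier for one sized to your problem.
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Standard ImageNet-style preprocessing for the new images.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

train_data = datasets.ImageFolder("data/tumour_scans/train", transform=preprocess)
loader = DataLoader(train_data, batch_size=32, shuffle=True)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```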

Step 2: The people

As you’d expect with such a new and complicated technology, the issue of skills has to be addressed.

“The talent is really tight,” Bansal said. “All the Deep Learning talent is getting absorbed by the big companies in the space, so how do you, as an enterprise or a startup, go about building your talent in the space?”

Aside from employing professional services to do the legwork for you, training staff from other, related areas of the business could prove fruitful.

“There's a lot of talent in adjacent fields to Deep Learning: traditional machine learning, or areas such as computational neuroscience, or computational physics, or even just in software engineering.

“There's a lot of really talented software engineers who are really interested in AI and Deep Learning, and with the right kind of training, they can pick up what's needed to be successful, so we recommend hiring and training these individuals.”

Step 3: On-prem vs. the cloud

Once you’ve got both your people and your data ready, it’s time to decide how and where you want to process all the data.

“There's a big trend towards using public clouds and data's pretty secure on a lot of these public cloud offerings, but many companies are on the longer, costlier, five-step program towards the public cloud,” Bansal explained.

“If you're in one of these companies you have an on-prem appliance option as you're building up your deep learning stack, or you can go straight to using a cloud service and skip these intermediate steps.”

Given the relatively low cost and agility of today’s cloud services, it makes sense to start small, experiment on your public cloud of choice, and then scale up once you’re happy with the early results.

Bansal does warn, however, to be aware of vendor lock-in.

“Although Deep Learning has been very open - a lot of academics are getting into Deep Learning and bringing this very open spirit through publishing papers, code, and frameworks - there are still some things that are proprietary, so you want to evaluate what kind of lock-in you're getting into as you're choosing these different technologies.”
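One common hedge against that kind of lock-in is to keep trained models in an open, framework-neutral format. The sketch below is an illustration rather than anything Bansal prescribed: it exports a stand-in PyTorch model to ONNX so it can be loaded by other runtimes or clouds later.

```python
# A minimal sketch of keeping a trained model portable by exporting it to ONNX,
# an open interchange format supported by many frameworks and cloud runtimes.
# The model here is a toy stand-in; substitute whatever network you have trained.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
model.eval()

dummy_input = torch.randn(1, 128)  # example input with the shape the model expects
torch.onnx.export(model, dummy_input, "model.onnx",
                  input_names=["features"], output_names=["scores"])
```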

Step 4: Speed = advantage

“Deep learning can take weeks, or even months in some cases,” said Bansal. “So if you have access to technology that can give you a big speed advantage – that can give you a huge competitive edge over the competition in whatever space you're in.”

A whole host of companies – Nervana, Google, Microsoft, Nvidia and others – have launched Deep Learning-specific processors, so do your research into what works best with the frameworks and models you’re going to be using.
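Whatever hardware you pick, the speed-up only materialises if your framework actually uses it. As a rough illustration (assuming PyTorch with an NVIDIA GPU, and a toy model and data), the sketch below detects an accelerator, moves the work onto it, and enables mixed precision.

```python
# A small sketch of putting an accelerator to work in PyTorch: detect a GPU,
# move the model and data to it, and use automatic mixed precision, which on
# recent NVIDIA hardware typically speeds up training. Model and data are
# toy placeholders.
import torch
import torch.nn as nn

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(nn.Linear(512, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()
scaler = torch.cuda.amp.GradScaler(enabled=device.type == "cuda")

inputs = torch.randn(64, 512, device=device)
targets = torch.randint(0, 10, (64,), device=device)

for step in range(100):
    optimizer.zero_grad()
    # Run the forward pass in mixed precision where the hardware supports it.
    with torch.cuda.amp.autocast(enabled=device.type == "cuda"):
        loss = criterion(model(inputs), targets)
    scaler.scale(loss).backward()
    scaler.step(optimizer)
    scaler.update()
```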

Step 5: Getting data moving

“Another really important consideration is planning your data strategy. You could have all the processors in the world, and if you don't have them continuously fed with data, then that's going to be a huge bottleneck and you could experience significant slowdowns.”

Where data is being moved from and to, where the various sorting and processing is done, and how it is stored all need to be considered.

“You want to think about the upstream processes as you're getting the data over to the processor that you're using. In addition to that, you want to have support for all the different file formats that you expect to encounter, and to handle the loading, transformation, and augmentation for those formats as well.”
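A minimal sketch of what that looks like in practice, assuming PyTorch's DataLoader and a placeholder image directory: worker processes read, decode, and augment data in parallel so the processor is never left waiting for its next batch.

```python
# A minimal input-pipeline sketch that keeps the processor fed: worker
# processes read, decode, and augment images in parallel while the accelerator
# trains on the previous batch. The dataset directory is a placeholder.
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Decoding and augmentation happen inside the loader workers, upstream of the GPU.
augment = transforms.Compose([
    transforms.RandomResizedCrop(224),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
])

dataset = datasets.ImageFolder("data/train", transform=augment)

loader = DataLoader(
    dataset,
    batch_size=64,
    shuffle=True,
    num_workers=8,        # parallel readers so the accelerator is never starved
    pin_memory=True,      # faster host-to-GPU copies
    prefetch_factor=2,    # each worker keeps batches queued ahead of time
)

for images, labels in loader:
    if torch.cuda.is_available():
        images = images.to("cuda", non_blocking=True)
    # ... training step would go here ...
```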

Step 6: Delivery strategy

Once everything else is in place, you can start putting everything to work. But Bansal highlighted the need to know your delivery strategy. Different form factors – whether phones, small embedded devices, websites, or big desktops – have very different power, memory, accuracy, and connectivity trade-offs. Planning ahead and making sure these are taken into consideration beforehand is important.
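For the smaller form factors, one common way to trade a sliver of accuracy for a much smaller memory and power footprint is post-training quantization. The sketch below is illustrative only – a toy model shrunk with PyTorch's dynamic quantization – rather than a recipe Bansal gave.

```python
# A minimal sketch of shrinking a model for a constrained device: dynamic
# post-training quantization converts Linear layers to 8-bit integers, trading
# a little accuracy for less memory and faster CPU inference. The model is a
# toy placeholder.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(256, 128), nn.ReLU(), nn.Linear(128, 10))
model.eval()

quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Save both versions to compare on-disk size before shipping to the device.
torch.save(model.state_dict(), "model_fp32.pt")
torch.save(quantized.state_dict(), "model_int8.pt")
```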

Dan Swinhoe

Staff Writer at IDG Connect.
