How Edge AI represents the next major opportunity for the enterprise

As artificial intelligence continues to advance and adoption grows across industries, more sophisticated infrastructure will be needed to power it. Does Edge computing fit the bill?

Artificial intelligence is without doubt one of the hottest areas in technology right now. Whether it's writing novels, beating champion chess players or reshaping cyber security strategies, AI appears to have limitless possibilities and is constantly making headlines around the world.

For many, AI - along with the Internet of Things - marks the next major technological revolution. It's set to change the way we live and work in countless ways, automating mundane processes and making life easier across the board. Its capabilities span problem solving, speech recognition and machine learning.

What's more, artificial intelligence represents a major commercial opportunity. Research firm MarketsandMarkets forecasts that the AI sector will reach $190 billion by 2025. Meanwhile, IDC expects spending on AI systems to top $57.7 billion by 2021, and that 75 per cent of commercial enterprise apps will be powered by AI by that year.

However, while artificial intelligence has already demonstrated a wealth of capabilities, it's still early days and the technology has a long way to go. One of the next big advancements will be the rise of Edge AI, an approach in which algorithms are processed locally on devices rather than in the cloud. We explore the benefits in practice.
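To make the distinction concrete, here is a minimal sketch of on-device inference using the TensorFlow Lite runtime, a common choice for Edge AI workloads. The model file `keyword_model.tflite` and the audio input are hypothetical placeholders; the point is simply that the data is processed on the device itself rather than being sent to a cloud service.

```python
# Minimal Edge AI sketch: the model runs locally, so raw sensor data
# never has to leave the device.
# Assumes a hypothetical "keyword_model.tflite" file stored on-device.
import numpy as np
from tflite_runtime.interpreter import Interpreter

interpreter = Interpreter(model_path="keyword_model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Stand-in for locally captured sensor data (e.g. an audio frame).
audio_frame = np.random.rand(*input_details[0]["shape"]).astype(np.float32)

interpreter.set_tensor(input_details[0]["index"], audio_frame)
interpreter.invoke()  # Inference happens entirely on the device.

scores = interpreter.get_tensor(output_details[0]["index"])
print("Local prediction scores:", scores)
```

In a cloud-based setup, the equivalent step would be an HTTP request carrying the raw audio to a remote API, with the attendant latency, bandwidth and privacy costs that Edge AI is designed to avoid.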

Increasing device power

Today, many artificial intelligence technologies rely on the cloud, but as they become increasingly powerful, new processing methods will be necessary. Paul Neil, VP of product management at XMOS, tells IDG Connect: "The issues associated with managing the control of smart home devices in the cloud, combined with questions surrounding the security and privacy of the data transmitted, suggest that a wholly cloud-based model may not scale as we usher in the era of ambient computing."
