AI is becoming a tradeable commodity

Artificial Intelligence (AI) has evolved and is now being grafted onto our enterprise applications, both at surface level and deep within architectures and infrastructures. So how do the core mechanics of the AI marketplace work in this new world of smartness, and how can we apply some caveat emptor smartness of our own to ensure we bring in the right 'packaged' AI when we don't build it ourselves?


Artificial Intelligence (AI) may be on the way to becoming a commodity. If not quite the same as oil, gold, wheat, cotton and sugar, AI could still be traded and exchanged over the dedicated networks that are now being established to serve this new marketplace.

Data itself is already being exchanged. Data marketplace platform companies like France and US-based Dawex specialise in providing the arena, platform and tools for organisations to be able to use each other's anonymised datasets.

Once personally identifiable information inside datasets is appropriately masked, another business can observe data patterns and use them as reference templates to guide their own business decisions.
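To make the masking step concrete, here is a minimal sketch of how direct identifiers might be pseudonymised before a dataset is shared, while the behavioural columns stay intact for pattern analysis. The field names, the salt handling and the `pseudonymise` helper are all illustrative assumptions, not any vendor's actual method.

```python
import hashlib

# Assumed example: a per-dataset salt; in practice this would come from
# a managed secret store, never be hard-coded.
SALT = "per-dataset-secret"

def pseudonymise(record, pii_fields=("name", "email")):
    """Replace direct identifiers with stable, non-reversible tokens."""
    masked = dict(record)
    for field in pii_fields:
        if field in masked:
            digest = hashlib.sha256((SALT + str(masked[field])).encode()).hexdigest()
            # Same input always yields the same token, so patterns across
            # records survive, but the identity does not.
            masked[field] = digest[:12]
    return masked

row = {"name": "Jane Doe", "email": "jane@example.com", "basket_value": 42.50}
print(pseudonymise(row))
```

Because the tokens are deterministic, a receiving business can still count repeat customers or join records, without ever learning who those customers are.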

Sterilise, sanitise… then share

A similar push is happening in AI. If one company builds an AI engine or even a smaller neural node of intelligence designed to solve a business problem, then why shouldn't another business be able to use the same tool? Once sterilised and sanitised for onward use, one retail purchase ledger system AI accelerator can be grafted onto another, or so the theory goes.

So what needs to happen before we can start to componentise and productise AI to make it a tradeable commodity? In real business terms, it often comes back to the quite rudimentary practices of document management, workflow analysis and process mining alongside its higher-level cousin, task mining.

With a background in Optical Character Recognition (OCR) and related document intelligence technologies, US and Europe-headquartered Abbyy like so many other firms now refers to itself as a digital intelligence company. This plea for broader categorisation comes from Abbyy's concerted efforts to put higher-level software functions into its 'solutions' over much of the last decade.

Using its latest (virtual) user conference to explain its product developments, Abbyy (you may know the firm as ABBYY from its branding bravado) detailed its Vantage 2 software designed to deliver cognitive skills for RPA robots, automation systems, chatbots and mobile solutions that need to get insights from documents and content.

This low-code/no-code AI software is intended to let business users digitise operations faster, without complete reliance on IT. So this is all about ingesting, processing and analysing documents on a large scale. It is also about deriving resulting actions from that process, or 'insights' as the industry likes to call them, which can direct business decisions.
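For a rough sense of what a reusable document 'skill' amounts to, here is a deliberately simplified sketch: a classifier plus a set of field extractors for one document type. This is an illustration only, not Abbyy's API; the `invoice_skill` function and its regex patterns are assumptions for the example.

```python
import re

# One "skill" = recognise a document type and pull out its key fields.
INVOICE_PATTERNS = {
    "invoice_number": re.compile(r"Invoice\s*#?\s*([\w-]+)"),
    "total": re.compile(r"Total[:\s]*\$?([\d,]+\.\d{2})"),
}

def invoice_skill(text):
    """Return extracted fields, or None if the text is not an invoice."""
    if "invoice" not in text.lower():
        return None
    fields = {}
    for name, pattern in INVOICE_PATTERNS.items():
        match = pattern.search(text)
        if match:
            fields[name] = match.group(1)
    return fields

doc = "Invoice #INV-0042\nWidgets x3\nTotal: $1,250.00"
print(invoice_skill(doc))
```

A real skill would use OCR and trained models rather than regexes, but the packaging idea is the same: a self-contained unit that takes raw content in and hands structured data out, which is what makes it tradeable.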

Beyond common or garden business

In many ways, these are the building blocks of AI for the standard common or garden business. In layperson's terms, this is a question of: what happens in our business, what processes go on, what documents are filled with what data as a result, what does that data pertain to and mean… and so what should we do about it? The promise of AI is, obviously, to go beyond the common or garden.

Alongside its new AI skills software, Abbyy has also launched Abbyy Marketplace as an open marketplace where organisations can discover, try and purchase reusable AI skills to drive their automation initiatives. The company says that partners can also build and publish skills to automate all types of content-centric processes.

"This marketplace provides an online collection of reusable technology assets including cognitive skills for classification of documents and data extraction, ready-to-go process flows and pre-built connectors," said Bruce Orcutt, SVP of product marketing at Abbyy.  

If more enterprises actually are interested in the try-and-buy method in the age of cloud services, then Abbyy may capitalise well upon this segment of business intelligence. Its marketplace offers pre-trained skills for documents such as invoices, purchase orders, receipts, loan documents, insurance claims and bills of lading. Abbyy also notes that partners can contribute new skills and other technology assets to the marketplace, or utilise ready-to-deploy assets to speed up automation projects.

"We see a new kind of business user within enterprises today, looking for a faster way to consume and leverage data contained within documents. Their goals can be achieved through AI-enabled, easily consumable, ready-to-use technology we simply call skills," added Orcutt.

Caveat emptor AI mercatus

One of the main challenges associated with AI is of course AI bias. Recent IBM research suggests there are as many as 180 human biases that can affect how we make decisions, and so ultimately shape how AI engines are built.

IBM suggests that so-called synthetic data could become a powerful tool for tackling AI's bias problem. Synthetic data is generated partially or completely artificially rather than measured or extracted from real-world events or phenomena. If the dataset is not diverse or large enough, AI-generated data can fill in the holes and help form a less biased dataset.

Manually creating these datasets can take teams months or even years. With synthetic data, it can be done overnight. So this could become part of the total technology proposition here: what we do to build the data used to build the AI, which is in turn the AI behind the AI products in the AI marketplaces.
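The gap-filling idea can be sketched in a few lines. This toy example synthesises extra samples for an under-represented class by interpolating between existing ones, a simplified SMOTE-style approach; it is an assumed illustration of the technique, not IBM's actual method.

```python
import random

def synthesise(minority, n_new, rng=random.Random(0)):
    """Create n_new synthetic samples by interpolating between
    randomly chosen pairs of existing minority-class samples."""
    samples = []
    for _ in range(n_new):
        a, b = rng.sample(minority, 2)   # pick two distinct real samples
        t = rng.random()                 # interpolation factor in [0, 1)
        samples.append([x + t * (y - x) for x, y in zip(a, b)])
    return samples

# Three real minority-class rows (two numeric features each)...
minority = [[1.0, 2.0], [1.2, 1.8], [0.9, 2.2]]
# ...padded out with five synthetic rows to rebalance the dataset.
new_points = synthesise(minority, 5)
print(len(new_points))
```

Because each synthetic row lies between two genuine rows, it stays plausible; the catch, as the bias discussion above implies, is that interpolation can only ever fill in between the data you already have.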

Coming next, citizen AI?

The question obviously arises: where is all this leading us? If the forces of tech coalesce in this space according to normally observed celestial patterns, we could see more AI functions being created through the use of low-code/no-code software tools by non-programmer business users.

As these Citizen AI Engineers (formalisation through capitalisation is only around the corner) start to define themselves, their ability to plug packaged backend AI intelligence into frontline business use cases will be validated if they connect the blocks together (even if by haphazard luck) in the right way.

At the risk of naysaying and even with the 'benefit' of synthetic data to plug gaps, the danger here is that AI built this way could sometimes lack enough fine-grained bias control, software engineering excellence, software integration power and forethought for scalability.

The truth is, AI is really only just ready for AI marketplaces and some low-code/no-code tooling. So let's avoid AI vending machines for now if we can please. We're smart enough not to, right?