Hyperautomation without the hype

Named the number one strategic technology trend for 2020, hyperautomation is predicted to generate a global addressable market of almost $600 billion in 2022. For firms seeking to sharpen their competitive edge and capture greater business value, it is becoming increasingly important to understand hyperautomation and to lay the foundations for adopting it successfully.


Historically, automation workflows have focused on processing structured data in a highly prescribed manner. However, with 80-90% of the world’s data being unstructured, there is a massive, hitherto untapped potential to create a competitive edge with automation. The most effective processor of unstructured data has always been the human brain, but with no known way to deploy humans at scale economically, the possibilities have gone relatively unexplored.

Hyperautomation is the technological discipline that aims to crack this scalability issue and routinely incorporate unstructured data into the next generation of workflow automation, greatly increasing the pool of automation opportunities that firms can exploit to create a competitive advantage.

Putting the hyper in hyperautomation

In the context of data, the term “unstructured” is a misnomer. Images, portable documents, and audio files are all classified as unstructured data, yet all of them are highly structured. The real issue is that unstructured data isn’t in a form that existing systems can process. Existing systems expect information flows to communicate not only the data but also, either implicitly or explicitly, what that data means. Unstructured data does not do this: an image file doesn’t tell you whether it depicts a cat or an order form. You must deduce its content through inference and cognition.
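
To make the gap concrete, here is a minimal Python sketch; the record fields and file name are invented purely for illustration.

    # A structured record carries its own meaning: every value is labelled.
    order = {"customer_id": "C-1042", "item": "widget", "quantity": 3}
    print(order["quantity"])  # a workflow system can act on this directly

    # An image file is just bytes; nothing in the stream labels its content.
    with open("scan.png", "rb") as f:
        pixels = f.read()
    # Is it a cat photo or an order form? The bytes alone do not say;
    # an inference step must deduce the content before a workflow can use it.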

Gartner’s definition of hyperautomation relates to an alphabet soup of underlying technologies, but the element that really puts the “hyper” in hyperautomation is analytics. It is analytics that, to a limited extent, provides the essential human-like cognitive capabilities required to extract information from unstructured data and map it into structured formats that IT systems can process. Task-competent artificial intelligence (AI), such as image recognition and natural language processing, is the principal mechanism. It expands automation opportunities by providing enriched data and contextualisation for what are likely to be relatively well-prescribed automation problems.
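
As a hedged sketch of that extraction step, the Python below uses the open-source pytesseract OCR library to turn a scanned invoice into a structured record. The file name and field patterns are assumptions made for illustration, standing in for whatever extraction models a firm actually deploys.

    import re
    from PIL import Image   # pip install pillow
    import pytesseract      # pip install pytesseract (wraps Tesseract OCR)

    # Step 1: an off-the-shelf OCR model turns the unstructured image
    # into raw text.
    text = pytesseract.image_to_string(Image.open("invoice_scan.png"))

    # Step 2: simple patterns map that text into a structured record that
    # ordinary workflow systems can process. The patterns are illustrative.
    number = re.search(r"Invoice\s*#?\s*(\w+)", text)
    total = re.search(r"Total\s*[:$£]?\s*([\d,.]+)", text)
    invoice = {
        "invoice_number": number.group(1) if number else None,
        "total": float(total.group(1).replace(",", "")) if total else None,
    }
    print(invoice)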

To expand automation opportunities beyond complex data extraction, analytical capabilities will need to evolve to more closely approximate human-like capabilities, supporting more complex decisioning based on the enriched data. That includes making decisions on more open-ended problems under uncertainty, which requires higher levels of reasoning and cognitive function.
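
One common way to handle that uncertainty today is confidence-based routing: trust the model only when it is sure, and escalate to a person otherwise. The threshold and labels below are illustrative, not prescriptive.

    # Confidence-based decisioning: act automatically only when the model
    # is confident; route uncertain cases to a human reviewer.
    REVIEW_THRESHOLD = 0.90  # illustrative; tuned per process in practice

    def route(prediction: str, confidence: float) -> str:
        if confidence >= REVIEW_THRESHOLD:
            return f"auto-process as {prediction}"
        return "queue for human review"

    print(route("order_form", 0.97))  # auto-process as order_form
    print(route("order_form", 0.62))  # queue for human review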

The potential rewards

AI does not need to surpass human capabilities in order to drive significant benefits into hyperautomation. Even distinctly sub-human task competence presents firms with a real opportunity to re-allocate their workforce to achieve their objectives more efficiently. In a recent study, firms shifting mundane tasks from humans to hyperautomated workflows exceeded their expectations and reduced operational costs by 27%. And whilst reducing costs is a laudable objective, arguably the real prize for firms is the opportunity for growth.

Currently, there are no computer CEOs. Growing a business and creating and exploiting new opportunities is a profoundly human endeavour. However, firms adopting hyperautomation can use AI to free up the real intelligence in their workforce so it can be directed towards the growth-oriented activities that only humans are capable of.

Foundations for success

Firms should temper their enthusiasm to (hyper) automate, at least initially, and focus their efforts on creating a solid foundation for their process landscape. This wise investment provides two benefits. First, it enables firms to strip out the redundancy and cruft that inevitably accumulate over time, and to ensure that the firm’s operating processes are clearly and efficiently aligned to its strategic objectives. Second, it promotes a clearer understanding of where automation should and, just as importantly, should not be deployed. Without this investment in preparation, the costs of implementing and maintaining unnecessary or unwise automation solutions, and their associated technical debt, are likely to exceed the benefits they were intended to deliver.

A lack of competent data management is a critical impediment to hyperautomation initiatives. Despite significant investment in this area, firms are becoming less competent as data volumes increase and data infrastructure fragments across cloud and on-premise environments. Accordingly, firms must invest significantly in hybrid integration capabilities that can eradicate data silos between the extremes of on-premise mainframes and cloud data lakes, and in API management solutions that can transparently and securely share this data across the enterprise.
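
The integration idea can be reduced to a toy sketch: one interface that hides which silo a record lives in. Every name here (the fetch functions, the routing rule, the ID scheme) is invented purely to illustrate the facade behind hybrid integration.

    # One API surface; the silo behind it is an implementation detail.
    def fetch_from_mainframe(customer_id: str) -> dict:
        return {"id": customer_id, "source": "on-prem mainframe"}  # stub

    def fetch_from_data_lake(customer_id: str) -> dict:
        return {"id": customer_id, "source": "cloud data lake"}  # stub

    def get_customer(customer_id: str) -> dict:
        # Callers never learn where the record lives; that opacity is
        # precisely what a hybrid integration layer provides.
        if customer_id.startswith("LEGACY-"):
            return fetch_from_mainframe(customer_id)
        return fetch_from_data_lake(customer_id)

    print(get_customer("LEGACY-1042"))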

Making hyperautomation a reality

Hyperautomation solutions hold the promise of a new era for workflow automation. However, the increased opportunities to automate will demand even more emphasis on creating strong business process and data management foundations. Routinely incorporating these foundations into planning and investment decisions will likely maximise the chances of hyperautomation initiatives being successful.

Dr Paul Fermor is UK Solutions Director at Software AG. He specialises in solving data-rich business problems through data analysis and system design, implementation, and optimisation. Fermor has spent his career creating solutions for very large-scale data analysis and real-time processing across a range of applications, from medical imaging to capital markets trading.