Shifting the 90/10 factor by operationalising AI: How to make the most of your data science talent

How can organisations operationalise AI and create the right processes to enable a structured approach that improves data insights?

This is a contributed article by Björn Brinne, Head of Data Science, Peltarion

The main driving forces behind the popularisation of AI are centred on the will to solve business and organisational problems. But the technology will only gain traction if resources are allotted intelligently and lead to improved outcomes, which hinges on two key factors: productivity and skills. Organisations need to free up data science resources to focus exclusively on value-generating activities, while also widening the pool of talent that can contribute to AI projects.

To do this, organisations need to support the data science team with solutions that reduce effort spent on support tasks, increasing the time that can be spent on high-value work. The 90/10 rule in data science claims that 90% of the benefits of data analytics come from the first 10% of effort. That high-value 10% includes strategic planning, model optimisation and creating the right processes to enable a structured approach that will further and improve data insights.

So how can organisations ensure they get off to the right start? And how can organisations ensure they are making the most of the talent available to them?

The data scientist, still a rare breed

Data scientist is one of the most sought-after roles in industry today; in fact, job postings for data scientists have risen 75% over the last three years. IBM predicts demand for 700,000 more data scientists by 2020 in the U.S. alone, while the European Commission estimates that 100,000 data-related jobs will be created in Europe by 2020. However, qualified, talented data scientists remain hard to find and expensive.

The recent explosion in demand for data science experts in industry has created a talent gap, with large, deep-pocketed technology companies securing most of the best talent. If this continues, we will see a widening gap emerge between the haves and have-nots in terms of data science talent, giving larger players a vast advantage and making it harder than ever for smaller organisations to compete.

Adding to the challenge is the fact that you don't need just any data scientist; you need the right data scientist, mapped to your project and objectives. Historically, most people working in data science came from academia, not business. Yet aligning your data science team to your business objectives over time is imperative.

For AI to be operational and make sense for businesses, it must deliver greater output and value from reduced input. And this has to hold over time, not just as a one-off boost. To get the most value from data, and to drive productivity from AI projects, the data scientist is a key resource on your team. In this context, making your AI team's time as productive and efficient as possible is vital.

Productivity, the economic measure of output per unit of input

Productivity improvements come when your data scientists and your team can focus mainly on high-value activities, like modelling, insight, and business domain areas and outcomes, rather than on all the support processes. Every percentage point of effort that can be shifted in this equation improves productivity dramatically. Freeing data scientists up for more value-add tasks has a direct impact on entire teams and on the project overall.

There's healthy potential to shift the formula and "upskill" the talent pool. For example, moving toward "productising" tasks would allow more narrowly skilled or less experienced team members (such as developers) to take more involved and leading roles. Productising allows the platform, rather than the individual, to undertake tasks such as experiment management and auditing, and to take on more and more of these responsibilities over time.

Giving AI teams an easy-to-use, GUI-based end-to-end platform is another way to reduce the overall effort and free up data science teams to focus on more high-value tasks. In addition, speeding up the experimentation cycle has a big impact: reducing the time to import and process data, creating a deep learning model to test in minutes, and running tests on models quickly and in parallel.
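To make the parallel-experimentation idea concrete, here is a minimal, hedged sketch in Python. The `train_and_score` function is a hypothetical stand-in for a real training run (in practice it would import data, build a deep learning model and return a validation metric); the point is only to show how several configurations can be evaluated concurrently so the experimentation cycle stays short.

```python
from concurrent.futures import ThreadPoolExecutor

def train_and_score(config):
    """Hypothetical stand-in for a real training run.

    A real implementation would load data, build and train a model
    from `config`, and return a validation score. Here we use a toy
    deterministic formula so the example is self-contained.
    """
    return config["layers"] * 0.1 + config["units"] / 1000

# Candidate model configurations to try in one experimentation cycle.
configs = [
    {"layers": 2, "units": 64},
    {"layers": 3, "units": 128},
    {"layers": 4, "units": 256},
]

# Run all experiments in parallel and collect their scores.
with ThreadPoolExecutor() as pool:
    scores = list(pool.map(train_and_score, configs))

# Keep the best-scoring configuration.
best = configs[scores.index(max(scores))]
```

Running candidate models side by side like this, instead of one after another, is what collapses a multi-day tuning loop into hours; a platform simply does the same orchestration without the data scientist writing the plumbing.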

Shifting the 90/10 factor

Operationalising AI and taking a platform approach enables better use of resources, helping to keep data scientists focused on high-value tasks while also opening the field to those with less experience. In AI projects, productivity metrics hinge on freeing up your data science resources to focus exclusively on high-value activities: shifting the 90/10 factor and eliminating the lower-value support functions from their repertoire. By shifting the 90/10, AI will become more affordable and accessible, levelling the playing field so that all organisations can gain value from AI.

Björn Brinne is the head of data science at Peltarion. Previously head of business intelligence Stockholm for King, he has over a decade of experience working in data science for companies such as Electronic Arts and Bwin Games. He joined Peltarion in November 2016 to help the company in its mission to make AI and deep learning accessible, affordable and reliable for everyone.