In the thirty years our firm has been tracking, reporting, analysing and forecasting tech labour compensation and supply/demand, we’ve never seen what’s currently considered normal in the tech labour marketplace. Sure, there have been ‘sellers’ markets’ before, when employees had more of an upper hand in negotiating for promotions, new jobs, higher pay, and valued perks. But today is different: geographical limitations are minimal; workers have more say about when, where, and how much they work; and employers are finally realising that pay is not the motivator they’ve always been convinced it is.
The fact is that organisations are quickly shifting their business models, and creating new ones, in the race to adopt several high-momentum technologies that have deeply altered the depth and variety of skills and experience their workforces require. This has had radical implications for IT talent and workforce dynamics.
One way of measuring these shifts is tracking increases or decreases in the cash pay premiums that employers are willing to pay tech workers, in addition to their salaries, for more than a thousand in-demand tech skills. Our firm has been tracking skills pay since 2000 in our quarterly-updated IT Skills and Certifications Pay Index™ (ITSCPI), currently reporting pay for 584 certified and 661 noncertified tech skills. That’s a lot of skills, and the survey demographics behind the ITSCPI are equally impressive: 88,920 U.S. and Canadian tech professionals at as many as 4,010 private and public sector employers.
Not every employer offers this critical element of compensation for adjusting to extreme market volatility. But those that do value flexibility and pay agility as key ingredients in hiring and retaining the talent they need, when they need it. This gives them tremendous competitive advantage at a time when failure means being acquired by or merged with other more successful companies who have not buckled in the face of unprecedented pressures to ‘meet the moment’ with the skills and talent to build, deliver, and support the new business models.
High paying and going higher
The following noncertified tech skills meet three prerequisites:
- They don’t require formal certification
- They earn workers cash pay premiums well above the average of all 661 skills reported in our latest IT Skills and Certifications Pay Index™
- They recorded gains in cash market value in the six months ending July 1, 2022
No skill below earns less than the equivalent of 17% of base salary, significant considering the average for all skills reported is 9.6% of base. The detailed profiles that follow the list are presented in descending ranked order of cash premium and market value increase (including ties). Not surprisingly, the list contains a number of data management, analytics, AI, risk and infrastructure skills.
In alphabetical order they are:
- Apache Cassandra
- Artificial Intelligence
- Artificial Intelligence for IT Operations
- Azure Machine Learning
- Big Data
- Big Data analytics
- Bioinformatics
- Blockchain
- Complex Event Processing/Event Correlation
- Cryptography
- Data Analytics
- Data Architecture
- Data Engineering
- Data Integration
- DataOps
- Data Strategy
- Deep Learning
- DevOps
- Functional Programming
- Informatica
- Machine Learning
- Microservices
- Natural Language Processing
- Network Architecture
- Neural Networks
- Prescriptive Analytics
- Redis
- Scaled Agile Framework (SAFe)
- Security architecture and models
- Smart Contracts
- Zachman Framework
- Big Data Analytics
Average pay premium: 20% of base salary equivalent
Market value increase: 17.6% (in the six months through July 1, 2022)
Big Data Analytics is the use of advanced analytic techniques against very large, diverse big data sets that include structured, semi-structured and unstructured data, from different sources, and in different sizes from terabytes to zettabytes.
With Big Data Analytics, you can ultimately fuel better, faster decision-making, model and predict future outcomes, and enhance business intelligence. As you build your big data solution, consider open-source software such as Apache Hadoop, Apache Spark and the wider Hadoop ecosystem: cost-effective, flexible data processing and storage tools designed to handle the volume of data being generated today.
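The distributed processing model behind Hadoop and Spark can be illustrated with a toy, single-machine map-reduce pass in plain Python. The function names (`map_phase`, `shuffle`, `reduce_phase`) are illustrative labels for the three classic phases, not any framework’s API:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Emit (key, 1) pairs from one raw record, e.g. a log line
    return [(word.lower(), 1) for word in record.split()]

def shuffle(pairs):
    # Group all values by key, as the framework does between phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Aggregate each key's values into a single result
    return {key: sum(values) for key, values in groups.items()}

records = ["error disk full", "warn retry", "error timeout"]
mapped = chain.from_iterable(map_phase(r) for r in records)
counts = reduce_phase(shuffle(mapped))
# counts["error"] == 2
```

In a real Hadoop or Spark deployment these same three phases run in parallel across a cluster, which is what lets the pattern scale from megabytes to petabytes.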
- Smart Contracts
Average pay premium: 20% of base salary equivalent
Market value increase: 5.3% (in the six months through July 1, 2022)
Smart contracts help you exchange money, property, shares, or anything of value in a transparent, conflict-free way while avoiding the services of a middleman. They’re a product of the decentralised ledger systems that run blockchains, so smart-contract skills are being catapulted along with Ethereum and other platforms into an almost unlimited number of uses, ranging from financial derivatives to insurance premiums, contract breaches, property law, credit enforcement, financial services, legal processes and crowdfunding agreements.
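Production smart contracts run as code on a blockchain (for example, Solidity on Ethereum), but the core idea, an agreement whose release rules are enforced by code rather than a middleman, can be sketched in plain Python. The `EscrowContract` class below and its behaviour are purely illustrative:

```python
class EscrowContract:
    """Toy model of a smart contract's escrow logic: funds are
    released automatically once both parties confirm, with no
    intermediary interpreting the terms."""

    def __init__(self, buyer, seller, amount):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.confirmations = set()
        self.released = False

    def confirm(self, party):
        if party not in (self.buyer, self.seller):
            raise ValueError("unknown party")
        self.confirmations.add(party)
        # The release condition is code, not a clause to be argued over
        if self.confirmations == {self.buyer, self.seller}:
            self.released = True
        return self.released

deal = EscrowContract("alice", "bob", 100)
deal.confirm("alice")   # one confirmation: funds stay locked
deal.confirm("bob")     # both confirmed: funds release automatically
```

On a real chain the same logic would be replicated and executed by every node, which is what makes the outcome transparent and tamper-resistant.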
- [Tie] Artificial Intelligence for IT Operations (AIOps)
Bioinformatics
Average pay premium: 19% of base salary equivalent
Market value increase: 18.8% (in the six months through July 1, 2022)
Artificial intelligence for IT operations (AIOps) is an umbrella term coined by Gartner for the use of big data analytics, machine learning (ML) and other artificial intelligence (AI) technologies to automate the identification and resolution of common IT issues. The systems, services and applications in a large enterprise produce immense volumes of log and performance data, and AIOps uses this data to monitor assets and gain visibility into dependencies within and outside of IT systems. AIOps is also expanded as “Algorithmic IT Operations”, which covers automation, performance monitoring and event correlation, among other capabilities.
AIOps is generally used in companies that use DevOps or cloud computing and in large, complex enterprises. AIOps aids teams that use a DevOps model by giving development teams additional insight into their IT environment, which then gives the operations teams more visibility into changes in production. AIOps also removes much of the risk involved in hybrid cloud environments.
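At the heart of the monitoring described above is statistical anomaly detection over telemetry. Here is a minimal sketch using simple z-score flagging; real AIOps platforms apply far richer ML models, and the threshold and sample latencies below are made-up numbers for illustration:

```python
import statistics

def detect_anomalies(samples, threshold=2.5):
    """Return indices of samples more than `threshold` standard
    deviations from the mean -- a minimal stand-in for the models
    AIOps platforms run over log and performance telemetry."""
    mean = statistics.fmean(samples)
    stdev = statistics.pstdev(samples)
    if stdev == 0:
        return []
    return [i for i, x in enumerate(samples)
            if abs(x - mean) / stdev > threshold]

# Request latencies in milliseconds, with one obvious spike
latencies_ms = [12, 11, 13, 12, 14, 11, 250, 13, 12]
print(detect_anomalies(latencies_ms))  # flags index 6, the 250 ms spike
```

An AIOps pipeline would run this kind of check continuously across thousands of metrics, then correlate the flagged events to surface a probable root cause.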
Bioinformatics is an interdisciplinary field that develops methods and software tools for understanding biological data, in particular when the data sets are large and complex. As an interdisciplinary field of science, bioinformatics combines biology, chemistry, physics, computer science, information engineering, mathematics, and statistics to analyse and interpret biological data. It includes biological studies that use computer programming as part of their methodology, as well as specific analysis “pipelines” that are repeatedly used, particularly in the field of genomics. In experimental molecular biology, bioinformatics techniques such as image and signal processing allow extraction of useful results from large amounts of raw data. In the field of genetics, it aids in sequencing and annotating genomes and their observed mutations. It plays a role in the text mining of biological literature and the development of biological and gene ontologies to organise and query biological data.
Bioinformatics tools aid in comparing, analysing and interpreting genetic and genomic data and more generally in the understanding of evolutionary aspects of molecular biology. At a more integrative level, it helps analyse and catalogue the biological pathways and networks that are an important part of systems biology. Databases are essential for bioinformatics research and applications, covering various information types: for example, DNA and protein sequences, molecular structures, phenotypes and biodiversity. They may contain empirical data (obtained directly from experiments), predicted data (obtained from analysis), or, most commonly, both. They can incorporate data compiled from multiple other databases. These databases vary in their format, access mechanism, and whether they are public or not.
Analysing biological data to produce meaningful information involves writing and running software programs that use algorithms from graph theory, artificial intelligence, soft computing, data mining, image processing, and computer simulation. The algorithms in turn depend on theoretical foundations such as discrete mathematics, control theory, system theory, information theory, and statistics.
Software tools for bioinformatics range from simple command-line tools to more complex graphical programs and standalone web-services available from various bioinformatics companies or public institutions. The combination of a continued need for new algorithms for the analysis of emerging types of biological readouts, the potential for innovative computer simulation experiments, and freely available open code bases have helped to create opportunities for all research groups to contribute to both bioinformatics and the range of open-source software available.
The range of open-source software packages includes titles such as Bioconductor, BioPerl, Biopython, BioJava, BioJS, BioRuby, Bioclipse, EMBOSS, .NET Bio, Orange with its bioinformatics add-on, Apache Taverna, UGENE and GenoCAD. SOAP- and REST-based interfaces have been developed for a wide variety of bioinformatics applications, allowing an application running on one computer in one part of the world to use algorithms, data and computing resources on servers in other parts of the world. There are also bioinformatics workflow management systems: specialised workflow systems designed to compose and execute a series of computational or data-manipulation steps (a workflow) in a bioinformatics application.
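As a flavour of the kind of sequence analysis these toolkits perform, here is a minimal pure-Python sketch of two basic genomics utilities; libraries such as Biopython provide production-grade versions of both:

```python
def gc_content(seq):
    """Fraction of G and C bases in a DNA sequence -- a basic
    genomic statistic computed routinely over whole genomes."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def reverse_complement(seq):
    """Reverse complement of a DNA sequence: pair each base with
    its complement, then reverse the strand."""
    pairs = {"A": "T", "T": "A", "G": "C", "C": "G"}
    return "".join(pairs[base] for base in reversed(seq.upper()))

print(gc_content("ATGCGC"))        # 4 of 6 bases are G or C
print(reverse_complement("ATGC"))  # GCAT
```

Real pipelines chain dozens of such steps (alignment, variant calling, annotation) over files holding millions of reads, which is why the workflow systems mentioned above exist.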
- Blockchain
Average pay premium: 19% of base salary equivalent
Market value increase: 11.8% (in the six months through July 1, 2022)
Blockchain is the innovative database technology that is at the heart of nearly all cryptocurrencies. By distributing identical copies of a database across an entire network, blockchain makes it very difficult to hack or cheat the system. While cryptocurrency and a range of applications including NFT ownership are probably the most popular use for blockchain presently, the technology offers the potential to serve a wide range of applications including DeFi smart contracts.
While any conventional database can store this sort of information, blockchain is unique in that it is totally decentralised: rather than being maintained in one location, by a centralised administrator—think of an Excel spreadsheet or a bank database—many identical copies of a blockchain database are held on multiple computers spread out across a network. These individual computers are referred to as nodes. A blockchain network can track orders, payments, accounts, production and much more. And because members share a single view of the truth, you can see all details of a transaction end-to-end, giving you greater confidence, as well as new efficiencies and opportunities.
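The hash-chaining that makes a blockchain tamper-evident can be sketched in a few lines of Python. This toy chain has no consensus or mining; it only illustrates why altering one block invalidates every later block that other nodes hold:

```python
import hashlib
import json

def hash_block(block):
    # Deterministic SHA-256 hash over a block's contents
    encoded = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(encoded).hexdigest()

def add_block(chain, transactions):
    # Each new block commits to the previous block's hash
    block = {
        "index": len(chain),
        "transactions": transactions,
        "prev_hash": hash_block(chain[-1]) if chain else "0" * 64,
    }
    chain.append(block)
    return chain

def is_valid(chain):
    # Any node can independently re-verify the whole chain
    return all(
        chain[i]["prev_hash"] == hash_block(chain[i - 1])
        for i in range(1, len(chain))
    )

chain = []
add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
assert is_valid(chain)
chain[0]["transactions"][0]["amount"] = 500  # tamper with history
assert not is_valid(chain)                   # every node detects it
```

Because each node holds its own copy and can rerun `is_valid`, a cheater would have to rewrite the chain on a majority of nodes simultaneously, which is what makes the system so hard to hack.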
Blockchain is hot right now. The industry has a staggering projected CAGR of over 69% between 2019 and 2025. According to a report by Research and Markets, the global blockchain market was valued at $4.93 billion in 2021 and is expected to hit $67.4 billion by 2026 and $1.43 trillion by 2030 (and as high as $3.1 trillion by some estimates), driven mainly by the global recognition of cryptocurrencies.
We see greater investment in blockchain in the coming year by the financial industry, as the use of blockchain in the financial sector is anticipated to reach $22.5 billion by 2026. But outside of finance, blockchain in health care is expected to rise to $5.61 billion by 2025.
- [Tie] Deep Learning
Prescriptive Analytics
Security architecture and models
Average pay premium: 19% of base salary equivalent
Market value increase: 5.6% (in the six months through July 1, 2022)
Deep learning, a type of artificial intelligence, is a subset of machine learning: essentially a neural network with three or more layers. These neural networks attempt to simulate the behaviour of the human brain (albeit far from matching its ability), allowing them to “learn” from large amounts of data. While a neural network with a single layer can still make approximate predictions, additional hidden layers help to optimise and refine for accuracy. Deep learning drives many artificial intelligence (AI) applications and services that improve automation, performing analytical and physical tasks without human intervention.
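The layered structure just described can be made concrete with a tiny forward pass in plain Python; the weights, layer sizes and inputs are made-up numbers purely for illustration (real networks learn their weights from data and use libraries like PyTorch or TensorFlow):

```python
def relu(x):
    # Common nonlinearity applied between layers
    return [max(0.0, v) for v in x]

def dense(inputs, weights, biases):
    # One fully connected layer: each output is a weighted sum plus a bias
    return [
        sum(w * v for w, v in zip(row, inputs)) + b
        for row, b in zip(weights, biases)
    ]

# Input layer -> hidden layer -> output layer: the stacked layers
# that put the "deep" in deep learning.
x = [1.0, 2.0]
hidden = relu(dense(x, [[0.5, -0.2], [0.1, 0.3]], [0.0, 0.1]))
output = dense(hidden, [[1.0, -1.0]], [0.0])
print(output)  # a single score, approximately -0.7
```

Training consists of nudging those weight matrices, layer by layer, until the outputs match known examples; with many hidden layers the network can represent the complex patterns behind speech, vision and fraud detection.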
Key to skills demand driving up skills premiums is that deep learning technology lies behind everyday products and services (such as digital assistants, voice-enabled TV remotes, and credit card fraud detection) as well as emerging technologies (such as self-driving cars). Among the industries benefitting from this technology:
- Law enforcement. Deep learning algorithms can analyse and learn from transactional data to identify dangerous patterns that indicate possible fraudulent or criminal activity. Speech recognition, computer vision, and other deep learning applications can improve the efficiency and effectiveness of investigative analysis by extracting patterns and evidence from sound and video recordings, images, and documents, which helps law enforcement analyse large amounts of data more quickly and accurately.