17 high-paying tech skills that will go higher

Despite more volatility than normal, pandemic conditions have benefitted market values for some skills more than others.


Market value increase: 13.3 percent (in the six months through April 1, 2021)

Apache Pig is a high-level platform for analysing large data sets that consists of a high-level language for expressing data analysis programs, coupled with infrastructure for evaluating these programs. The salient property of Pig programs is that their structure is amenable to substantial parallelisation, which in turn enables them to handle very large data sets. Right now, Pig's infrastructure layer consists of a compiler that produces sequences of Map-Reduce programs, for which large-scale parallel implementations already exist (e.g., the Hadoop subproject). Pig can execute its Hadoop jobs in MapReduce, Apache Tez, or Apache Spark. Pig's language layer currently consists of a textual language called Pig Latin, which abstracts the programming from the Java MapReduce idiom into a notation that makes MapReduce programming high-level, similar to what SQL is for relational database management systems. Pig Latin can be extended using user-defined functions (UDFs), which the user can write in Java, Python, JavaScript, Ruby, or Groovy and then call directly from the language.

Pig's language layer has the following key properties:

  • Ease of programming. It is trivial to achieve parallel execution of simple, "embarrassingly parallel" data analysis tasks. Complex tasks composed of multiple interrelated data transformations are explicitly encoded as data flow sequences, making them easy to write, understand, and maintain.
  • Optimisation opportunities. The way in which tasks are encoded permits the system to optimise their execution automatically, allowing the user to focus on semantics rather than efficiency.
  • Extensibility. Users can create their own functions (UDFs) to do special-purpose processing, as sketched below.
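
To make the UDF idea concrete, here is a rough sketch of a word-counting Python UDF and the Pig Latin that would call it. The file, alias, and function names are all hypothetical, and the script only does real work when run inside Pig.

```python
# count_words.py -- a hypothetical Pig UDF written in Python. Under
# Pig's Jython runner the outputSchema decorator is injected
# automatically; the fallback below keeps the file importable as
# plain Python as well.
try:
    outputSchema  # provided by Pig when the script runs under Jython
except NameError:
    def outputSchema(schema):
        def decorate(fn):
            return fn
        return decorate

@outputSchema("word_count:int")
def count_words(line):
    """Count whitespace-separated tokens in one input line."""
    return 0 if line is None else len(line.split())

# Registered and called from Pig Latin roughly like this:
#   REGISTER 'count_words.py' USING jython AS myudfs;
#   lines  = LOAD 'input.txt' AS (line:chararray);
#   counts = FOREACH lines GENERATE myudfs.count_words(line);
```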
  1. [Tie] Cryptography (encryption, VPN, SSL/TLS, Hybrids)

Data Architecture

Ethereum

Identity and access management

PyTorch

Market value increase: 6.3 percent (in the six months through April 1, 2021)

Cryptography (or cryptology) is the practice and study of techniques for secure communication in the presence of third parties called adversaries. More generally, cryptography is about constructing and analysing protocols that prevent third parties or the public from reading private messages. Modern cryptography exists at the intersection of the disciplines of mathematics, computer science, electrical engineering, communication science, and physics and includes various aspects of information security such as data confidentiality, data integrity, authentication, and non-repudiation. Applications of cryptography include electronic commerce, chip-based payment cards, digital currencies, computer passwords, and military communications.
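
As a small taste of the practice (an illustration, not part of the survey), authenticated symmetric encryption takes only a few lines in Python, assuming the third-party `cryptography` package is installed:

```python
# Minimal sketch: authenticated symmetric encryption with the Fernet
# recipe from the `cryptography` package (AES-CBC plus an HMAC tag,
# covering both confidentiality and integrity).
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # fresh urlsafe base64-encoded key
f = Fernet(key)

token = f.encrypt(b"attack at dawn")   # opaque, authenticated ciphertext
print(f.decrypt(token))                # b'attack at dawn'
```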

Data architecture is the process of standardising how organisations collect, store, transform, distribute, and use data. The goal is to deliver relevant data to people who need it, when they need it, and to help them make sense of it. Driving skills demand is the skyrocketing growth and availability of real-time data from internal and external sources, with business strategists demanding more and faster insights from that data. The promise of modern data architecture is that a well-designed process puts business strategists and technical experts at the same table. Together, they can determine what data is needed to propel the business forward, how that data can be sourced, and how it can be distributed to provide actionable information for decision makers.

What’s also pushed big data into the real world is the growing influence of the cloud, which provides the kind of fast, easy, and low-cost scalability that modern data architecture requires. The cloud also allows organisations to pool much or all of their data in one place, where ideally, one master version of the data is available to all who need it.

Data architecture in its current phase has to be built around certain characteristics, which are also prerequisites for earning cash pay premiums:

  • User-driven: In the past, data was static and access was limited. In modern data architecture, business users can confidently define the requirements, because data architects can pool data and create solutions to access it in ways that meet business objectives.

  • Built on shared data: Effective data architecture is built on data structures that encourage collaboration. Good data architecture eliminates silos by combining data from all parts of the organisation, along with external sources as needed, into one place to eliminate competing versions of the same data. In this environment, data is not bartered among business units or hoarded, but is seen as a shared, companywide asset.

  • Automated: Automation removes the friction that made legacy data systems tedious to configure. Processes that took months to build can now be completed in hours or days using cloud-based tools. If a user wants access to different data, automation enables the architect to quickly design a pipeline to deliver it. As new data is sourced, data architects can quickly integrate it into the architecture (a toy sketch follows this list).

  • Driven by AI: Smart data architecture takes automation to a new level, using machine learning (ML) and artificial intelligence (AI) to adjust, alert, and recommend solutions to new conditions. ML and AI can identify data types, identify and fix data quality errors, create structures for incoming data, identify relationships for fresh insights, and recommend related data sets and analytics.

  • Elastic: Elasticity allows companies to scale up or down as needed. Here, the cloud is your best friend, as it allows on-demand scalability quickly and affordably. Elasticity allows administrators to focus on troubleshooting and problem solving rather than on exacting capacity calibration or overbuying hardware to keep up with demand.

  • Simple: Simplicity trumps complexity in efficient data architecture. Do you need a show dog or a workhorse? Strive for simplicity in data movement, data platforms, data assembly frameworks, and analytic platforms.

  • Secure: Security is built into modern data architecture, ensuring that data is available on a need-to-know basis as defined by the business. Good data architecture also recognises existing and emerging threats to data security, and ensures regulatory compliance with legislation like HIPAA and GDPR.
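
As a toy illustration of that automation, a modern stack can infer a schema from incoming data and land it in a shared store without hand-written DDL. The file, database, and table names below are hypothetical.

```python
# Toy sketch: ingest a CSV with an inferred schema and land it in a
# shared store. File, database, and table names are hypothetical.
import sqlite3

import pandas as pd

df = pd.read_csv("incoming/events.csv")   # column types inferred on read
print(df.dtypes)                          # the schema pandas inferred

with sqlite3.connect("warehouse.db") as conn:
    # to_sql creates the table to match the inferred schema on first
    # load and appends on later loads, so no hand-written DDL is needed.
    df.to_sql("events", conn, if_exists="append", index=False)
```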

Ethereum is one of the most popular decentralised, open source, public blockchain-based distributed computing platforms and operating systems for smart contract functionality. If you want to become a blockchain expert, learning how to build apps on Ethereum is a great place to start. It is the second-largest cryptocurrency platform by market capitalisation, behind Bitcoin, serving as the platform for over 1,900 different cryptocurrencies and tokens, including 47 of the top 100 cryptocurrencies.
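
By way of illustration (a rough sketch, not part of the survey), reading chain state from an Ethereum node takes only a few lines with the web3.py library; the RPC endpoint below is a placeholder you would replace with a real provider.

```python
# Rough sketch: querying an Ethereum node with web3.py. The RPC URL is
# a placeholder; point it at a real provider before running.
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://rpc.example.invalid"))

# The zero address contains no letters, so no checksumming is needed.
ZERO = "0x0000000000000000000000000000000000000000"

print("latest block:", w3.eth.block_number)      # current chain height
balance_wei = w3.eth.get_balance(ZERO)           # balance in wei
print("balance:", balance_wei / 10**18, "ETH")   # 1 ETH = 10**18 wei
```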

Identity and access management (IAM) in enterprise IT is about defining and managing the roles and access privileges of individual network entities (users and devices) to a variety of cloud and on-premises applications. Users include customers, partners and employees; devices include computers, smartphones, routers, servers, controllers and sensors. The core objective of IAM systems is one digital identity per individual or item. Once that digital identity has been established, it must be maintained, modified and monitored throughout each user’s or device’s access lifecycle.

Identity has become more important since the COVID pandemic made physical boundaries irrelevant, with the aggressive move to remote work giving more users outside the organisation greater access to internal systems. With digital transformation accelerating, identity has become the cornerstone of customer acquisition, management, and retention, and COVID-caused disruption has surfaced weaknesses in many organisations’ IAM architecture and greatly accelerated IAM evolution.
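
The core idea, one identity whose roles gate access to resources, fits in a few lines. The sketch below is a deliberately toy role-based model with hypothetical names, not any real IAM product's API.

```python
# Toy sketch of the core IAM idea: one digital identity, with roles
# that gate access to resources. All names here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Identity:
    user_id: str
    roles: set = field(default_factory=set)

# Role -> permitted (resource, action) pairs: a tiny policy table.
POLICY = {
    "engineer": {("repo", "read"), ("repo", "write")},
    "auditor":  {("repo", "read"), ("logs", "read")},
}

def is_allowed(identity: Identity, resource: str, action: str) -> bool:
    """Grant access only if some role held by the identity permits it."""
    return any((resource, action) in POLICY.get(role, set())
               for role in identity.roles)

alice = Identity("alice@example.com", {"auditor"})
print(is_allowed(alice, "repo", "read"))    # True
print(is_allowed(alice, "repo", "write"))   # False
```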

PyTorch is an open source machine learning framework based on the Torch library that accelerates the path from research prototyping to production deployment. It is used for applications such as computer vision and natural language processing, and is primarily developed by Facebook's AI Research lab (FAIR). Although the Python interface is more polished and the primary focus of development, PyTorch also has a C++ interface. A number of pieces of deep learning software are built on top of PyTorch, including Tesla Autopilot, Uber's Pyro, PyTorch Lightning, and Catalyst.

PyTorch provides two high-level features:

  • Tensor computing (like NumPy) with strong acceleration via graphics processing units (GPUs)
  • Deep neural networks built on a tape-based automatic differentiation system (both features are illustrated in the sketch below)
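
A minimal sketch of both features follows; it falls back to the CPU when no GPU is available.

```python
# Minimal sketch of PyTorch's two core features: GPU-ready tensors and
# tape-based autograd. Falls back to CPU when no GPU is available.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"

# Tensor computing, NumPy-style, optionally on the GPU.
x = torch.randn(3, 3, device=device)
y = x @ x.T + 1.0                       # matrix multiply, broadcast add

# Tape-based automatic differentiation: record ops, then backpropagate.
w = torch.ones(3, requires_grad=True)
loss = (w * torch.tensor([1.0, 2.0, 3.0])).sum()
loss.backward()                         # populates w.grad
print(w.grad)                           # tensor([1., 2., 3.])
```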

Key features and capabilities of PyTorch include:

  • Production ready. Transition seamlessly between eager and graph modes with TorchScript, and accelerate the path to production with TorchServe (a TorchScript sketch follows this list).
  • Distributed training. Scalable distributed training and performance optimisation in research and production is enabled by the torch.distributed backend.
  • Robust ecosystem. A rich ecosystem of tools and libraries extends PyTorch and supports development in computer vision, NLP, and more.
  • Cloud support. PyTorch is well supported on major cloud platforms, providing frictionless development and easy scaling.
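
The TorchScript sketch promised above compiles an eager-mode function into a static graph that can be saved as a portable, Python-free artifact; the function itself is an arbitrary toy.

```python
# Minimal TorchScript sketch: compile an eager-mode function to a
# static graph that can be saved and later served without Python.
import torch

@torch.jit.script
def clamp_sum(x: torch.Tensor) -> torch.Tensor:
    # Python control flow is captured by the TorchScript compiler.
    if bool(x.sum() > 0):
        return x.clamp(min=0.0).sum()
    return x.sum()

print(clamp_sum(torch.tensor([-1.0, 2.0, 3.0])))   # tensor(5.)
torch.jit.save(clamp_sum, "clamp_sum.pt")          # portable artifact
```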
  1. [Tie] IT Governance

Market value increase: 14.3 percent (in the six months through April 1, 2021)

At its essence, IT governance provides a structure for aligning IT strategy with business strategy. By following a formal framework, organisations can produce measurable results toward achieving their strategies and goals. A formal program also takes stakeholders' interests into account, as well as the needs of staff and the processes they follow. In the big picture, IT governance is an integral part of overall enterprise governance.

But what is driving IT governance's popularity right now, resulting in higher pay premiums than before? We believe it's because organisations are subject to more and more regulations governing the protection of confidential information, financial accountability, data retention, and disaster recovery, among other things. They're also under more pressure from shareholders, stakeholders, and customers. To ensure they meet internal and external requirements, more organisations are implementing formal IT governance programs that provide a framework of best practices and controls. This applies to both public- and private-sector organisations; a formal IT governance program should be on the radar of any organisation in any industry that needs to comply with regulations related to financial and technological accountability. Implementing a comprehensive IT governance program requires a lot of time, effort, and above all expertise, and that expertise is rewarded with pay premiums.

There’s also GRC (governance, risk, and compliance), which is practically the same thing as IT governance but necessarily incorporates security domains. While GRC is the parent program, what determines which framework is used is often the placement of the CISO and the scope of the security program. For example, when a CISO reports to the CIO, the scope of GRC is often IT focused. When security reports outside of IT, GRC can cover more business risks beyond IT.

  1. [Tie] Clojure

Prescriptive Analytics

Risk management

Teradata

Market value increase: 6.7 percent (in the six months through April 1, 2021)

Clojure is a general-purpose, dynamic, compiled, and predominantly functional programming language from the Lisp family tree. Amazon, Staples, and Walmart are just some examples of major companies that use it in their technology stacks. Clojure embraces functional programming (FP): functions are treated as first-class citizens, and data is immutable by default. When you create lists, maps, vectors, etc., they are immutable by definition.

Functional features of Clojure include:

  • Declarative programming model. You express the logic of a program's structure and elements (what you want the data to do) without having to describe its control flow (how it's done).
  • Support for higher-order functions. These are functions that can take functions as arguments and/or return functions as results.
  • Immutable persistent data structures. When a change occurs, the old data structure is preserved, and a new structure is returned that combines the unchanged parts of the old structure with the newly created data. Because they are immutable, they eliminate many of the errors typical of concurrent programming. (Two of these ideas are sketched in Python after this list.)
  • Absence of side effects. While the complete absence of side effects is impossible in real-world applications, Clojure's immutable information model does a good job of isolating them, and the language makes side effects explicit in its syntax.
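
For readers more at home in Python, two of these ideas, higher-order functions and immutable updates, can be sketched there for comparison. Clojure expresses both natively, and its persistent structures additionally share memory between versions, which plain tuples do not.

```python
# Python rendering of two ideas from the list above: higher-order
# functions and immutable updates. Clojure expresses both natively.
from functools import reduce

# A higher-order function: takes two functions, returns a new one.
def compose(f, g):
    return lambda x: f(g(x))

inc_then_double = compose(lambda x: x * 2, lambda x: x + 1)
print(inc_then_double(3))    # 8

# Immutable "update": the original tuple is preserved; a new one is built.
xs = (1, 2, 3)
ys = xs + (4,)
print(xs, ys)                # (1, 2, 3) (1, 2, 3, 4)

# Declarative reduction instead of a loop over mutable state.
print(reduce(lambda acc, x: acc + x, ys, 0))   # 10
```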

Clojure is unique in several ways, which may be why employers are willing to pay higher cash premiums for it. One is that it was designed to be a hosted language: instead of defining its own platform (as Python, Ruby, Java, etc. have done), Clojure was meant to take advantage of existing platforms and build on top of them. Clojure is currently developed on two platforms, the Java Virtual Machine and JavaScript. Clojure has incredible reach, running wherever Java does, in any web browser, or on any mobile device. While most functional languages, such as Scala and Haskell, tend toward static types, Clojure is dynamic. The language's REPL (Read-Eval-Print Loop) makes it easier to catch errors as you code, and dynamism makes code more flexible and extensible.
