Data 2020: Change is coming

The way businesses handle and manage data will change in 2020, driven by developments in cloud, open source and microservices.

This is a contributed article by Matt Yonkovit, Chief Experience Officer, Percona

There have been plenty of analogies developed around data over the past few years, from data as the new oil to data as nuclear waste, or even bacon. Whichever is your favourite, data will continue to be essential to businesses. However, how businesses handle and manage that data will evolve in 2020, spurred on by changes in areas like cloud, open source and microservices.

How will companies deal with these changes? Who will be their allies in this fight around dealing with data? And why am I now hungry?

Databases will get more autonomous

There is a skills shortage around database implementation, particularly in the cloud. More companies want to take advantage of their data, but they are finding it difficult to run operations successfully at the speed they want to achieve. Developers picking databases to run with their applications just want them to work, without taking on administrative duties or having to become DBAs to make that happen.

Database vendors have responded in the past by launching more managed services - however, this can simply move the problem elsewhere. This year, companies have started talking through how to automate database management and make these instances autonomous and self-healing. It was a big theme at Oracle's customer conference, and we at Percona have launched our own initiatives into making databases in the cloud more autonomous.

Next year, more autonomous database services will become available to meet this need for speed. The important thing to be aware of is how these autonomous services are designed and delivered: what is great for the majority may not be suitable for everyone.

The impact of changes around open source licensing will increase

In 2019, we saw multiple companies change the licenses on their open source projects to protect their operations. The reason is that cloud operators can now launch their own as-a-service offerings built on an open source project, letting customers adopt the project and scale it up rapidly. The problem is how much of that adoption actually benefits and supports the open source project itself.

From a development and support perspective, open source projects need to keep the lights on, and they rely on a mix of community members providing updates in their own time, vendors paying developers to contribute commits, and commercial models that monetise open source projects through support or enterprise features. The ubiquity of the cloud affects all three. However, the move to restrict licenses and shut out cloud providers can backfire.

In 2020, open source licensing will continue evolving to make it easier for everyone to take advantage of what is being created and developed. Access to the source code is a precondition of one of the 'Four Freedoms' that define the Free Software movement, and it is echoed in the Open Source Initiative's own definition. Whatever happens in the world of politics, this freedom of access is worth fighting for.

This is also not as simple as cutting cloud companies out of the loop, as doing so hurts the companies that want to use these projects more than it hurts the cloud operators themselves. This will be a litmus test for the open source projects: if you can carry on growing without the scale and ease of access that cloud provides, then your project will be successful. If you can't grow without that ease of access, then your project is in trouble.

Cloud and lock-in will be a big battleground

Multi-cloud is a growing trend - our research points to companies either wanting to run across multiple platforms or having to do so based on decisions that were made in the past.

The reason for this is lock-in - more specifically, how to avoid being too tied to any one provider. Most CIOs and CTOs have long memories of being overly dependent on the likes of Microsoft or Oracle in the 1990s and 2000s, when those vendors had great technology but customers were effectively tied to them. They don't want to risk repeating that situation. Multi-cloud helps them keep leverage over their vendors.

It's a similar story in the database sector, where companies are breaking away from reliance on any one provider. Our recent survey found that 92 percent of enterprises have more than one database platform in place. Many have more than half a dozen, with a mix of relational and non-relational databases in use, spread across multiple cloud providers, on-prem servers, and partner systems.

This will continue in 2020 - companies need more help picking the right database to run in the cloud and support their applications, and they have to avoid the conflicts of interest that result in poor choices.

Database security and best practices will get more attention

Poorly managed database instances were the culprits behind most of the major security breaches in 2019. When developers are building new applications and looking to get them live as quickly as possible, security planning often gets left behind or forgotten completely.

Making databases secure is not a hard process, as long as best practices are in place and defaults are established that make security impossible to avoid. Put simply, it should be harder to deploy an insecure database than a secure one.
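To make that concrete, here is a minimal sketch of what 'secure by default' can look like in practice: a pre-deployment check that fails when a MySQL-style config file ships with insecure settings. The option names (bind-address, local_infile, require_secure_transport) are real MySQL variables, but the policy, the file layout and the thresholds are illustrative assumptions rather than a complete hardening standard.

#!/usr/bin/env python3
"""Fail a deployment when a MySQL-style config file carries insecure
settings. A sketch only: the option names are real MySQL variables, but
the policy and thresholds are illustrative assumptions."""

import configparser
import sys

# Options that must not carry these values.
FORBIDDEN = [
    ("bind-address", "0.0.0.0", "database listens on all interfaces"),
    ("local_infile", "ON", "LOAD DATA LOCAL INFILE is enabled"),
]
# Options that must be present and set as required.
REQUIRED = [
    ("require_secure_transport", "ON", "TLS is not enforced for clients"),
]

def audit(path):
    # my.cnf is INI-like; allow bare options such as skip-networking.
    cfg = configparser.ConfigParser(allow_no_value=True, strict=False)
    cfg.read(path)
    mysqld = cfg["mysqld"] if cfg.has_section("mysqld") else {}
    findings = []
    for option, bad, message in FORBIDDEN:
        if str(mysqld.get(option, "")).strip().upper() == bad:
            findings.append(f"{option}: {message}")
    for option, wanted, message in REQUIRED:
        if str(mysqld.get(option, "")).strip().upper() != wanted:
            findings.append(f"{option}: {message}")
    return findings

if __name__ == "__main__":
    problems = audit(sys.argv[1] if len(sys.argv) > 1 else "my.cnf")
    for problem in problems:
        print(f"INSECURE: {problem}")
    sys.exit(1 if problems else 0)  # a non-zero exit blocks the deployment

Wired into a CI/CD pipeline, a check like this makes the insecure path the one that takes extra work - exactly the inversion described above.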

Securing databases in the cloud will get a lot more attention in 2020 - from the providers that build security by design into their set-up processes, through to the developers who are responsible for these choices in the first place. Without this attention, security failures will continue to happen.

Cloud costs will lead to complaints

The move to automate and speed up the deployment of applications and their associated database instances will help companies of all sizes deliver what they require, quickly. The challenge is that whatever you automate, you get the results - good or bad - at scale.

When everything is working well, this is great; when you have made poor assumptions in your cost models, or about how many resources your database will consume in the cloud as you scale up, the bill grows far larger, far faster. These projects can create massive problems for IT when teams make the move from capital to operational expenditure: the costs can be much less predictable than initially assumed, and much higher over time.
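A back-of-the-envelope model shows how quickly automated scale-out can run away from a flat budget. Every figure below - the instance price, the replica capacity, the growth rate - is an illustrative assumption, not any provider's real pricing.

"""Back-of-the-envelope model of automated scale-out driving database
spend. Every figure is an illustrative assumption, not real pricing."""

import math

HOURLY_INSTANCE_COST = 0.80   # assumed $/hour for one database instance
HOURS_PER_MONTH = 730
QPS_PER_REPLICA = 5_000       # assumed read capacity of one replica
MONTHLY_GROWTH = 0.25         # assumed 25% month-on-month traffic growth

def monthly_bill(qps):
    # One primary plus however many read replicas the load demands.
    nodes = 1 + math.ceil(qps / QPS_PER_REPLICA)
    return nodes * HOURLY_INSTANCE_COST * HOURS_PER_MONTH

qps = 4_000.0
budget = monthly_bill(qps)  # the flat figure the project was costed on
for month in range(1, 13):
    qps *= 1 + MONTHLY_GROWTH
    print(f"month {month:2d}: ${monthly_bill(qps):,.0f} "
          f"(budgeted ${budget:,.0f})")

Under these assumptions, the bill matches the budget in month one and is more than six times it by month twelve - the automation is working exactly as designed, but the cost model was wrong from the start.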

In 2020, we'll see the first projects come through where companies have been seriously affected by these kinds of problems and are willing to go on the record about them. In time, this will help everyone get a better grasp of how to run IT operations and cloud projects, but it's still a nascent market.

For data in 2020, the problems are potentially huge, as everyone is looking to data as the source of insight that will make them more competitive. That means they can't simply stop and start again when an assumption turns out to be wrong. Getting good advice and consultancy around cloud deployments before going into production can be a big help.


Matt Yonkovit is Percona's Chief Experience Officer, overseeing company strategy and marketing functions, as well as operating as the chief storyteller. Before joining Percona in 2009, Yonkovit worked at MySQL AB and Sun Microsystems as a Solution Architect, building out and optimising high-performance, scalable, and always-available infrastructure for Fortune 500 companies and countless other web properties.