Master Data Management

Manuel Sevilla (Europe) - Disruptive Memory Databases

This is the first in a three-part series on the global issue of ‘Big Data’ authored by Capgemini, which will be published every Wednesday until 25th January. Part one looks at Europe.

We are currently seeing a huge evolution in the way companies everywhere store, process and gain insight from business information, in response to the ‘Big Data’ phenomenon: a massive explosion in the amount of information both generated by and available to organizations. The value of that data is becoming ever more critical to business success, offering greater insight into markets and opportunities; at the same time, regulators increasingly mandate how that data must be stored and managed, in particular within the financial services and telecommunications industries.

Fortunately, a number of new approaches and revolutionary tools are now available for business information management, designed specifically to address the issues associated with Big Data. Thanks to developments in technology, including the introduction of 64-bit multi-core processors and the falling price of memory, it is now possible to run databases directly in memory. An in-memory database relies primarily on main memory for data storage, as opposed to traditional database systems, which employ a separate disk storage mechanism.

Moving from disk-based systems to in-memory improves processes drastically, often making them hundreds of times faster, because in-memory systems do not need to wait for information to be retrieved from disk. This means not only that current workflows and processes are highly accelerated, but also that workflows and analytics that were not possible in the past are now becoming a reality. Specifically, in-memory information processing gives companies valuable real-time insight into their business that was not previously available. It means:

•    A bank is able to analyze in real-time every one of its transactions and therefore look out for fraudulent behavior.
•    A retailer is able, in real time, to predict a shortage of stock and fill it before it runs out.
•    An electricity company is able to decide very quickly whether to produce more or less electricity, and how distribution routes should be designed to minimize electricity loss.
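The storage difference behind these scenarios can be sketched with SQLite, which supports both an on-disk file and a pure in-memory database via the `:memory:` connection string. This is an illustrative toy rather than one of the enterprise in-memory platforms the article has in mind, and the table and column names are invented for the example:

```python
import os
import sqlite3
import tempfile
import time

def timed_scan(conn, rows=50_000):
    """Load some synthetic transactions, then time a full aggregate scan."""
    conn.execute("CREATE TABLE txn (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany(
        "INSERT INTO txn (amount) VALUES (?)",
        ((float(i % 100),) for i in range(rows)),
    )
    conn.commit()
    start = time.perf_counter()
    total = conn.execute("SELECT SUM(amount) FROM txn").fetchone()[0]
    return total, time.perf_counter() - start

# In-memory database: every page lives in RAM.
mem_total, mem_elapsed = timed_scan(sqlite3.connect(":memory:"))

# Disk-backed database: pages are read from a file
# (the OS page cache may narrow the gap on a toy dataset this small).
path = os.path.join(tempfile.mkdtemp(), "txn.db")
disk_total, disk_elapsed = timed_scan(sqlite3.connect(path))

assert mem_total == disk_total  # same data, same answer
print(f"in-memory scan: {mem_elapsed:.4f}s, on-disk scan: {disk_elapsed:.4f}s")
```

Both connections return identical results; only where the pages reside differs, which is exactly the lever in-memory systems pull at much larger scale.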

As another example, since the 1990s, European legal authorities have, for law-enforcement reasons, required telco operators to build very large databases able to analyze Call Data Records (CDRs) within a few minutes of each call. As a result, multi-terabyte data warehouses have become the norm for telco operators, and Business Intelligence technology has had to innovate constantly to manage such high volumes of data.
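The essence of that near-real-time CDR analysis can be sketched as a running in-memory aggregate, where each record updates the answer moments after the call ends. The field names and data here are invented for illustration; real CDR schemas are far richer:

```python
from collections import defaultdict

# Hypothetical, simplified CDRs arriving as calls complete.
cdrs = [
    {"caller": "A", "callee": "B", "seconds": 120},
    {"caller": "A", "callee": "C", "seconds": 45},
    {"caller": "B", "callee": "A", "seconds": 300},
]

# Per-caller totals held in main memory, so the aggregate is
# current as soon as each record is processed.
totals = defaultdict(int)
for cdr in cdrs:
    totals[cdr["caller"]] += cdr["seconds"]

print(dict(totals))  # {'A': 165, 'B': 300}
```

A disk-based warehouse would batch-load these records and query them later; keeping the aggregate in memory is what shrinks the lag from hours to minutes or seconds.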

The in-memory revolution is changing the game. Big Data and intense competition are the key drivers which are changing the analytics market. The question is not “should we go to in-memory?”, but “how can we leverage in-memory capabilities for significant competitive advantage?”

Regardless of the type of business or sector they operate in, organizations across Europe are finding new and innovative ways to use information to gain business advantage over their competitors. In-memory information processing is transforming classic business processes with real-time data-mining capabilities, including real-time fraud detection and out-of-stock prediction, which ultimately has a positive effect on the way businesses function and provides significant competitive advantage.

By Manuel Sevilla, Chief Technical Officer for the Global BIM TLI of Capgemini. For more information, please visit Manuel’s blog.

