
Overcoming “decay of insight” through utilising the velocity of big data

Big data was one of the hottest IT buzzwords of 2012; the ability to harness big data largely comes down to getting to grips with the so-called “Four V’s of big data”. So, what are the “Four V’s”? They are volume (the scale of data), variety (the different forms of data), velocity (the analysis of streaming data) and veracity (the uncertainty of data).

Velocity plays a significant role in one of the most important aspects of big data, namely storage or “data warehousing”. As the gap between collecting data and analysing it grows, you encounter a phenomenon called ‘decay of insight’: the data you have stored becomes less relevant and less accurate as it ages. For big data to be as useful as possible, it needs to be collected, stored and analysed as quickly as possible; velocity is therefore absolutely key.
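As a purely illustrative sketch (the article does not prescribe any particular model), decay of insight is often approximated as an exponential down-weighting of a record by its age; the 24-hour half-life below is an assumed figure, not one taken from any of the examples that follow:

```python
from datetime import datetime, timezone

def insight_weight(collected_at: datetime, half_life_hours: float = 24.0) -> float:
    """Illustrative 'decay of insight': assume the analytical value of a
    record halves every `half_life_hours` after it was collected."""
    age_hours = (datetime.now(timezone.utc) - collected_at).total_seconds() / 3600.0
    return 0.5 ** (age_hours / half_life_hours)

# A record collected 48 hours ago, with a 24-hour half-life, retains
# roughly 25% of its original weight in any downstream analysis.
```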

Velocity is not just about the speed of data in and out. It is more subtle than raw speed; it is about being able to see something in context. Companies and institutions need to be able to assess incoming data against patterns and benchmarks. For example, telecommunications (Telco) companies such as EE or O2 might want to know how many customers left them on any given day but, more importantly, they might want to dig deeper and see where those customers have moved to and why.

The Telco industry has gone through a significant amount of change in recent years, driven by the widespread adoption of mobile communications in developing countries and the rapid uptake of smart devices across the globe. This has resulted in increased use of 3G/4G networks, which in turn has created a surge in text messaging, mobile internet access and wireless phone calls. The resulting influx of data could be overwhelming, but the solutions being adopted by Telco companies are able to turn this big data into valuable insight.

The business results from this are tangible, as companies can trigger marketing promotions to a selected set of customers, and this can be done in real time across many different markets. Once these promotions are offered to customers, companies can then determine the success of these promotions almost instantly and react accordingly.

One company that has embraced this approach is Ufone, a multi-million-pound, Pakistan-based mobile services provider. The Telco industry in Pakistan and across Asia is crowded, with five large operators all jostling for their share of the market. Pakistan, for example, is experiencing momentous growth: ten years ago the market was around 1 million subscribers; today it is around 120 million.

Ufone is using a combination of big data software services to analyse the masses of data it is collecting more precisely, down to the level of the individual customer. The challenge: Ufone’s “churn” rate, the proportion of customers leaving, was estimated at around 3% per month. As you can imagine, for a company with 24 million subscribers, that figure represents a sizeable chunk of business. By analysing the data coming from customers in real time, Ufone was able to understand particular customers, their likes and dislikes, giving it the ability to instantly offer the best deals and promotions to retain their business.
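To put that churn figure in rough perspective, here is a quick back-of-the-envelope calculation using only the numbers quoted above (roughly 3% monthly churn and 24 million subscribers); it is a sketch, not Ufone’s actual reporting:

```python
subscribers = 24_000_000        # subscriber base quoted above
monthly_churn_rate = 0.03       # estimated churn of roughly 3% per month

churned_per_month = subscribers * monthly_churn_rate
print(f"Customers lost per month: {churned_per_month:,.0f}")   # ~720,000
```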

It isn’t just commercial benefits that come from utilising real-time analytics, either.

Dublin City Council, for example, is responsible for delivering services to 1.2 million people. The roads and traffic department manages the city’s road network and its 1,000 buses. To help with traffic management and deliver a more user-friendly bus timetable, the city deployed what is termed an “intelligent traffic control solution”, which uses geospatial data from GPS-equipped buses to visually display the near real-time position of each bus on a digital map of the city. Controllers can then pinpoint areas where delays are occurring, instantly access live camera streams, and identify the causes and possible solutions.
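The article does not describe the Dublin system’s data formats or thresholds, so the sketch below is hypothetical: it assumes each bus reports a timestamped GPS fix plus a deviation from its timetable, discards fixes that are too old to trust, and flags buses running more than a few minutes late.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class BusUpdate:
    bus_id: str
    lat: float
    lon: float
    reported_at: datetime           # when the GPS fix was taken
    schedule_deviation_s: float     # seconds behind (+) or ahead (-) of timetable

STALE_AFTER = timedelta(seconds=60)    # assumed freshness window
DELAY_THRESHOLD_S = 300                # assumed: flag buses more than 5 minutes late

def classify(update: BusUpdate, now: datetime) -> str:
    """Discard out-of-date fixes first, then flag genuine delays."""
    if now - update.reported_at > STALE_AFTER:
        return "stale fix - ignore"    # even slightly old data is unusable here
    if update.schedule_deviation_s > DELAY_THRESHOLD_S:
        return "delayed - check nearby camera feeds"
    return "on time"
```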

Harnessing and understanding velocity in a situation like this is essential – the system simply cannot work with data which is even slightly out of date.

Stepping back and looking at all of these examples and the different industries and sectors embracing big data for business advantage, velocity, it seems, is not just about how quickly one piece of software can process data; it is far more nuanced than that. Velocity is about really understanding the context of the data you have and being able to process it in real time, making sense of it and using it to deliver better business or better services. By being able to react to big data instantly and confidently, companies are able to deliver far better services for consumers.

 

Chris Nott is CTO Big Data & Analytics at IBM UK & Ireland
