By Laurence Armiger, Director, Zizo
For the past two decades organisations have been looking to delve deeper into customer information in order to improve retention and drive up sales. The Big Data phenomenon has simply reinforced the importance of transforming business through data, with organisations looking to enrich internal data sets with the addition of new, increasingly open, data sources.
From weather to crime, smart meters and traffic, the diversity of open data sources is incredibly exciting. But it is also creating a challenge. Not every data set will deliver significant incremental value; it may take 20, 50, even 100 different data sources before a company hits on the killer piece of information that delivers real business transformation. So how can organisations experiment with this data quickly and efficiently without incurring the untenable costs associated with traditional data analytics?
Here, I explain the value of the next generation analytics platforms that enable the fast, low cost data experimentation required to truly explore and exploit these new, untapped open data sources.
Unprecedented Data Choice
This concept of data enrichment is not new — leading retailers, for example, have integrated demographic data sets to enhance customer data not only to improve customer understanding but also inform key strategic decisions such as the location of new stores. However, the sheer volume and innovation associated with the new generation of open data sources is radically transforming the opportunity.
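The kind of enrichment described above, joining customer records to a demographic data set, can be sketched in a few lines. This is purely illustrative: the field names, postcode districts and values below are invented for the example, not drawn from any real data set.

```python
# Hypothetical sketch of data enrichment: customer records are joined
# to an open demographic data set keyed on postcode district.
# All fields and figures here are illustrative assumptions.

customers = [
    {"id": 1, "postcode": "MK9 1BQ", "annual_spend": 1200},
    {"id": 2, "postcode": "LU1 2SQ", "annual_spend": 340},
]

# Open demographic data, aggregated by postcode district (e.g. "MK9").
demographics = {
    "MK9": {"median_income": 34000, "population_density": "high"},
    "LU1": {"median_income": 27000, "population_density": "medium"},
}

def enrich(customer, demographics):
    """Attach demographic attributes matched on postcode district."""
    district = customer["postcode"].split()[0]
    extra = demographics.get(district, {})
    return {**customer, **extra}

enriched = [enrich(c, demographics) for c in customers]
```

The join key here is the postcode district, a common choice because most open demographic data is published at area level rather than per household.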
From the familiar, such as weather, crime statistics and flood information, to a raft of incredibly innovative, real-time data sources including smart metering and traffic information, the opportunities for enriching existing customer data with open data grow almost daily.
The choices are virtually unlimited, and with that opportunity comes a problem: how can organisations begin to experiment with these data sets without incurring prohibitive costs?
The biggest cultural challenge for organisations is that not all of these data sets will deliver significant incremental value — certainly not enough to justify the mammoth data warehouse-style investments that have dominated the market for the past two decades. Incorporating new data sets within a traditional data warehouse requires significant and time-consuming redevelopment and testing, which simply cannot be justified for data that may or may not reveal any compelling insight.
Big data is all about being first: organisations need to be able to experiment with these data sets quickly, effectively and cheaply. The more companies think about potential uses for these open data sources and ways of combining diverse data sets, the more chance they have of discovering a killer application or piece of data that is going to deliver massive business value. Open data has the power to transform business — but it might take 100 different data sources before a company hits on the right one. Trying to do that with an old-style data warehouse is impossible.
So how can a company begin to experiment with these compelling yet diverse data sets and evolve data analytics into new areas that could deliver a massive bottom line impact? The latest generation of analytics database technologies has been designed not only to manage vast data quantities but also to compress that data into manageable, and affordable, volumes and provide the business with the insight required. Exploiting innovation in areas such as data compression and pattern matching, these solutions not only require minimal infrastructure, and hence cost, but also deliver a new way of locating information within the mass of data, enabling rapid exploration of each new open data set.
With the right model organisations can embrace these new data sources quickly and efficiently — and that includes the new real-time feeds coming on stream, driven by the Internet of Things. Take the smart city project underway in Milton Keynes, which is collecting vast amounts of data relevant to city systems from a variety of sources: local and national open data repositories; data streams from key infrastructure networks, including energy, transport and water; other relevant sensor networks, such as weather and pollution data; satellite data; and data crowdsourced from social media or through specialised apps.
The diversity of data being collected and shared is amazing and will provide a chance for innovative organisations to devise extraordinary new ways of exploiting that data to drive better business. For example, can a shopping centre combine the empty/occupied tweets from the city’s parking spaces with existing footfall measures to improve understanding of patterns of shopping centre usage? The value is not only in gaining better understanding of the dynamics of the high street but also perhaps in selling that insight to the retail community — even reconsidering the value of different shop units based on shopping activity.
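The parking-and-footfall idea above amounts to a simple join on time of day. As a minimal sketch, assuming an open feed of per-space "empty"/"occupied" events and an hourly footfall count (both formats invented for illustration, not a real Milton Keynes API):

```python
# Illustrative sketch: correlating open parking-occupancy events with a
# shopping centre's hourly footfall counts. The feed shape and the
# numbers are assumptions made for this example.

parking_events = [
    {"hour": 9,  "status": "occupied"},
    {"hour": 9,  "status": "occupied"},
    {"hour": 9,  "status": "empty"},
    {"hour": 10, "status": "occupied"},
]

footfall = {9: 450, 10: 610}  # visitors counted per hour

def occupancy_by_hour(events):
    """Share of 'occupied' reports per hour from the open feed."""
    totals, occupied = {}, {}
    for e in events:
        totals[e["hour"]] = totals.get(e["hour"], 0) + 1
        if e["status"] == "occupied":
            occupied[e["hour"]] = occupied.get(e["hour"], 0) + 1
    return {h: occupied.get(h, 0) / totals[h] for h in totals}

# Join the two series on the hour to study how parking pressure
# tracks footfall across the day.
combined = {
    h: {"occupancy": rate, "footfall": footfall.get(h)}
    for h, rate in occupancy_by_hour(parking_events).items()
}
```

The result pairs each hour's parking pressure with its footfall, the raw material for the usage-pattern analysis, and the resale of that insight, described above.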
Alternatively, a retailer can use smart information feeds from utilities about planned and emergency works that may affect customer access to stores, automatically contact customers due to collect click & collect orders that day, and suggest an alternative location. This is a simple but effective way of not only ensuring the click & collect service is unaffected but also improving overall customer perception.
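That click & collect scenario is essentially a cross-reference between two feeds. A hedged sketch follows; the store names, feed fields and the alternatives mapping are all hypothetical, standing in for whatever a real utility feed and order system would provide:

```python
# Hypothetical sketch: cross-referencing a utility's planned-works feed
# with today's click & collect orders, so affected customers can be
# offered an alternative store. All names and fields are invented.

planned_works = [
    {"area": "town-centre", "impact": "road closure"},
]

orders = [
    {"customer": "alice", "store": "town-centre"},
    {"customer": "bob", "store": "retail-park"},
]

def customers_to_notify(orders, works, alternatives):
    """Return (customer, suggested alternative store) for each affected order."""
    affected_areas = {w["area"] for w in works}
    return [
        (o["customer"], alternatives[o["store"]])
        for o in orders
        if o["store"] in affected_areas
    ]

# Assumed lookup of the nearest unaffected store for each location.
alternatives = {"town-centre": "retail-park"}
notify = customers_to_notify(orders, planned_works, alternatives)
```

Only the customer whose collection point sits inside an affected area is flagged, which is what lets the contact step be fully automatic.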
These are just some of the more obvious applications for open data sources — the potential value of enriching existing customer data with these massively diverse information sets is impossible to predict in advance.
Time for Change
Companies have begun to recognise that the old-fashioned goliath databases have no place in the big data era. Many, however, have been understandably reluctant to walk away from multi-million-pound investments in customer databases to embrace the new breed of analytics databases, despite the clear benefits of cost, speed and responsiveness.
The rise and rise of open data sources is the final nail in the coffin for the legacy approach. This new data model is all about speed of response and about creating actionable insights within a timeframe that enables truly effective business change.
Data enrichment is fast becoming an essential element of every big data strategy — isn't it time for every business to accept the need for a new analytics model that enables fast, effective and affordable data experimentation?