By Dave Greenfield, Product Marketing Manager, Silver Peak

All organisations are aware that ‘big data’ — the use of new search and discovery technologies to extract value from huge volumes of information — has arrived. They also know that the opportunities and payoffs of big data can be huge. What they don’t always realise, however, is that none of this will happen without the proper network infrastructure in place. And with so much of that data crossing long-distance functional and organisational boundaries, the wide area network (WAN) plays a critical role.

WAN pressure

Organisations are dealing with rapidly growing data volumes. Not only must they manage today’s volumes; according to IDC, data is expected to grow tenfold by 2020, so they must also plan for even greater volumes in the decade ahead. At the same time, Gartner has predicted that growth in enterprise storage system expenditure will be minimal this year, at only 2.6 percent. Budget constraints will undoubtedly be a big challenge.

What organisations may not realise is that even with improved compute capabilities and more storage, these massive amounts of data can no longer be handled by ‘normal’ processing capabilities. Simply having a lot of data sitting around accomplishes little; the real key to big data is being able to analyse large, diverse data sets and act on the results. Indeed, aggregating data from different sources and sharing the results during analysis is a big challenge, and one that requires a stable network environment to succeed. This data also needs to be backed up: beyond moving vast amounts of big data around, it is crucial that all of it is protected and kept secure, not only for regulatory and compliance reasons, but also to maintain customer trust.

Defying distance

All these big data requirements place a huge dependency on the WAN. As such, there are numerous obstacles to overcome, the first of which is geographical distance. The further the data centre is from the user, the more latency there is to contend with and the longer data takes to reach its destination, making transfers slow and disjointed. Compliance may also come into play: where it is not sufficient or acceptable to replicate data across town or even within the same state, data will need to be replicated over a much greater distance.
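The link between distance and throughput is easy to underestimate. As a rough illustration (figures are assumed for the example, not taken from the article): a single TCP flow can move at most one window of data per round trip, so its throughput is bounded by window size divided by round-trip time (RTT), regardless of how fast the link itself is.

```python
# Illustrative sketch: how round-trip time caps a single TCP flow.
# Throughput ceiling ~= window_bytes / RTT. Window size and RTT values
# below are assumed examples.

def max_tcp_throughput_mbps(window_bytes: int, rtt_ms: float) -> float:
    """Upper bound on one TCP flow's throughput, in megabits per second."""
    return (window_bytes * 8) / (rtt_ms / 1000) / 1_000_000

WINDOW = 64 * 1024  # a common default 64 KB TCP window

for rtt in (5, 40, 120):  # roughly: same city, cross-country, intercontinental
    cap = max_tcp_throughput_mbps(WINDOW, rtt)
    print(f"RTT {rtt:>3} ms -> at most {cap:.1f} Mbit/s per flow")
```

At 120 ms of round-trip latency, that default window caps a flow at under 5 Mbit/s even on a gigabit link — which is why protocol acceleration over distance matters.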

Insufficient bandwidth is also rarely recognised as a threat to the success of big data, yet it can significantly slow data transfers, leaving analysis results stale by the time they are evaluated. Bandwidth is often limited and costly to provision, and many environments suffer from network congestion. Where congestion exists, big data mobility can become extremely costly and is at risk of failure.
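A back-of-the-envelope calculation makes the point. The dataset size, link speed and utilisation below are assumed for illustration; protocol overhead and retransmissions, which only make things worse, are ignored.

```python
# Sketch: how long it takes to move a dataset over a WAN link.
# All figures are assumed examples; overhead and congestion are ignored.

def transfer_hours(data_gb: float, link_mbps: float, utilisation: float = 1.0) -> float:
    """Hours to move data_gb (decimal GB) over a link at the given utilisation."""
    bits = data_gb * 8 * 1000**3                      # gigabytes -> bits
    seconds = bits / (link_mbps * 1_000_000 * utilisation)
    return seconds / 3600

# A 10 TB dataset over a 100 Mbit/s link, half of which is taken by other traffic:
print(f"{transfer_hours(10_000, 100, 0.5):.0f} hours")
```

At roughly 444 hours — well over two weeks — the replicated copy would be badly out of date before it even arrived, illustrating why raw bandwidth alone is often not enough.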

Network optimisation

Optimising the WAN presents a solution that can rectify the bandwidth, distance and quality issues that often plague organisations trying to move data over distance. By taking a network-centric approach, the underlying network can cope with users accessing and moving big data, while network performance and the end-user experience are drastically improved and costs are significantly reduced.

Ensuring a stable network environment is also vital for any organisation wishing to take advantage of big data mobility over distance. This includes capabilities that reduce the amount of data transmitted across the WAN, correct the network quality issues present in most networks, and accelerate protocols to help overcome distance challenges. In fact, optimising the WAN can reduce traffic by as much as 99 percent.
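The data-reduction idea behind that claim can be sketched simply: send each unique chunk of data once, and a short reference whenever it repeats. This is a minimal toy model — real WAN optimisation appliances work on byte streams with rolling hashes and shared dictionaries, not the fixed-size chunks used here for brevity.

```python
# Toy sketch of WAN deduplication: repeated chunks are replaced by short
# references. Chunk and reference sizes are assumed examples.
import hashlib

def deduplicated_size(data: bytes, chunk_size: int = 4096, ref_size: int = 20) -> int:
    """Bytes actually sent if repeated chunks become ref_size-byte references."""
    seen, sent = set(), 0
    for i in range(0, len(data), chunk_size):
        digest = hashlib.sha256(data[i:i + chunk_size]).digest()
        if digest in seen:
            sent += ref_size        # far side already has this chunk
        else:
            seen.add(digest)
            sent += len(data[i:i + chunk_size])
    return sent

# Highly repetitive traffic (e.g. a mostly-unchanged nightly backup):
payload = b"A" * 4096 * 95 + bytes(range(256)) * 16 * 5
print(f"{deduplicated_size(payload) / len(payload):.0%} of the original bytes sent")
```

On repetitive traffic such as backups and replication, where most chunks have been seen before, this kind of reduction is what makes figures in the 90-plus percent range plausible.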

Taking advantage of big data

Ultimately, big data will continue to grow in importance, and companies of all sizes can take advantage of the business benefits it has to offer. While larger companies will be able to use it to deliver new products and new ways of servicing their customers, smaller companies will use big data in more innovative ways to outshine and compete with larger rivals. The challenge for many organisations will be how to accommodate big data applications using their existing network infrastructure.

This raises the question: if big data is complex, expensive and requires people and skills that are not yet widely available, why even think about it? Put simply, big data can create big value. But, as with all of big data’s predecessors (databases, data warehousing, data mining, data analytics and business intelligence), you need to know what you are looking for, what it is worth to you, and how you will take advantage of it before you start. If an organisation wants to take advantage of big data, it needs to make sure that the WAN is included in its plans.