By Joram Cano, solutions consultant at Keynote, a Dynatrace company
‘Big data’ became the big buzzword back in 2012. Since then, a number of companies have launched analytics solutions, some of which offer unprecedented access to large volumes of data and the ability to analyse and query that data whenever you want. However, as with many technology trends in recent years, there is a gap between the promised vision and reality. While these analytics solutions enable companies to quickly correct site performance issues – such as a slow-loading or crashing site – they often cannot provide enterprises with comprehensive insight into how site composition truly impacts the customer experience. It is only with a sophisticated big data architecture that companies can follow their customers’ journeys in real time, and use this information to pinpoint the granular areas to improve before it’s too late.
Big data architecture
Big data architecture is a virtual representation of web content, design and performance. Rather than a fortnightly or monthly report that works out average site load time and availability, big data architecture can bring systemic site performance issues to the surface over time. These issues are often difficult to detect with traditional analytics solutions, because they may be caused by one small component, such as an image; buried amongst highly variable performance data; or hard to spot by looking at a single waterfall chart.
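As a brief illustration of why averages hide these issues, the sketch below (using hypothetical load-time samples and a made-up function name, not any specific product’s method) compares the mean against the median and 95th percentile. A handful of slow loads caused by one small component barely move the median but stand out clearly at the 95th percentile:

```python
import statistics

def summarise_load_times(samples_ms):
    """Summarise page-load samples: the mean and median can look healthy
    while the slowest loads reveal a systemic problem."""
    ordered = sorted(samples_ms)
    p95_index = int(0.95 * (len(ordered) - 1))
    return {
        "mean": statistics.mean(ordered),
        "median": statistics.median(ordered),
        "p95": ordered[p95_index],
    }

# Mostly fast loads, but a slow third-party image occasionally blocks the page.
samples = [900, 950, 1000, 1020, 980, 1010, 940, 6200, 970, 5900]
print(summarise_load_times(samples))
```

Here the median sits near one second while the 95th percentile is close to six seconds – exactly the kind of intermittent issue a monthly average would smooth away.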
A negative customer experience can push customers directly into the hands of competitors. Avoiding this risk is critical, and businesses must therefore benchmark their sites against competitors to ensure they are keeping pace with the level of service being provided within the industry. While benchmarking is certainly not new, the level of data granularity and insight that enterprises have access to today is new.
Companies can now identify how other businesses have architected their sites by knowing what gets loaded, when, and which third-party services are being used. The business and its IT team can gain insight into where the points of failure are in real time, as well as compare how features perform on their site against those on competitor sites. A business can even compare third-party services within the same category, to ensure it is using the best one.
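The first step in this kind of analysis can be sketched simply: given the requests captured during a page load (for example, from a HAR export), group them by hostname and flag anything not served by the site’s own domain. The domain, URLs and function name below are hypothetical, not taken from any specific tool:

```python
from urllib.parse import urlparse
from collections import Counter

FIRST_PARTY = "example-shop.com"  # hypothetical site being analysed

def third_party_hosts(request_urls, first_party=FIRST_PARTY):
    """Count requests per hostname, excluding the first-party
    domain and its subdomains."""
    counts = Counter()
    for url in request_urls:
        host = urlparse(url).hostname or ""
        if host != first_party and not host.endswith("." + first_party):
            counts[host] += 1
    return counts

# Hypothetical requests captured from a single page load.
requests = [
    "https://example-shop.com/index.html",
    "https://cdn.example-shop.com/app.js",
    "https://analytics.vendor-a.com/beacon.gif",
    "https://analytics.vendor-a.com/collect",
    "https://ads.vendor-b.net/tag.js",
]
print(third_party_hosts(requests))
# Counter({'analytics.vendor-a.com': 2, 'ads.vendor-b.net': 1})
```

Run against a competitor’s pages as well as your own, the same tally shows which external services each site depends on and how heavily.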
Third-party services
Big data architecture allows companies to benchmark third-party services against those used by their competitors. If a third-party service is not performing to the required standard, a company can identify a superior alternative and replace it. This ability to benchmark third-party services enables companies to ensure they don’t lose new customers and profit to a poorly performing external service. Essentially, companies can make informed decisions before entrusting the quality of their customer experience to third parties.
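Picking the superior alternative within a category can come down to a straightforward comparison of measured response times. The sketch below (hypothetical vendor names and timings, and a simple median rather than any vendor’s scoring method) selects the candidate with the lowest median response time:

```python
def faster_service(timings_by_service):
    """Return the candidate service with the lowest median
    response time (milliseconds)."""
    def median(values):
        ordered = sorted(values)
        mid = len(ordered) // 2
        if len(ordered) % 2:
            return ordered[mid]
        return (ordered[mid - 1] + ordered[mid]) / 2
    return min(timings_by_service, key=lambda name: median(timings_by_service[name]))

# Hypothetical response times (ms) for two vendors in the same category.
timings = {
    "vendor-a": [120, 180, 150, 900, 140],  # fast, with an occasional outlier
    "vendor-b": [200, 210, 190, 205, 195],  # consistently slower
}
print(faster_service(timings))  # vendor-a wins on median despite the outlier
```

In practice the choice of statistic matters: a median rewards vendor-a here, while a 95th-percentile comparison would penalise its occasional slow responses.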
Ultimately, big data architecture allows companies to analyse data like never before: they can quickly understand errors and trends down to the smallest component, across multiple pages, in real time. This allows businesses to make better decisions, as they can correlate page performance with business metrics such as lead generation to maximise the profitability of the site and remove errors that prevent customers from reaching the checkout. What’s more, a well-performing site can improve brand perception, drawing in even more potential leads. Enterprises that invest in big data to improve the customer experience will improve much more than their digital stats – it’s a competitive advantage that accelerates business growth.
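Correlating page performance with a business metric can be as simple as computing a correlation coefficient between the two series. The figures below are invented for illustration, and the function is a plain Pearson correlation rather than any particular product’s analysis:

```python
def pearson_correlation(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical daily figures: median load time (ms) vs checkout conversions.
load_times = [1000, 1200, 1500, 2000, 2500, 3000]
conversions = [520, 500, 470, 410, 350, 300]

r = pearson_correlation(load_times, conversions)
print(round(r, 3))  # strongly negative: slower pages, fewer conversions
```

A strong negative coefficient like this is the kind of evidence that turns a performance fix from an IT task into a revenue argument.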