By Amir Peles, CTO, Radware
Web traffic, and more importantly the analysis of that traffic, is extremely valuable to organisations that offer products and services over the internet. Real-time intelligence tools provide detailed and immediate insight into user behaviour by tracking activities and movements online.
This detailed information provides vital business intelligence, including trend mapping, demographic breakdowns, key influencers and potential demand.
This web analysis feeds into the organisation's business intelligence, providing each department with vital competitive intelligence. Teams including product development, customer services, advertising and promotions, and online sales and marketing then use this information for forecasting, planning and development.
However, the key business benefit is the competitive advantage conferred by this micro-knowledge of customers and what influences their purchasing decisions.
The value of real-time actionable traffic spans multiple domains and departments. Following are some typical examples:
• IT: monitoring system performance and usage from the user perspective and recognising potential problems and fixes. Potential demand statistics and advance forecasting for new lines, for example, enable the IT team to prepare for traffic increases and ensure the required applications, CRM tools and delivery platforms are budgeted for and installed in time.
• Marketing: increases conversion rate and boosts repeat and cross-selling sales opportunities by alerting customers to the right promotion at the right time. Promotions and campaigns can also be created and delivered by the appropriate medium based on observed online activities.
• Customer care: monitoring customer experiences and feedback ensures the appropriate after-sales and support services are in place. It also means that any product recalls or service issues, for example, can be addressed immediately, minimising customer complaints and protecting brand value.
• Security: identifying common consumer fraud patterns by tracking behaviour in the sites visited, files viewed and downloads actioned enables security policies to be put in place that automatically alert the business to such behaviour patterns in real-time.
However, real-time analysis and behaviour monitoring can only work if the business or organisation has the IT infrastructure in place to support it. The major issues spring from the fact that the team responsible for web traffic analysis and the team responsible for the application are usually not the same, so conflicts can occur. It is therefore vital that network teams follow these four golden rules.
The Four Golden Rules
1: Keep Web Traffic Analysis Away from the Web Application
Most commonly, web traffic analysis follows the web application path, either on the web server, on a reverse proxy, or in the client browser by enriching the web page with special-purpose scripts. The problem with these methods is that they interfere with the customer-facing applications, compromising their stability, speed and performance, and correcting that interference typically requires a long development cycle.
The best way to perform traffic analysis is to capture the data on a network switch. This can be done without interfering with, or withholding, the packets on their way to and from the web server, eliminating the problems above and allowing for fault-free analysis.
For this to work, special attention must be given to encrypted HTTPS traffic: the passive capture device needs the organisation's key pairs in order to decrypt it.
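To make the idea of out-of-path analysis concrete, the sketch below parses request metadata from a raw HTTP payload as it might arrive from a switch mirror (SPAN) port. The function name and field layout are illustrative, not part of any specific product; the capture and HTTPS decryption steps are assumed to happen upstream.

```python
# Minimal sketch: extracting request metadata from a passively captured
# TCP payload, without sitting in the request path. The capture itself
# would come from a switch mirror (SPAN) port; this shows only the parse
# step. All names here are illustrative.

def parse_request(payload: bytes) -> dict:
    """Parse the request line and headers of a raw HTTP/1.x request."""
    head, _, _ = payload.partition(b"\r\n\r\n")
    lines = head.decode("iso-8859-1").split("\r\n")
    method, path, version = lines[0].split(" ", 2)
    headers = {}
    for line in lines[1:]:
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()
    return {"method": method, "path": path,
            "version": version, "headers": headers}

raw = (b"GET /checkout HTTP/1.1\r\n"
       b"Host: shop.example.com\r\n"
       b"User-Agent: demo\r\n\r\n")
req = parse_request(raw)
print(req["method"], req["path"], req["headers"]["host"])
```

Because the parser only ever reads a copy of the traffic, a bug here degrades the analysis, never the customer-facing application.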
2: Use an Analysis Tool with Flexible Scripting
Web applications have no fixed standard for “performing” a transaction. For example, an online banking web page varies from one online bank to another and can change over time without warning.
The analysis tool must incorporate a flexible, easy to use scripting environment, one that can customise the extraction of events and transactions from raw HTTP and HTML traffic specific to the need of the site.
3: Base Analysis on Complex Event Processing (CEP)
The business intelligence lies in the patterns of events rather than in the individual web events themselves. For example, a user who submits a wrong password three times is blocked and encouraged to call the helpdesk. When a CEP system receives web events it correlates them into real-time patterns using programmable rules, which become an integral part of the web analysis solution.
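The three-wrong-passwords example above can be sketched as a toy CEP rule: individual failure events are correlated within a sliding time window until the pattern fires. The window and threshold values are illustrative; a real CEP engine would make the whole rule programmable.

```python
# Toy complex-event-processing rule: correlate individual "wrong
# password" events into a higher-level "block user" pattern. The window
# and threshold are illustrative values, not from any specific product.
from collections import defaultdict, deque

WINDOW_SECONDS = 300
MAX_FAILURES = 3

failures = defaultdict(deque)  # user -> timestamps of recent failures

def on_failed_login(user: str, timestamp: float) -> bool:
    """Return True when the pattern fires and the user should be blocked."""
    q = failures[user]
    q.append(timestamp)
    # Drop failures that fell outside the sliding window.
    while q and timestamp - q[0] > WINDOW_SECONDS:
        q.popleft()
    return len(q) >= MAX_FAILURES

print(on_failed_login("alice", 0))   # False
print(on_failed_login("alice", 10))  # False
print(on_failed_login("alice", 20))  # True -> block, refer to helpdesk
```

The same shape of rule (count N events of type X within T seconds) covers many of the fraud and security patterns mentioned earlier.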
4: Integrate Analysis Tools with Enterprise Resource Planning Tools (ERP)
Web traffic analysis is usually fed into enterprise systems such as a CRM platform or a management system. This allows the data to be integrated cleanly with other data sources, stored, and acted upon effectively on the basis of the analysis.
For successful integration, the analysis system must be equipped with industry standard protocols used to feed such systems, for example SQL and message queues.
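As an example of the SQL route, the sketch below publishes analysed events into a relational store. Here sqlite3 stands in for the enterprise database, and the table and column names are invented for illustration; a message-queue feed would follow the same publish pattern.

```python
# Sketch of feeding analysed web events into an enterprise system over
# standard SQL. sqlite3 stands in for the CRM/ERP database; the table
# and column names are illustrative assumptions.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE web_events (
    user        TEXT,
    event       TEXT,
    occurred_at REAL)""")

def publish(user: str, event: str, occurred_at: float) -> None:
    """Insert one analysed event using parameterised SQL."""
    conn.execute("INSERT INTO web_events VALUES (?, ?, ?)",
                 (user, event, occurred_at))
    conn.commit()

publish("alice", "checkout", 1700000000.0)
count = conn.execute("SELECT COUNT(*) FROM web_events").fetchone()[0]
print(count)  # 1
```

Using parameterised statements and a standard driver means the downstream CRM or reporting system needs no knowledge of how the events were captured.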
The execution of the web traffic tool is as critical as the analysis it provides. A retailer that follows these Four Golden Rules will have an even greater competitive advantage online than one that doesn’t. For data to be intelligent, it needs to be accurate and it needs to be real-time.