12/03/2015

By Robert Gothan, CEO and founder of Accountagility


Businesses are completely overloaded by data and related technology these days, which means that it can be very difficult for them to obtain meaningful information quickly and accurately. For example, many businesses are still managing their most valuable business data via a series of complex spreadsheets. Not only is this approach prone to human error, but it also tends to involve a confusing mix of manual and system-sourced data.

As a result, switching to automated processes may seem like a sensible alternative for firms looking to improve their efficiency, but the tighter controls and more formal processes involved can actually have a number of drawbacks if not applied correctly. For example, automated processes that are not sufficiently agile will not cope adequately with business change, leading to the proliferation of manual ‘workarounds’. In this case, efforts to improve efficiency through automation can often end up increasing risk as well.

We’ve seen this before in manufacturing, of course. What happens when a manufacturing process goes wrong? Product recalls. In the business world, there are similar “product recalls” of data, reports, and other output every day, and people seem to accept this as the norm. I would argue that this is both time-consuming and resource-draining in many organisations, since the extensive manual rework involved harms both process accuracy and efficiency.

That’s not to say that automation isn’t an effective way of streamlining operations, of course – it can actually be an indispensable tool, as long as it is used effectively. In order to get the right balance between efficiency and risk, however, any attempts at automation need to focus first and foremost on process outcomes.

With this approach, firms can create a data management process that delivers numbers that actually mean something, and that will satisfy auditors and regulators alike. As a result, ‘product recalls’ of data, reports, and other everyday output will no longer be accepted as the norm, since firms will be putting much more effort into designing a highly efficient data production line.

This aspect of business re-engineering and automation is integral to driving efficiency and cost reduction. The first step in this process typically centres on data acquisition, often from different parts of the business. Bringing data together from disparate systems creates its own risks, however. For example, the incoming bits of data may all look like bricks, but do they fit together?

The typical data management infrastructure is a spaghetti junction: a tangled web of data and non-relational legacy systems that don’t ‘talk’ to each other. Validation is therefore required to prove whether or not the data actually fits.
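
To make this concrete, here is a minimal sketch in Python of the kind of validation step described above. The field names and the two source feeds are hypothetical, but the idea is the same: before any processing begins, prove that records arriving from different systems share the keys and formats needed to fit together.

```python
# A minimal sketch of a cross-system validation step. The field names
# ("account_id", "balance") and the two feeds are hypothetical.

def validate_fit(ledger_rows, crm_rows):
    """Check that two data feeds can actually be joined before processing."""
    errors = []

    ledger_ids = {row["account_id"] for row in ledger_rows}
    crm_ids = {row["account_id"] for row in crm_rows}

    # The "bricks" must share a common key to fit together.
    missing = ledger_ids - crm_ids
    if missing:
        errors.append(f"{len(missing)} ledger account(s) have no CRM record")

    # Each feed must also meet its own basic quality rules.
    for row in ledger_rows:
        if not isinstance(row.get("balance"), (int, float)):
            errors.append(f"non-numeric balance for account {row['account_id']}")

    return errors  # an empty list means the data fits


ledger = [{"account_id": "A1", "balance": 100.0},
          {"account_id": "A2", "balance": "n/a"}]
crm = [{"account_id": "A1", "owner": "Smith"}]
print(validate_fit(ledger, crm))
# -> ['1 ledger account(s) have no CRM record', 'non-numeric balance for account A2']
```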

Once this has been determined, firms will need to crunch this data into something useful. Again, as in manufacturing, quality assurance (QA) needs to play a central role in the process, from start to finish. Once the right data controls are in place, these transformation processes are much easier to design and will run like clockwork. Likewise, where validation and processing steps are combined, firms can even begin to create “self-testing” processes, which surely must be the holy grail of efficient data processing.
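
As an illustration of what such a ‘self-testing’ step might look like, the sketch below (all names are illustrative, not a real system) runs a transformation and its quality check as a single unit, so output that fails its own QA never leaves the step.

```python
# A sketch of a "self-testing" processing step: the transformation and its
# quality check run together, so bad output never leaves the step.

def self_testing_step(transform, check, rows):
    result = transform(rows)
    failures = check(rows, result)
    if failures:
        raise ValueError(f"step failed its own QA: {failures}")
    return result

def aggregate_by_region(rows):
    """Example transformation: sum amounts per region."""
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

def totals_must_reconcile(rows, totals):
    # Invariant: aggregation must not gain or lose value.
    diff = abs(sum(row["amount"] for row in rows) - sum(totals.values()))
    return [f"totals differ by {diff}"] if diff > 1e-9 else []

rows = [{"region": "UK", "amount": 10.0}, {"region": "EU", "amount": 5.0}]
print(self_testing_step(aggregate_by_region, totals_must_reconcile, rows))
# -> {'UK': 10.0, 'EU': 5.0}
```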

When validations like these are built into processes, and new knowledge about potential errors is captured and embedded quickly, we can start to see some interesting changes straight away. Suddenly processes are much more efficient, easier to manage, fail less often, and, most importantly, we tend not to see the same error twice.
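
One way to read ‘capturing new knowledge about potential errors’ is as a growing library of automated checks: every failure that is investigated once becomes a permanent test that runs on every future cycle. A hypothetical sketch:

```python
# A sketch of an error-knowledge registry: each failure, once investigated,
# is encoded as a named check that runs on every future cycle.
# Check names and rules are hypothetical.

CHECKS = []

def check(name):
    """Register a validation rule learned from a past failure."""
    def register(fn):
        CHECKS.append((name, fn))
        return fn
    return register

@check("no duplicate account ids")         # added after a double-count incident
def no_duplicates(rows):
    ids = [r["account_id"] for r in rows]
    return len(ids) == len(set(ids))

@check("balances within plausible range")  # added after a mis-keyed figure
def plausible_balances(rows):
    return all(-1e9 < r["balance"] < 1e9 for r in rows)

def run_checks(rows):
    """Return the names of any checks the data fails."""
    return [name for name, fn in CHECKS if not fn(rows)]

rows = [{"account_id": "A1", "balance": 100.0},
        {"account_id": "A1", "balance": 50.0}]
print(run_checks(rows))  # -> ['no duplicate account ids']
```

Because each check stays registered once it has been added, the same error cannot slip through a second time undetected.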

Whereas many processes deteriorate over time, processes built in this way actually improve, thereby boosting both productivity and efficiency. Thanks to improvements like these, much of the risk, complexity and cost associated with manual data activities can be eliminated. As a result, 40-hour processes can often be reduced to minutes, and project development time can be reduced by person-years – with a better and more sustainable result.

Without a doubt, business and financial planning is a key organisational process, but most firms would agree that it consumes an extraordinary amount of resources. It can be particularly challenging due to multiple data sources and the countless structural and numerical changes that occur during the planning lifecycle.

The good news is that it is now possible to bring together sophisticated data acquisition, validation, number crunching and reporting into one controlled, manageable and flexible environment. With this approach, firms can begin to increase their efficiency by automating processes and simplifying workflow – yet without adding a new layer of risk.