Financial and accounting executives are often the most data-savvy leaders within an organization.
Their jobs are focused on making sure that their teams produce reliable financial reports and analyses to effectively support smart and informed executive decision-making. Together with leaders of other functional areas, they are responsible for internal controls to ensure reports are accurate and opportunities for fraud, error and abuse are minimized.
Yet how often do financial and accounting executives encourage the use of data analysis to help ensure the integrity of financial reports and the massive volumes of financial activity and transactions that underlie them?
The answer is that very few do, which is surprising given that data analysis can do far more than simply “crunch numbers.” Why not? Probably a combination of two things: a lack of knowledge about the techniques involved and a tendency to over-rely on the effectiveness of internal controls.
What’s wrong with relying on built-in ERP system controls?
Most financial and accounting reports are driven by ERP systems—in most large multinationals, possibly multiple instances of several different ERPs—together with a number of other sub-systems. The traditional approach is to rely on internal controls built into the ERP systems themselves. Examples include controls designed to maintain segregation of duties, or to prevent payments made without appropriate approvals, duplicate payments, or a host of other things that can go wrong with transactions processed through purchasing, payment, payroll, expense and order-to-cash systems.
In an ideal world, automated ERP control settings would prevent any bad stuff from happening. Theoretically, most ERPs are capable of enforcing adherence to appropriate internal controls. However, a few things get in the way of this materializing in the real world.
- Controls get sidelined during configuration. The first issue is that configuring all the appropriate ERP settings is a laborious and complex task. In the already very expensive efforts to implement a new ERP and actually get it running productively by a deadline, it is often the controls aspect that gets relatively little attention.
- The need for agility and efficiency. The second major issue is that implementing effective automated controls in an ERP usually adds substantially to the time needed for input and overall transaction processing. Every time a transaction is flagged or blocked from input or processing, the result is delay and extra effort to resolve the issue. Because of this, controls are often bypassed, people come up with workarounds, or the control settings themselves are simply turned off. This is quite understandable. Managers should be making smart decisions about risk and the trade-off between having perfect controls and “getting the job done.” That is one reason financial and business process managers may tend to roll their eyes at internal audit recommendations that are great in theory but may not reflect the pragmatic world of business and financial systems.
Data analysis can do more than crunch numbers
Fortunately, there is an approach that effectively balances the often-conflicting objectives of “getting the job done” with reducing the risks of fraud, error, waste and abuse. Data analytic technology has the ability to independently examine each transaction in a very large population of financial and business transactions to identify when a risky transaction appears to have actually taken place.
Instead of preventing every possibly risky transaction from ever occurring, the objective is to zero in on the bad stuff that is actually happening. If the risk exposure or dollar amounts involved are relatively low, it may be sufficient to simply monitor for instances in which, for example, managers have bypassed approval controls by splitting transactions into multiple items under approval limits. However, if the risk—or impact—is high, the transaction can immediately be red-flagged—and specific controls can be implemented to prevent the issue arising in the future.
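The split-transaction monitoring described above can be sketched in a few lines of Python. This is a minimal illustration, not a production monitor: the field names, the $5,000 approval limit, and the choice to group by vendor, requester and day are all assumptions made for the example.

```python
from collections import defaultdict

# Illustrative approval limit; real limits vary by organization and role.
APPROVAL_LIMIT = 5_000

def flag_split_transactions(payments, limit=APPROVAL_LIMIT):
    """Flag groups of same-day payments to one vendor by one requester that
    individually fall under the approval limit but together exceed it."""
    groups = defaultdict(list)
    for p in payments:
        if p["amount"] < limit:  # only sub-limit items can form a split
            groups[(p["vendor"], p["requester"], p["date"])].append(p)
    flagged = []
    for key, items in groups.items():
        if len(items) > 1 and sum(p["amount"] for p in items) >= limit:
            flagged.append((key, items))
    return flagged

payments = [
    {"vendor": "Acme", "requester": "jdoe",   "date": "2024-03-01", "amount": 3_000},
    {"vendor": "Acme", "requester": "jdoe",   "date": "2024-03-01", "amount": 2_800},
    {"vendor": "Acme", "requester": "ksmith", "date": "2024-03-02", "amount": 1_200},
]
flags = flag_split_transactions(payments)
# the two jdoe payments total $5,800 and are flagged as one group
```

In practice the grouping key and time window would be tuned to how the organization actually approves spend; the structure of the test stays the same.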
Data analysis and transaction monitoring allow the fine-tuning needed to build a system that is neither over-controlled nor under-controlled. However, one practical fear with such an approach is “false positives.” If the testing of, say, all purchase-to-payment transactions is too simplistic, the red flags generated can include many transactions that look suspect but are not real problems. If this occurs too frequently, managers stop responding. Here again, the ability to fine-tune analytics is important to ensuring that only significant problems are highlighted. And a big bonus: technology can also ensure that red-flagged transactions receive effective follow-up rather than being simply ignored.
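To make the false-positive problem concrete, here is a hypothetical comparison of a simplistic duplicate-payment test with a slightly tuned one. The data, field names and seven-day window are illustrative assumptions; the point is only that adding conditions to a rule sharply reduces noise.

```python
from datetime import date

payments = [
    {"vendor": "Acme", "invoice": "INV-1", "amount": 500, "date": date(2024, 3, 1)},
    {"vendor": "Acme", "invoice": "INV-2", "amount": 500, "date": date(2024, 3, 15)},
    {"vendor": "Acme", "invoice": "INV-3", "amount": 500, "date": date(2024, 3, 15)},
    {"vendor": "Acme", "invoice": "INV-3", "amount": 500, "date": date(2024, 3, 16)},
]

def naive_flags(payments):
    """Simplistic test: any two payments to the same vendor for the same
    amount are flagged -- routine recurring charges all become red flags."""
    flagged = []
    for i, a in enumerate(payments):
        for b in payments[i + 1:]:
            if a["vendor"] == b["vendor"] and a["amount"] == b["amount"]:
                flagged.append((a, b))
    return flagged

def tuned_flags(payments, window_days=7):
    """Refined test: also require the same invoice number and payment dates
    within a short window, so legitimate recurring charges drop out."""
    flagged = []
    for i, a in enumerate(payments):
        for b in payments[i + 1:]:
            if (a["vendor"] == b["vendor"] and a["amount"] == b["amount"]
                    and a["invoice"] == b["invoice"]
                    and abs((a["date"] - b["date"]).days) <= window_days):
                flagged.append((a, b))
    return flagged
```

On this sample the naive rule raises six red flags, all but one of them noise; the tuned rule raises exactly one, the repeated INV-3 payment. That single remaining flag is the kind a manager will actually investigate.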
A real-world example of just one flaw in ERP control systems
I recall describing this approach to a financial manager responsible for controls within a leading ERP system installed across their organization. The manager argued that there was no point in analyzing payment transactions for possible duplicates, since the ERP setting that prevented duplicate payments was set to “on”—duplicates were therefore impossible. It was agreed at the meeting that an internal auditor should perform some transactional data analysis and report back on whether the “no duplicate payments” control setting was doing its job.
A week later the auditor reported back a number of duplicate payments identified in a one-month test period, some of them very large. The biggest arose from the same invoices (with identical invoice numbers) processed through two separate ERP instances, in two different countries; the control setting only addressed attempts to enter the same vendor invoice number in the same instance of the ERP. There were numerous other duplicate payments despite the control setting—several, for example, resulting from the same invoice payment information being entered more than once, with a minor change to the invoice number that allowed it to be processed for payment undetected.
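The kinds of duplicates the auditor found (the same invoice with a minor character change, or the same invoice paid through two ERP instances) can be caught by pooling payment records from all instances and normalizing invoice numbers before matching. The record layout and the normalization rule below are assumptions for illustration, not a description of any particular ERP's data model.

```python
import re
from collections import defaultdict

def normalize_invoice(num):
    """Strip punctuation, whitespace and case so 'INV-1001' and 'inv 1001.'
    compare equal; a minor character change is often all that lets a
    duplicate slip past an exact-match control."""
    return re.sub(r"[^A-Z0-9]", "", num.upper())

def find_duplicate_payments(payments):
    """Group payments from ALL ERP instances by vendor, normalized invoice
    number and amount; any group with more than one payment is a candidate
    duplicate, regardless of which instance or country processed it."""
    groups = defaultdict(list)
    for p in payments:
        key = (p["vendor"], normalize_invoice(p["invoice"]), p["amount"])
        groups[key].append(p)
    return {k: v for k, v in groups.items() if len(v) > 1}

payments = [
    {"vendor": "Globex", "invoice": "INV-1001", "amount": 50_000, "instance": "ERP-US"},
    {"vendor": "Globex", "invoice": "INV-1001", "amount": 50_000, "instance": "ERP-DE"},
    {"vendor": "Globex", "invoice": "INV1001.", "amount": 50_000, "instance": "ERP-US"},
    {"vendor": "Initech", "invoice": "A-778",   "amount": 900,    "instance": "ERP-US"},
]
dupes = find_duplicate_payments(payments)
# all three Globex payments collapse to one key and are flagged together
```

An exact-match control inside a single instance would have seen nothing wrong with the second and third Globex payments; the analysis catches them only because it runs across instances and normalizes before comparing.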
Data analysis: enabling the fine balance of internal controls best practices
The point is that no internal control system is perfect in practice—and that over-controlled systems are inefficient and cumbersome, while under-controlled systems create risks of fraud and error.
Neither of these extreme approaches is acceptable when there is a practical means to monitor transactions on an ongoing basis for a wide range of fraud, error and abuse. In fact, data analysis becomes a very powerful and efficient additional layer of control.
Data analysis has been used by auditors, both internal and external, for many years to test internal controls and to identify risks and problem transactions. Arguably, the time has come to move these techniques to the front lines of responsibility for managing risks and maintaining effective and efficient controls.