In a recent blog post, we took a look at how data analytics provides greater insights into risks and controls in purchase to pay (P2P) processes. Financial control specialists, managers of risk and compliance activities, and auditors are all increasingly aware of how data analysis can provide the ability to continuously monitor transactions and identify suspect activities and weaknesses in control systems.
One question that may come to mind is “why do relatively few organizations use this approach to improve their risk and compliance management activities?” The answer is usually that many finance and risk professionals are uncertain about what practical implementation involves and may be concerned that it is a complex, resource-intensive project.
So here is a quick guide to the practical things you need to know in order to get going and successfully implement data analytics to monitor P2P transactions.
What types of analysis should you apply?
The way to decide which analytics make sense is to think about the risks of things that could go wrong (e.g., fraud, error, non-compliance) and then consider what type of analysis would indicate the existence of a problem. A complementary approach is to examine the control rules that are meant to be in place and work out the question that would determine whether a transaction complies with each rule. For example, to determine whether there is effective segregation of duties in the payment approval process, look for any transaction where the approver ID matches the employee ID of the individual who entered the transaction.
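The segregation-of-duties rule above reduces to a simple filter over payment records. Here is a minimal sketch; the field names (`entered_by`, `approved_by`) are illustrative assumptions, not taken from any particular ERP schema:

```python
# Sketch of the segregation-of-duties test: flag payments where the
# approver is the same person who entered the transaction.
# Field names are illustrative assumptions, not real ERP column names.

def sod_violations(payments):
    """Return payments approved by the same ID that entered them."""
    return [p for p in payments if p["approved_by"] == p["entered_by"]]

payments = [
    {"id": "PAY-001", "entered_by": "E100", "approved_by": "E200"},
    {"id": "PAY-002", "entered_by": "E150", "approved_by": "E150"},  # breach
]
print([p["id"] for p in sod_violations(payments)])  # -> ['PAY-002']
```

In practice the same filter would be run across the full payment file for the period, with the flagged records routed to whoever follows up on exceptions.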
Here are just a few examples of common analytic tests applied in P2P processes:
- Examine vendor accounts and payments for indications of fraudulent, unauthorized vendors by comparing the vendor master file and vendor payments file data with employee master file data, looking for matches on name, address and bank account numbers.
- Test that P.O. approvals are made by appropriate individuals by matching transaction approver IDs with entries in a table of authorized approvers.
- Also test that P.O. amounts do not exceed the authorization limit of the approver.
- Determine whether P.O. and payment approval controls have been circumvented by splitting transactions into multiple items just below an approval authorization limit.
- Look for duplicate payments, not just the obvious duplicate vendor invoice numbers but the many variations of possible duplicates, including matches on amounts and dates.
- Compare invoice payment details to goods received report data.
- Examine changes to vendor master file data in which information such as bank accounts, addresses, or approver IDs has been changed for a brief period of time and then reversed.
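To make one of these tests concrete, the duplicate-payment check can be sketched as grouping payments by vendor and amount and flagging pairs that fall within a short date window, even when the invoice numbers differ. This is a simplified illustration under assumed field names, not a full matching engine:

```python
from collections import defaultdict
from datetime import date

# Illustrative near-duplicate payment test: same vendor and amount,
# different invoice numbers, dates within a short window.
# Field names and the 30-day window are assumptions for this sketch.

def near_duplicates(payments, window_days=30):
    groups = defaultdict(list)
    for p in payments:
        groups[(p["vendor"], p["amount"])].append(p)
    flagged = []
    for items in groups.values():
        items.sort(key=lambda p: p["date"])
        for a, b in zip(items, items[1:]):
            if (b["date"] - a["date"]).days <= window_days:
                flagged.append((a["invoice"], b["invoice"]))
    return flagged

payments = [
    {"vendor": "V1", "invoice": "INV-100",  "amount": 5000, "date": date(2023, 3, 1)},
    {"vendor": "V1", "invoice": "INV-100A", "amount": 5000, "date": date(2023, 3, 8)},
    {"vendor": "V2", "invoice": "INV-200",  "amount": 1200, "date": date(2023, 3, 1)},
]
print(near_duplicates(payments))  # -> [('INV-100', 'INV-100A')]
```

A production version would add further variations, for example normalized invoice numbers or small amount differences, but the group-and-compare pattern stays the same.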
If some of these tests seem redundant because you assume the ERP settings are meant to prevent such things from occurring, remember that controls do not always work as intended, and circumstances can arise in which a control setting turns out to be ineffective.
What data do you need?
If you go to the IT department or a database administrator and say you want to analyse P2P data, the response may be that this is a massive task involving many hundreds of data tables and thousands of data elements. The good news is that in most cases only a relatively small number of data fields, drawn from a few data tables, is required for analytic testing. Typical P2P tests require data fields such as:
- P.O. number, reference/description, date and amount
- Part/service item number and description
- Vendor number and name
- Invoice number, amount and date
- Payment amount, date and reference/description
- P.O. and payment approver IDs, and authorization limits.
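The field list above amounts to two small record layouts, one for purchase orders and one for payments. The following sketch shows one way to hold them; the names and types are illustrative assumptions, and actual ERP column names will differ:

```python
from dataclasses import dataclass
from datetime import date

# Minimal record layouts covering the fields listed above.
# Names and types are illustrative assumptions, not a real ERP schema.

@dataclass
class PurchaseOrder:
    po_number: str
    description: str
    po_date: date
    amount: float
    item_number: str
    item_description: str
    vendor_number: str
    vendor_name: str
    approver_id: str
    approver_limit: float

@dataclass
class Payment:
    invoice_number: str
    invoice_date: date
    invoice_amount: float
    payment_date: date
    payment_amount: float
    reference: str
    vendor_number: str
    approver_id: str

po = PurchaseOrder("PO-1001", "Office supplies", date(2023, 1, 5), 1500.0,
                   "ITM-9", "Copy paper", "V100", "Acme Ltd", "E200", 5000.0)
print(po.amount <= po.approver_limit)  # -> True
```

Keeping the extract this narrow is what makes the IT request manageable: a handful of fields from a handful of tables, rather than the full schema.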
How do you get the data?
In practice, this can be one of the more time-consuming parts of the data analytics process, at least when first implementing analytics. The usual best practice is to arrange for an extract of the data for the period covered by the analytic testing. Despite what the IT department may recommend, it is usually best to avoid data that has been put into a corporate data warehouse and get an extract directly from the data tables or files that support an ERP or accounting application.
Once you get the data there are two important things to consider: is the data complete and correct, and how can the process be automated for the future? Failing to confirm that the extracted data agrees with financial summaries and record counts can invalidate the analysis itself, as it is relatively common to find that an extract misses some records or includes incorrect data. Once the extract and validation procedures are confirmed, there are huge benefits in making sure the process can be automated and regularly repeated with minimal effort.
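The completeness check described above is a straightforward reconciliation: compare the extract's record count and monetary total to control figures supplied with the extract (for example, from a financial summary report). A minimal sketch, with all values illustrative:

```python
# Reconcile an extract to control totals before running any analytics.
# Field names, tolerance, and control figures are illustrative assumptions.

def validate_extract(records, expected_count, expected_total, tolerance=0.01):
    """Return a list of discrepancies; an empty list means the extract
    agrees with the control count and total."""
    count = len(records)
    total = round(sum(r["amount"] for r in records), 2)
    problems = []
    if count != expected_count:
        problems.append(f"record count {count} != expected {expected_count}")
    if abs(total - expected_total) > tolerance:
        problems.append(f"total {total} != expected {expected_total}")
    return problems

records = [{"amount": 100.00}, {"amount": 250.50}]
print(validate_extract(records, expected_count=2, expected_total=350.50))  # -> []
print(validate_extract(records, expected_count=3, expected_total=350.50))
```

Running a check like this at the start of every refresh is what makes a repeated, automated extract trustworthy rather than merely convenient.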
Getting the right data and knowing what forms of analysis to apply are critical components of the P2P transaction monitoring process. In a subsequent post we will take a look at the other important issues to be addressed in order to gain the most benefit from ongoing monitoring, including what to do with results, analytics scheduling, and software approaches.