PREPARING GLOBAL DOCUMENTATION IS EXPENSIVE. UNDERSTANDING WHY IT IS SO EXPENSIVE IS THE FIRST STEP IN TRYING TO DETERMINE HOW TO LOWER COSTS WHILE IMPROVING QUALITY.
Let’s start with some simple arithmetic. The OECD local file template has 20 information requests, 4 asking for legal entity information and 16 asking for information by transaction. For a legal entity with two transactions, this translates into 36 distinct data requests. (4 + 16*2 = 36.)
Each data request involves carrying out several tasks – for simplicity, let’s say these include scoping the work, collecting the data, reviewing the data, analyzing the data, and then writing up the response. Collectively, this totals five tasks per request, or 180 tasks per documentation report. For an MNE that is preparing global documentation for 100 legal entities, the total number of tasks is 180 * 100 = 18,000. Even at $100 per task, this implies a total cost of $1.8 million. Double either the number of legal entities or the average cost per task and the total climbs to $3.6 million. And these are annual recurring costs.
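For readers who want to play with the assumptions, here is a minimal sketch of the arithmetic above. The request counts, tasks per request and the $100 per-task cost are simply the illustrative figures from the text, not benchmarks.

```python
# Back-of-the-envelope cost model for the arithmetic above.
# All inputs are the illustrative assumptions from the text, not benchmarks.

ENTITY_REQUESTS = 4        # local file requests about the legal entity itself
TRANSACTION_REQUESTS = 16  # local file requests answered once per transaction
TASKS_PER_REQUEST = 5      # scope, collect, review, analyze, write up

def documentation_cost(entities, transactions_per_entity, cost_per_task):
    """Total tasks and cost if every request for every entity is its own micro-task."""
    requests = ENTITY_REQUESTS + TRANSACTION_REQUESTS * transactions_per_entity
    tasks = requests * TASKS_PER_REQUEST * entities
    return tasks, tasks * cost_per_task

tasks, cost = documentation_cost(entities=100, transactions_per_entity=2, cost_per_task=100)
print(tasks, cost)  # 18000 tasks, 1800000 dollars

# Doubling either the entity count or the per-task cost doubles the total.
print(documentation_cost(200, 2, 100)[1])  # 3600000
print(documentation_cost(100, 2, 200)[1])  # 3600000
```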
This is clearly a somewhat artificial exercise – I don’t know of any company that approaches its global documentation as a series of thousands of individual micro-tasks. Instead, companies aggregate micro-tasks with common attributes into more manageable groups.
HOWEVER, IT IS IMPORTANT TO THINK ABOUT TASKS AT A VERY DETAILED MICRO-TASK LEVEL BECAUSE HOW THESE MICRO-TASKS ARE GROUPED TOGETHER CAN HAVE A SIGNIFICANT IMPACT UPON BOTH QUALITY AND COSTS.
Companies often default to grouping tasks by country simply because separate reports are published for each country. But the publication of reports is only one subset of tasks; the more numerous and costly micro-tasks involve scoping, data collection and analysis. And there is no reason to expect such data collection and analysis tasks to be most efficiently organized by country – for example, the collection of intercompany agreements and/or APAs/tax rulings is often best treated as a single task rather than one that is carried out separately by each legal entity.
Perhaps the most important questions to ask when aggregating micro-tasks into common groups are (1) whether there are common elements that allow them to be carried out together (e.g., the information is obtained from the same source or using the same approach to data collection) and (2) whether the selected approach to grouping allows for efficient scaling – i.e., minimizing the incremental cost of adding another micro-task to the same group. Using the collection of intercompany agreements as one example, once a collection strategy has been developed (e.g., asking the legal department for all agreements; sending out a survey to local controllers asking for the agreements in their possession), the cost of collecting 100 agreements may not be all that different from the cost of collecting 500 or 1,000 agreements. (Reading or summarizing the agreements is obviously a different matter.) Similarly, if a company has a business model that uses low-risk distributors with analogous functions, assets and risks, once a FAR has been prepared for one entity, it can presumably be shared with all low-risk distributors, whether there are 10, 20 or 100 such entities. This is not to say that the incremental cost of adding an additional entity will be zero – a survey procedure may be needed to verify the accuracy of the description for each of the legal entities that uses it – but the incremental cost is likely to be much lower than if the task of preparing the FAR were organized by country or by legal entity.
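The scaling point can be made concrete with a small sketch. The fixed and incremental cost figures below are made-up placeholders chosen only to show the shape of the argument (linear growth versus a one-time cost plus a small per-entity increment); they are not estimates of actual fees.

```python
# Illustrative comparison of the two grouping strategies discussed above.
# The dollar figures are hypothetical placeholders, not estimates of real costs.

def per_entity_cost(entities, cost_per_far=5_000):
    """Each legal entity gets its own FAR, prepared from scratch."""
    return entities * cost_per_far

def centralized_cost(entities, one_time_far=5_000, verification_per_entity=300):
    """One shared FAR plus a light verification survey for each additional entity."""
    return one_time_far + entities * verification_per_entity

for n in (10, 20, 100):
    print(f"{n:>3} entities: per-entity ${per_entity_cost(n):,} vs centralized ${centralized_cost(n):,}")
# The per-entity approach grows linearly with the entity count;
# the centralized approach barely moves once the shared FAR exists.
```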
Developing an appropriate strategy for aggregating micro-tasks is also likely to contribute to more consistency and better quality. Even if the same low-risk business model is applied to each of 100 distributors, preparing a separate FAR for each country will almost certainly lead to differences and inconsistencies among the different FARs, and may often result in contradictory statements. Centralizing the process of developing the FAR from the outset as a single task, and then treating the process of publishing it/incorporating it into discrete documentation reports as a separate task, should minimize the risks of unintended variations in the underlying FAR.
THINKING ABOUT HOW TO ORGANIZE MICRO-TASKS INTO COMMON TASKS THAT ALLOW FOR EFFICIENT SCALING HAS ANOTHER ADVANTAGE AS WELL.
It reduces the need to cut corners by simply not documenting certain legal entities. As a general rule, tax authorities in today’s post-BEPS world expect taxpayers to affirmatively document the arm’s length nature of their transfer prices on a contemporaneous basis. If the process of preparing global documentation is such that each legal entity is a new and costly task, then doubling the number of entities that are documented doubles the cost. However, if a large share of the global documentation is prepared in a way that minimizes incremental costs, then it becomes possible to prepare credible (even if not “complete” or “perfect”) documentation for additional entities without a linear increase in cost. More generally, a well-thought-out process for collecting and analyzing data should also allow that data to be repurposed for other tasks, such as planning or risk evaluation.
So can software – which is designed to perform repetitive tasks efficiently – obviously help? Or not? As will be discussed in a later blog post, the answer to both questions is yes.
Dr. Chandler has worked as an applied economist since 1978 and has been actively involved in transfer pricing analysis since the mid-1980s. He has substantial experience in assisting taxpayers at all levels of transfer pricing controversy, planning and compliance, including at audit and appeals in the United States and by providing expert testimony in Tax Court.