As I approach two years of my vlog, I thought reflecting on its purpose might be helpful. I’ve been working on how to improve financial reporting for 30 years this fall, since I took Eric Denna’s first college course at BYU, which introduced me to Bill McCarthy’s Resources, Events and Agents (REA) model. I’ve written a textbook about what I’ve learned, “Balancing Act: A Practical Approach to Business Event Based Insights.” This week’s episode attempts to reduce those 362 pages to a couple of paragraphs here, or 4 minutes in the video.

We would have ultimate flexibility in financial reporting if we could produce everything directly from transactional data. However, maintaining time perspectives from raw transactions would be prohibitively expensive, if it is possible at all. Thus we have always used posting processes, thousands of them in almost every business system, to aggregate transactions and maintain these views. Yet these posting processes drop selected transactional attributes.
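To make the trade-off concrete, here is a minimal sketch of a posting process. The data, field names, and account number are illustrative assumptions, not from any real system; the point is only that aggregation discards attributes.

```python
from collections import defaultdict

# Illustrative transactions; txn_id and customer are the attributes
# that will not survive posting.
transactions = [
    {"txn_id": 1, "account": "4000-Sales", "customer": "Acme", "amount": 120.00},
    {"txn_id": 2, "account": "4000-Sales", "customer": "Bolt", "amount": 80.00},
    {"txn_id": 3, "account": "4000-Sales", "customer": "Acme", "amount": 50.00},
]

def post(txns):
    """Aggregate transactions into account-level balances.

    Note what is lost: txn_id and customer are dropped, so the
    resulting balance can no longer explain *why* it is 250.00.
    """
    balances = defaultdict(float)
    for t in txns:
        balances[t["account"]] += t["amount"]
    return dict(balances)

print(post(transactions))  # {'4000-Sales': 250.0}
```

The balance is cheap to store and query, but the "why" behind it is gone the moment the posting runs.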

The daily financial cycle drives our aggregation posting processes. The originating systems typically consume the first part of the overnight processing cycle, and those systems are almost always partitioned by product, geography, or customer. After that, the enterprise financial perspective must be updated, with results available by the next day's business opening. That means bringing together all the partitioned data, every transaction for the company, in a very short period of time.

This need drives the General Ledger to be highly aggregated. The GL by and large gives us the right answer, but increasingly it does not explain why that is the answer. The why requires the transactional attributes dropped in the earlier posting processes. So we produce the General Ledger, and then later in the day, using different computing logic, the data warehouse becomes available with more detail. The difference in posting logic increases reconciliation requirements, adding warehouse reconciliations on top of the existing GL-to-source-system reconciliations.

The alternative is to radically change the posting process. Posting to customer/vendor contracts increases the master file detail dramatically, creating what I call an Instrument Ledger, but it still eliminates enough transactional data to allow cost-effective processing within the required windows. Doing so allows many more perspectives to be produced from the same master file, consolidating Data Supply Chains. The ability to query the Instrument Ledger and pivot the posted balances to other customer/vendor and contract attributes enables a Metric Engine: something as simple to use as a Search Engine, but which creates the required aggregated balances.
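A minimal sketch of the Instrument Ledger idea follows. The ledger rows, attribute names, and balances are assumptions for illustration: balances are posted at the contract level with customer and other attributes retained, and a Metric Engine query is then just a re-aggregation, a pivot, by whichever attribute the question requires.

```python
from collections import defaultdict

# Balances posted at contract (instrument) level rather than account
# level, so customer and region survive the posting process.
instrument_ledger = [
    {"contract": "C-101", "customer": "Acme", "region": "East", "balance": 170.0},
    {"contract": "C-102", "customer": "Bolt", "region": "West", "balance": 80.0},
    {"contract": "C-103", "customer": "Acme", "region": "West", "balance": 40.0},
]

def pivot(ledger, attribute):
    """Re-aggregate posted balances by any retained attribute."""
    totals = defaultdict(float)
    for row in ledger:
        totals[row[attribute]] += row["balance"]
    return dict(totals)

print(pivot(instrument_ledger, "customer"))  # {'Acme': 210.0, 'Bolt': 80.0}
print(pivot(instrument_ledger, "region"))    # {'East': 170.0, 'West': 120.0}
```

One master file, many perspectives: the same posted balances answer customer, regional, or contract questions without a separate warehouse posting run, which is the consolidation of Data Supply Chains the paragraph above describes.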

This is Episode 111 of Conversations with Kip, the best financial system vlog there is.