Reporting from transactional detail is the most flexible approach possible, but it is also the most expensive because of the data volumes involved. Our approach has been to test whether we could use transactional data to produce our reports. In Step 4 of estimating metric engine processes, we determined that it is not possible.
Reducing processing costs is possible with a posting process. Posting processes eliminate unneeded transaction detail, thus reducing data volumes. The simplest form of posting process is plain data aggregation: if all the transactions are summarized by some set of fields used to produce a number of reports, the cost of a one-time expensive aggregation might be recouped across the production of those reports, as sketched below.
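As a minimal sketch of such an aggregation in Python (the field names account, period, and amount are hypothetical illustrations, not from the post):

```python
from collections import defaultdict

def summarize(transactions, keys):
    """Aggregate transaction amounts by the given key fields.

    transactions: iterable of dicts, each holding the key fields plus 'amount'.
    keys: tuple of field names to group by.
    Returns a dict mapping key tuples to summed amounts.
    """
    totals = defaultdict(float)
    for txn in transactions:
        group = tuple(txn[k] for k in keys)
        totals[group] += txn["amount"]
    return dict(totals)

# Many detail rows collapse into a few summary rows.
detail = [
    {"account": "4000", "period": "2023-01", "amount": 10.0},
    {"account": "4000", "period": "2023-01", "amount": 5.0},
    {"account": "5000", "period": "2023-01", "amount": 7.5},
]
summary = summarize(detail, keys=("account", "period"))
# {('4000', '2023-01'): 15.0, ('5000', '2023-01'): 7.5}
```

Reports built from the two summary rows cost far less to produce than reports that must re-read every detail row.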
A true posting process, unlike simple aggregation, summarizes only the new transactions that have arrived since the last posting run, adds those results to the master file, and repeats this consistently over time.
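Building on the summarize sketch above, an incremental posting run might look like this (again, the names are hypothetical):

```python
def post(master, new_transactions, keys):
    """Fold newly arrived transactions into the running master file totals.

    Only the new transactions are aggregated; their summarized amounts are
    added to the balances the master file already carries.
    """
    for group, amount in summarize(new_transactions, keys).items():
        master[group] = master.get(group, 0.0) + amount
    return master

# First run: seed the master file from the initial batch of detail.
master = summarize(detail, keys=("account", "period"))

# Later run: only the new batch is aggregated and folded in.
new_batch = [{"account": "4000", "period": "2023-02", "amount": 2.0}]
master = post(master, new_batch, keys=("account", "period"))
```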
In some cases, not all reports can be produced from one master file; multiple master files, each aggregated by a different set of fields, are needed to reduce data volumes enough to produce the reports in a reasonable time at a reasonable cost. Multiple master files increase reconciliation effort, as illustrated below; yet if the alternative is not having the needed information, or a computing environment that costs more than the reconciliation does, this might be the best route.
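A small sketch of the idea, reusing the hypothetical detail rows above: each master file is just a different summarization of the same detail, so they must all tie back to the same grand total, and that check is the core of the reconciliation work.

```python
# Two master files, each aggregated by a different key set.
by_account = summarize(detail, keys=("account",))
by_period = summarize(detail, keys=("period",))

# Reconciliation check: both masters must carry the same grand total,
# since both were derived from the same transaction detail.
assert abs(sum(by_account.values()) - sum(by_period.values())) < 1e-9
```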
This last step entails producing sample master files to be used in the reporting process, and then repeating the Step 4 tests against those additional files.
This is Episode 138 of Conversations with Kip, the best financial system vlog there is.