Data Quality Questions

Dillon Pappenfuss of Financial Executives International asked me a set of questions on data quality. In the videos below, I give my answers to his questions.

The answers to all of Dillon's questions are contained in this first video. Individual answers are shown in each of the videos below.

How do the relationships between people, technology, and processes impact data quality management?

I had a support call years ago where a system had gone down. I sensed a bit of panic in the company support person. I took a moment and asked, "Lynn, have we lost any data?" She paused and said, "No." I responded, "OK, then no matter what the problem is, we can solve it. Loss of data is the only thing we can't solve."

Do your technology, people, and processes value your data? If you use an operating system written in such a way that your data gets dropped, then the people you rely upon for your operating system don't value your data. The same is true of your processes and of everyone else involved with your data.

This is Episode 244 of Conversations with Kip, the best financial system vlog there is.

What are some of the ways that poor data quality negatively impacts an organization's operations?

Finance as a function is based upon data. Financial reporting failures can often be seen as data quality failures. If the financial statements do not represent what happened in the company, if the company measures the wrong things to assess its longevity, if the data gives improper perceptions of activity and progress, all of that is an impact of poor data quality.

This is Episode 245 of Conversations with Kip, the best financial system vlog there is.

What are the aspects of a culture conducive to high-quality data?

Intelligent people typically understand the importance of data. I love this quote from Alfred W. Crosby in "The Measure of Reality: Quantification and Western Society, 1250-1600," which explores why Western Europe advanced dramatically in those years. His premise is that they discovered the importance of data.

In practical terms, the new approach [developed in those years] was simply this: reduce what you are trying to think about to the minimum required by its definition; visualize it on paper, or at least in your mind, be it the fluctuation of wool prices at the Champagne fairs or the course of Mars through the heavens, and divide it, either in fact or in imagination, into equal quanta. Then you can measure it, that is, count the quanta.

Then you possess a quantitative representation of your subject that is, however simplified, even in its errors and omissions, precise. You can think about it rigorously. You can manipulate it and experiment with it, as we do today with computer models. It possesses a sort of independence from you. It can do for you what verbal representation rarely does: contradict your fondest wishes and elbow you on to more efficacious speculation. It was quantification, not aesthetics, not logic per se, that parried Kepler’s every effort to thrust the solar system into a cage of his beloved Platonic solids and goaded him on until he grudgingly devised his planetary laws.

Alfred W. Crosby, “The Measure of Reality: Quantification and Western Society, 1250-1600,” Cambridge University Press, Pages 238-239

Do your people believe that? Do the people who provide you a cloud environment believe that? Do they value your data like you do? Or do they write sloppy code that loses data periodically? Do your people who adjust and correct the data do so half-heartedly, trying to make it merely good enough, instead of realizing that the tiny differences between Kepler's observations and his predictive algorithms are what made him redo his calculations scores of times before he perfected them?

Beliefs like Crosby's create a culture of high-quality data. Other beliefs may not.

This is Episode 246 of Conversations with Kip, the best financial system vlog there is.

How can companies best train finance employees to become fluent in data quality?

Finance was the original data steward of the organization. In some respects, finance has excellent data quality processes; but in many ways, finance chose to limit the scope of its influence by focusing only on the chart of accounts and the general ledger, leaving a host of enterprise data adrift in the organization.

One finance pattern can be usefully replicated to increase data quality: consistently capture; consistently use, through exposure in reports; and consistently allow for correction. Data exposure cleans up data faster than any other method: when data is exposed, its issues are identified more quickly.

But if no correction method is provided, the data remains damaged. Correction is critical to high quality data. Finance has consistently provided correction methods, and it has learned to do so by entering new transactions, which preserve the audit trail of the errors. These errors themselves can be analyzed to improve the data by eliminating the causes.

This is Episode 247 of Conversations with Kip, the best financial system vlog there is.

How can companies devise a coherent data strategy?

The most effective thing a company can do to devise a coherent data strategy is to challenge existing system architectures. As I explain in my white paper on financial data, because of limits in compute capacity when most of our systems were originally constructed, we accepted data duplication all over the enterprise. We have transactions and all sorts of balances, calculated in all sorts of places and stored in all sorts of environments, which makes reconciliation endemic.

Reconciliation processes are a data quality issue. Reconciliation cannot be fully automated, but it can be eliminated by challenging existing system architectures. Doing so will improve data quality most effectively.

>>> Related Post: A Major Driver of Finance Costs: Reconciliation <<<

This is Episode 248 of Conversations with Kip, the best financial system vlog there is.

How do you ensure that your entire organization has robust data controls?

Control can mean more cost and overhead, at the expense of speed. What is often needed instead is to connect data capture with data usage in some way, so that those who capture data have responsibilities in its usage. This helps ensure that data quality is naturally motivated within the organization.

This is Episode 249 of Conversations with Kip, the best financial system vlog there is.

How will innovations in data analytics impact financial reporting moving forward?

A very large change in financial data and financial systems is coming because of the impending shift to shared ledgers. Shared ledgers will reduce the costs of financial data maintenance, increase the quality of the data, reduce reconciliation, and dramatically increase transparency, simply because the cost of these functions will be borne by all parties to the transactions and accounts, not by each party individually, whose records must then be reconciled to the others'.

To understand more about these types of changes, visit Sharaledger.org, a collaborative organization established to explore these concepts more deeply and hasten the changes they will bring.

This is Episode 250 of Conversations with Kip, the best financial system vlog there is.
