To rec or not to rec – a few questions I asked myself when penning this article about cutover reconciliation:
How important is it to reconcile your data before go live? – Very!
What is reconciliation? – Reconciliation is the process of verifying that the data being migrated from various source systems is consistent with, and matches, the data in the target system – a function of Data Governance. Where valid inconsistencies occur, they are justified and approved (a minimal sketch of such a check follows after these questions).
What is the business value of this exercise? – Streamlined and improved running of business functions and processes, and realisation of the target system's benefits. The business reaps the rewards of the new system and its data.
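To make the idea concrete, here is a minimal reconciliation sketch in T-SQL. The database, table and column names (legacy_db, target_db, Trades, Amount) are hypothetical placeholders for illustration, not from any real engagement:

```sql
-- Compare row counts and a control total between the legacy (source)
-- and new (target) systems; any mismatch is flagged for investigation.
-- All object names below are hypothetical.
SELECT
    src.RowCnt AS SourceRows,
    tgt.RowCnt AS TargetRows,
    src.AmtSum AS SourceTotal,
    tgt.AmtSum AS TargetTotal,
    CASE WHEN src.RowCnt = tgt.RowCnt AND src.AmtSum = tgt.AmtSum
         THEN 'MATCH' ELSE 'INVESTIGATE' END AS Status
FROM (SELECT COUNT(*) AS RowCnt, SUM(Amount) AS AmtSum
      FROM legacy_db.dbo.Trades) AS src
CROSS JOIN
     (SELECT COUNT(*) AS RowCnt, SUM(Amount) AS AmtSum
      FROM target_db.dbo.Trades) AS tgt;
```

A result other than 'MATCH' does not necessarily mean failure – as noted above, valid inconsistencies can exist, but they must be justified and approved.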
Often as a consultant, you encounter a challenge on a client's project that you have not had to deal with before. You quickly do your research, or reach out to thought leaders in your field, to find solutions.
Such was the case a number of years ago, when I was engaged as a cutover lead by a multinational company. The requirement was to ensure the data migration was successful. What does successful even mean?
This was a fast-paced and highly regulated environment, and I was brought in after it became apparent that a cutover manager was required to ensure all the migrated data was as expected, to fulfil regulatory compliance. Reconciliation scripts needed to be developed, and a good understanding of both the legacy and the new system gained, to ensure the queries were checking the right fields and the target system was showing the correct information – all within a very limited timeline. Thankfully, this was in an industry where I had substantial exposure. Sometimes we are expected to pull the rabbit out of the hat with the benefit of industry knowledge and experience; normally, delivering this reconciliation within such a timeframe would be neither realistic nor advisable. Also, this was early in my consulting career and I was very optimistically naive! This was a failing project that needed saving. It was failing because it could not and would not satisfy the regulatory checks and balances in its then-current state, although everything else seemed to be on track!
In such challenging situations, falling back on tried and tested methods and skipping innovation works. I enjoy innovating, but I didn't have time to build from scratch, so I repointed my pre-existing reconciliation scripts at the right fields and performed the reconciliations. To complicate things, the reconciliations had to be done across different servers, which meant querying across linked servers – a way of querying data that resides on multiple SQL Server instances. The linked servers had to be created following various bureaucratic processes.
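For readers unfamiliar with linked servers, the setup and a cross-server query look roughly like this. The server, host and database names are hypothetical, and in a regulated environment the registration itself would typically be done by a DBA:

```sql
-- Register a remote SQL Server instance as a linked server
-- (hypothetical names; a DBA would normally run this).
EXEC sp_addlinkedserver
     @server     = N'LEGACY_SRV',
     @srvproduct = N'',
     @provider   = N'SQLNCLI',        -- or MSOLEDBSQL on newer builds
     @datasrc    = N'legacy-host.corp.local';

-- Once registered, four-part names let a single query span both instances:
SELECT l.TradeId, l.Amount AS LegacyAmount, t.Amount AS TargetAmount
FROM   LEGACY_SRV.legacy_db.dbo.Trades AS l
JOIN   target_db.dbo.Trades AS t
       ON t.TradeId = l.TradeId
WHERE  l.Amount <> t.Amount;          -- rows that fail reconciliation
```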
There were times the reconciliation scripts would run as quick as a flash, and other times they took frustratingly long hours. The data engineers blocking my queries to prioritise others didn't help much, either. It wasn't until I spoke to a knowledgeable data engineer that I realised what was going on: no matter how hard I tried, how brilliant the code was, or how much I tweaked it down to a lite version, the compute and memory I needed to run my queries were simply not available. This was turning out to be an impossible task. As an oversight, I had failed to liaise with the data engineering stakeholders; I believed everyone on the project understood the importance of reconciling the data prior to go live. I was wrong! I therefore had to engage the key stakeholders, make the case, and communicate the impact of a delay to the reconciliation effort. This was especially important as the auditors were present in the room to ensure that the data was compliant. Any issues raised by the auditors needed to be justified by the data.
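That lite version was essentially a matter of trading a row-by-row comparison for aggregate fingerprints, which move far less data across a linked server. A sketch of the idea, again with hypothetical names:

```sql
-- "Lite" reconciliation: rather than pulling every row across the
-- linked server for a row-by-row diff, compare aggregates per partition
-- (here, per business date) and return only the mismatches.
SELECT
    COALESCE(s.BusinessDate, t.BusinessDate) AS BusinessDate,
    s.RowCnt AS SourceRows,  t.RowCnt AS TargetRows,
    s.AmtSum AS SourceTotal, t.AmtSum AS TargetTotal
FROM (SELECT BusinessDate, COUNT(*) AS RowCnt, SUM(Amount) AS AmtSum
      FROM LEGACY_SRV.legacy_db.dbo.Trades
      GROUP BY BusinessDate) AS s
FULL OUTER JOIN
     (SELECT BusinessDate, COUNT(*) AS RowCnt, SUM(Amount) AS AmtSum
      FROM target_db.dbo.Trades
      GROUP BY BusinessDate) AS t
  ON t.BusinessDate = s.BusinessDate
WHERE s.RowCnt IS NULL OR t.RowCnt IS NULL
   OR s.RowCnt <> t.RowCnt OR s.AmtSum <> t.AmtSum;
```

Only the dates that disagree come back, and a full row-level diff can then be run on those dates alone.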
It didn't take long to get the resources I needed, and this obstacle was quickly overcome. The reconciliation was performed and approved, and the go-live approval was finally obtained. A runbook was also created to ensure the reconciliation process could be replicated over time.
Hooray! My job was done… but what happens if data is migrated without reconciliation? That is the danger zone for any organisation. You see, the data engineers on the project above had the job of designing and building the new data warehousing tool and making sure the data was loaded for business use – but not so much its completeness, correctness, quality or reasonableness. That is the remit of another function (more on that in my Data Governance series post).
The business was excited about the new tool/platform that promised smooth and efficient running of their operations, and had assumed data quality was built into the migration process carried out by the data engineers. This is akin to asking a student to mark their own work. It ended up costing the business crucial time and money. I went in as a resource that was not costed into the original project budget but should have been – and time would have been saved!
This was years ago; however, similar issues still persist. Indeed, I recently went into a data governance project where the business, keen to position itself as a data-driven industry insight leader, had migrated its data and then run into issues, later realising the need for a data governance initiative working alongside the migration project. The problem, however, was the stakeholders' lack of appreciation for what their data, or data governance, actually was. The business wanted to realise value from its data but did not value the data or the processes involved. The view was that data governance is a tool-dependent initiative that would resolve the data issues, which in fact stemmed from the data processing, the data model and the data migration (more in an upcoming post).
When reconciliation and data governance are not part of the data migration, the data project fails. Once the data has been migrated, the business cannot reconcile its data, especially when access to the legacy system is no longer available. More time is then spent on remediation and on convincing the business users of the benefits of the target platform.
In cases where access to the legacy system is still available, you find businesses paying to maintain two platforms, with the data still a challenge for the wider business to understand – diminishing any value or benefit of implementing the new tool/platform. It is always mandatory to perform data checks, and approve those checks, before migrating your data. This should involve all business domains, to ensure the data is as expected for each domain and issues are resolved prior to migration. A dry run or dress rehearsal is a useful tool to help the various domains assess the final migrated data.
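A dress-rehearsal check per domain can be as simple as a set difference between the two systems. As before, the object names are hypothetical:

```sql
-- Dress-rehearsal check: rows present in the legacy system but missing
-- or different in the target, listed per business domain.
-- An empty result is the evidence a domain needs in order to sign off.
SELECT Domain, TradeId, Amount
FROM   LEGACY_SRV.legacy_db.dbo.Trades
EXCEPT
SELECT Domain, TradeId, Amount
FROM   target_db.dbo.Trades
ORDER BY Domain, TradeId;
```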
Finally, the time taken to engage the business stakeholders is far less costly than the data debt incurred post-migration when no data governance is in place.

