Best practices for cut-over planning

Each cut-over requires a leap of faith. You are letting go of your legacy system and embarking on a journey with the new one. There is no parallel run. You cannot go back. The only way is to move forward. It is like jumping off a cliff and hoping your parachute opens. System cut-overs certainly feel that way. For such risky propositions, preparation is the key to success.

Cut-over preparation starts with Conference Room Pilots (CRPs), which were covered in an earlier blog on go-live readiness. Each CRP gives you a chance to test the cut-over process. Start with a blank piece of paper and make a list. Write down each step you need to perform. Make it detailed. Do not simplify. Verify the sequence. Identify the dependencies. Data migration tasks will make up most of the list, and don't forget to incorporate a validation process for the data migration. The list should also cover configurations, security settings, and even tasks that need to be completed in external systems. This list, also referred to as a cut-over checklist, will eventually contain hundreds of lines.
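
One practical way to keep the checklist honest is to treat it as structured data rather than free text, so the sequence, dependencies, and timings can be checked automatically. Below is a minimal sketch in Python; the task names, owners, dependency IDs, and durations are hypothetical placeholders, not taken from any specific project.

```python
# Hypothetical sketch of a cut-over checklist kept as structured data.
# Task names, owners, dependencies, and durations are illustrative only.
checklist = [
    {"id": 1, "task": "Freeze master data entry in legacy system", "owner": "Ops",  "depends_on": [],  "est_minutes": 30},
    {"id": 2, "task": "Extract customers from legacy system",      "owner": "IT",   "depends_on": [1], "est_minutes": 45},
    {"id": 3, "task": "Load customers into new system",            "owner": "IT",   "depends_on": [2], "est_minutes": 60},
    {"id": 4, "task": "Validate customer record counts",           "owner": "Data", "depends_on": [3], "est_minutes": 20},
]

# Verify the sequence: every dependency must appear earlier in the list.
ids_seen = set()
for step in checklist:
    missing = [d for d in step["depends_on"] if d not in ids_seen]
    assert not missing, f"Step {step['id']} depends on later or unknown steps: {missing}"
    ids_seen.add(step["id"])

# Time each step so the total cut-over window is known up front.
total_minutes = sum(step["est_minutes"] for step in checklist)
print(f"{len(checklist)} steps, roughly {total_minutes / 60:.1f} hours end to end")
```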

As you complete each CRP, you will soon realize that some of these cut-over tasks can be done earlier. A typical best practice is to load your master data before the go-live. For example, you can upload your items, customers, vendors, and warehouses days before the go-live. You can even allow dual entry of this master data into both the legacy and new systems during this short window. Identifying and executing such tasks before the actual go-live helps you on two fronts. First, it minimizes risk. Second, it saves you time.
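
If you do allow a dual-entry window, a lightweight reconciliation helps confirm that the early-loaded master data still matches the legacy system. The sketch below simply compares record counts per entity between the two systems; the entity names and counts are hypothetical placeholders for whatever your extracts actually return.

```python
# Hypothetical reconciliation during the dual-entry window: compare record
# counts per master data entity between the legacy and new systems.
legacy_counts = {"items": 12450, "customers": 3810, "vendors": 620, "warehouses": 14}
new_counts    = {"items": 12450, "customers": 3808, "vendors": 620, "warehouses": 14}

for entity, legacy_n in legacy_counts.items():
    new_n = new_counts.get(entity, 0)
    status = "OK" if new_n == legacy_n else f"MISMATCH ({legacy_n - new_n} missing)"
    print(f"{entity:<11} legacy={legacy_n:>6} new={new_n:>6}  {status}")
```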

Most people underestimate the time it takes to complete a successful end-to-end cut-over. If executing your checklist ends up taking a few days, your cut-over window may not be long enough. Thus, it is important to time each step in the cut-over checklist. Once you identify the tasks that take the most time, focus on finding ways to speed them up. For instance, if a massive data load is the bottleneck, break the data into multiple pieces and upload them in parallel. If you have tens of thousands of on-hand inventory values to upload, you can split them by location group and load multiple files simultaneously. You can also try a net-change approach: load the bulk of the data before the go-live, then incrementally load only the net changes until you go live.
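
As a sketch of the parallel-load idea, the Python snippet below fans the per-location-group inventory files out across worker threads instead of pushing one large file serially. The file names, location groups, and the upload_file stub are hypothetical; in practice the stub would call whatever bulk-import mechanism your new system provides.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed
from pathlib import Path

def upload_file(path: Path) -> str:
    # Placeholder for whatever bulk-import mechanism your new system offers
    # (API call, data management job, import queue). Hypothetical stub.
    print(f"uploading {path.name} ...")
    return path.name

# On-hand inventory split into one file per location group ahead of time.
files = [Path(f"onhand_{group}.csv") for group in ("east", "central", "west", "overseas")]

# Upload the location-group files in parallel rather than one big file serially.
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = {pool.submit(upload_file, f): f for f in files}
    for future in as_completed(futures):
        print(f"finished {future.result()}")
```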

If you are interested in learning more, please connect with me on LinkedIn, follow me on Twitter, or watch me on YouTube.

My name is Cem and this has been another gem. 
