Why do we migrate data?

We typically migrate data for several reasons: to replace ageing or end-of-life hardware, to consolidate data onto fewer devices, or to move data to faster or cheaper disks.

Migration Planning

Migrating data is not a trivial process. It needs to be planned out carefully, as it's essential that loss of access to data is kept to a minimum, and critical that you don't lose any data in the process. Here is a suggestion on how to set up a data migration plan.

  1. Analyse the Source data
    What do you need to migrate? How will it fit on the target devices? Do you need to consolidate data, or just move like for like? Are there any problems with the existing data before you start? If you don't check and record existing problems, you will be blamed for them later.
  2. Design a Migration Plan
    What type of migration will you use: a 'Big Bang' disruptive approach, or a slower, non-disruptive approach? A large part of this decision will depend on how much downtime your customers can tolerate. You must also design a test plan to make sure all is well after the migration.
  3. Build the Code
    Exactly what this means will depend on what solution you use. In Mainframe terms it means building the batch jobs and JCL; in the Windows and Linux worlds it would mean building the command scripts. In my experience, you break the data down into subsets and migrate a part at a time, usually overnight or at weekends. You also need to validate these jobs or scripts by running them in simulation mode, to make sure the syntax checks out.
  4. Test
    Now you are nearly ready to go, but first you need to pick out some less important data and run the jobs for real, to make sure everything is working correctly. Exactly what counts as test data will be your choice. Don't forget, you need a backout plan so you can get back to the original if everything goes pear-shaped. So, test your backout plan here too! Don't leave anything to chance.
  5. Do it!
    Follow your plan and migrate the data.
  6. Check and test
    Did all the data move correctly? Are you certain you are running from the new disks? Run some high I/O work to make sure the data is performant in its new home. Keep checking performance for several days after migration.
  7. Clean up
    Wipe the data off the old disks to your company's data protection standards.
  8. Celebrate
    If it all works as expected, have a party!

Windows Migration

Migrating data from an old PC or laptop is usually straightforward, provided the old machine is still operating. If you use OneDrive, then your data should be accessible from the new PC once you sign in. OneDrive is already installed on Windows 10, and a free Microsoft account, which requires an email address, comes with 5 GB of OneDrive storage.
If you don't use OneDrive, then it is easy to just copy your files to an external drive; a large USB drive is very suitable. Plug the drive into your old PC, copy the files onto it, move it to the new PC, then copy the files onto the new computer. This method won't copy settings and installed programs; you will need to re-enter your settings and re-install your programs manually.
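For the copy-to-external-drive approach, a minimal command-line sketch on a Linux machine follows (Windows users would typically drag and drop in Explorer instead); the /media/usb mount point is an assumption.

```shell
#!/bin/sh
# Copy the user's documents to an external USB drive.
# /media/usb is a hypothetical mount point for the external drive.
mkdir -p /media/usb/Documents
cp -a "$HOME/Documents/." /media/usb/Documents/

# Verify the copy before unplugging the drive.
diff -r "$HOME/Documents" /media/usb/Documents && echo "copy verified"
```

The verification step matters: always confirm the copy is complete before wiping or retiring the old machine.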

If you are upgrading equipment for your company, then you will also need to copy over parameter files like email setups and browser data, and all installed programs. There are many tools on the market to help with this; Mulesoft, Ubackup and EaseUS are three examples.

If you need to migrate a complete physical Windows server, including its files and configuration, to another server, then one option is to use the Microsoft Storage Migration Service.

Disruptive Maintenance

What you could call the traditional way to migrate data is to shut all your applications down so that nothing is accessing the data, copy the data to the new disks, point your applications to the new disks, then restart your applications and test. We used to do this with quite critical business systems maybe 20 years ago, when we could take the applications down for a weekend, but generally speaking this is not practical today. Businesses require their applications to be up 24/7, or as close as possible, and data sizes are so big that migration takes a considerable time. However, the requirement to move data remains, so various tools have been developed to make migration as non-disruptive as possible.

Hardware Data Migration tools

Two different types of tool exist: Hardware Virtualisation and Disk Mirroring.

Most devices now virtualise data, so that the 'disk' you see in Explorer is actually carved up and spread over several physical disks. Redundant copies of the data may also be held for resilience. The devices have the ability to move data around internally without affecting applications, and so free up older disks.

One example of this is the HDS VSP device, which can virtualise external disk subsystems behind a VSP controller. Once an old subsystem is virtualised, the data can be moved transparently to the VSP-owned disks, or to another virtualised external disk subsystem. While this is best done at a quiet time to avoid possible performance issues, the migration can happen with no impact at all on running applications. IBM's SVC virtualisation controller works in much the same way; both systems will virtualise and migrate Open Systems data 'on the fly'. The difference is that the SVC does not support mainframe data at all, while the VSP does support mainframe data, but its initial virtualisation process is disruptive.

Just about every major disk subsystem has a facility to remotely mirror data to another subsystem, though usually both subsystems need to be from the same manufacturer. You can use this to shortcut the disruptive migration option. You set up mirrors between the two multi-terabyte subsystems and leave them for some time to complete the mirroring process. Once the mirrors are synchronised, you stop your applications so that no more updates happen to the old subsystem, stop mirroring and convert the target mirror so it can be used for primary data (the Remote Mirroring section explains this), then point your applications at the target disks and restart them. The non-disruptive mirroring part can take several days; the disruptive swap-over takes just minutes.

Mainframe Software Data Migration tools

Software tools exist to transparently migrate disks between mainframe subsystems. Two examples are FDRPAS and TDMF, both of which have their own pages on this site.

However, all of the options above move disks, not data. That is, at the end of the move the data that was on one disk will still be on one disk, so none of these will help if you want to consolidate disks. Two mainframe migration tools exist to help here: FDRMOVE and zDMF. These products move datasets. If they can get exclusive use of a file, they will just move the data. If they cannot get exclusive use, they will copy the data and keep the copy up to date while waiting for the ENQ to be released, so they can swap the file to the new location. It may be necessary to stop applications briefly for this to happen.

Deleting the data

OK, so you have moved all your data off your old disks successfully, and you are ready to power them down and ship them off to the scrap yard. Before you can do that, you need to wipe all the data off them so it does not fall into criminal hands. FDR Erase is a product that can do this for you on mainframe disks.

If you want to securely wipe data from your laptop, then many virus protection applications have a 'shred it' facility; one example is the AVG Internet Security File Shredder. However, you should be aware that this might not be as foolproof as it looks. If you use Volume Shadow Copy to take regular snapshots of your data, so you can recover something that you accidentally deleted or overwrote, then shredding will securely shred the live file, but all the old backups still exist, and it is easy enough to get the file back again from them. This applies to any backup product, of course: shredding will destroy the live or active file, but the backup versions will exist until they are rolled off.
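On Linux, the same shredding idea is available from the command line with the coreutils shred tool. A minimal sketch follows; the file name is hypothetical, and the same caveat applies: snapshots and backups of the file are untouched.

```shell
#!/bin/sh
# Securely overwrite a file, then delete it.
# 'secrets.txt' is a hypothetical file name.
# -n 3 : overwrite the contents three times with random data
# -u   : truncate and remove the file afterwards
shred -n 3 -u secrets.txt
# Note: this destroys only the live file - any snapshots or backup
# copies of it still exist elsewhere, as described above.
```

Also note that on journalling or copy-on-write filesystems, overwriting in place is not guaranteed to destroy every on-disk copy, so shred is best treated as a convenience rather than a certified erasure tool.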
