Painless Data Migration Extract Transform Load to Data Warehouse

Painless data migration sounds like a fairy tale. Data migration is an issue that comes up in almost every ERP system implementation, and one that usually takes some time to explain to customers. You can make it easy, you can make it a huge, complicated thing, or you can sidestep it altogether.

An excellent white paper by Jet Reports explains how you can use an ETL (Extract, Transform, and Load) tool to populate a data warehouse, separate from your new ERP system, that houses the data from your legacy system. You can then use this data in your reporting.

This is such a great white paper I can’t really improve upon it. I just strongly encourage you to get it for yourself and read the whole thing. Here are some specific points directly from the white paper:

First the problem:

  • “Legacy data … can present a major problem when we are talking about a big switch to a different ERP system with a new data structure, new posting methodologies and new workflows.”
  • “The task of bringing forward old or outside data not only requires transforming it into a suitable format for the new system, but also posting that data. Because the process of posting also transforms the data, it can be very challenging to get the various data sets to gel into a functional whole.”
  • “Many businesses attempt to avoid this by importing beginning balances and open documents. This simplifies the process, but can leave businesses at a competitive disadvantage versus companies that have found a way to solve this problem with their history intact.”

“People love their old data not only because it is valuable, but because it provides context, and context equals reassurance in the new system.”

A solution:

Use an ETL tool to extract the data from your legacy system and store it in a data warehouse. Then use business intelligence and other reporting tools to work with that data.
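The white paper doesn’t prescribe a particular tool, so purely as a rough illustration, here is a minimal sketch of that extract-transform-load flow in Python. Everything in it is a hypothetical stand-in: the gl_entries and fact_gl tables, the column names, the legacy conventions (dates stored as YYYYMMDD strings, amounts stored as integer cents), and the in-memory SQLite databases that take the place of the real legacy system and warehouse.

```python
import sqlite3

# Hypothetical example: every table and column name here is invented.
# Extract rows from a legacy system, transform them into the warehouse's
# shape, and load them into a separate reporting database.

def extract(legacy_conn):
    """Pull posted transactions out of the legacy database."""
    cur = legacy_conn.execute(
        "SELECT doc_no, post_date, amount_cents FROM gl_entries"
    )
    return cur.fetchall()

def transform(rows):
    """Convert legacy conventions (integer cents, YYYYMMDD dates)
    into the warehouse's format (decimal amounts, ISO dates)."""
    out = []
    for doc_no, post_date, amount_cents in rows:
        iso_date = f"{post_date[:4]}-{post_date[4:6]}-{post_date[6:]}"
        out.append((doc_no, iso_date, amount_cents / 100.0))
    return out

def load(warehouse_conn, rows):
    """Insert the transformed rows into the warehouse fact table."""
    warehouse_conn.executemany(
        "INSERT INTO fact_gl (doc_no, post_date, amount) VALUES (?, ?, ?)",
        rows,
    )
    warehouse_conn.commit()

if __name__ == "__main__":
    # In-memory databases stand in for the real legacy and warehouse systems.
    legacy = sqlite3.connect(":memory:")
    legacy.execute(
        "CREATE TABLE gl_entries (doc_no TEXT, post_date TEXT, amount_cents INTEGER)"
    )
    legacy.execute("INSERT INTO gl_entries VALUES ('INV-001', '20230115', 125000)")

    warehouse = sqlite3.connect(":memory:")
    warehouse.execute(
        "CREATE TABLE fact_gl (doc_no TEXT, post_date TEXT, amount REAL)"
    )

    load(warehouse, transform(extract(legacy)))
    print(warehouse.execute("SELECT * FROM fact_gl").fetchall())
    # [('INV-001', '2023-01-15', 1250.0)]
```

A real ETL tool adds scheduling, logging, and incremental loads on top of this, but the shape of the work is the same: pull the data out, reshape it, and land it somewhere your reporting tools can reach.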

Steps to the process:

  • “Data extraction involves pulling data out of legacy systems, usually for further manipulation, before loading into the new system.”
  • “An ETL tool will use programmatic data cleansing rules that can be enforced easily and executed automatically. It will also allow you to create verification rules to make sure that history is accurate and that new data can also be verified.”
  • Data can be, “loaded programmatically into a reporting environment at the same time it is extracted from the legacy database.”
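To make the cleansing and verification point from the second bullet concrete, here is a hedged sketch of what such programmatic rules might look like. The rule names, fields, and the control-total check are all assumptions of mine for illustration, not anything the white paper specifies.

```python
# Hypothetical sketch of programmatic cleansing and verification rules.
# Each cleansing rule fixes a known legacy-data quirk; the verification
# rule checks an invariant that must hold after loading.

def strip_whitespace(row):
    """Legacy fixed-width exports often pad fields with spaces."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}

def default_missing_region(row):
    """Older records predate the region field; backfill a placeholder."""
    if not row.get("region"):
        row["region"] = "UNKNOWN"
    return row

CLEANSING_RULES = [strip_whitespace, default_missing_region]

def cleanse(rows):
    """Run every cleansing rule over every row, in order."""
    for row in rows:
        for rule in CLEANSING_RULES:
            row = rule(row)
        yield row

def verify_totals(source_total, warehouse_rows):
    """History is only trustworthy if the migrated total matches the source."""
    migrated_total = sum(r["amount"] for r in warehouse_rows)
    if round(migrated_total, 2) != round(source_total, 2):
        raise ValueError(
            f"Control total mismatch: source {source_total}, "
            f"warehouse {migrated_total}"
        )

rows = list(cleanse([
    {"doc_no": " INV-001 ", "region": "", "amount": 1250.0},
    {"doc_no": "INV-002", "region": "EMEA", "amount": 300.0},
]))
verify_totals(1550.0, rows)  # passes silently when totals reconcile
print(rows[0])  # {'doc_no': 'INV-001', 'region': 'UNKNOWN', 'amount': 1250.0}
```

The point of expressing cleansing as a list of small rules is exactly what the white paper describes: the rules can be enforced easily, executed automatically, and rerun against both historical and new data.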

This is a solution I’ve proposed a few times, but this white paper makes it much more compelling. Get the white paper and read it for yourself.
