
Salesforce.com has been steadily capturing a growing share of the cloud CRM market.

 

Here are some important tips and tricks for executing a Salesforce data migration project:

 

    1. The objects have to be migrated in the correct sequence; otherwise the data is likely to end up inconsistent. For example, Users have to be migrated first, then Accounts, then Contacts, and so on.
    2. Legacy_ID__c is the key field that links records in the parent (source) Salesforce Org to the new Salesforce Org, so the Legacy ID field has to be chosen carefully. It will usually be the primary key of the object: for the Account object it will be the Account ID, for the User object the User ID.
    3. Data can be migrated in a single full load or as delta loads. For delta loads, the audit fields CreatedDate, LastModifiedDate and SystemModstamp are the best keys on which to base the delta extraction.
    4. When migrating data from multiple Salesforce Orgs, the following checks are worth performing before loading the data:
      1. De-duplication of standard object records. For example, a big company may run two different Salesforce Orgs that sell similar, but not identical, sets of products; a few hundred products may be the same. When moving the data to a new Org, the redundant Product IDs have to be eliminated. Collapsing multiple Product IDs from different Orgs into a single Product ID in the target Org is called data de-duplication. Dependent objects of Product, such as Pricebook, have to be merged in the target Org as well.
      2. Cleansing of bad or old data. Transactional objects such as Case may contain closed Cases that need not be retained beyond a certain period; that data has to be deleted or filtered out while moving to the new Org. Another scenario is a Product that has become defunct: neither its master record nor its transaction records should be migrated. For example, if we were migrating the Salesforce Org of Maruti Udyog to a new Org, most records related to the Maruti 800 or its sub-parts need not be migrated, since Maruti stopped producing the Maruti 800 a long time ago.
    5. A single object can have multiple master-object dependencies. Make sure all the master objects are migrated before the dependent object; otherwise the dependent-object migration is likely to fail and cause unnecessary re-work.
    6. A common show-stopper is special characters extracted into the flat file or staging table that cannot, or need not, be migrated to the target Org. Salesforce itself generates many such characters (similar to newline and tab characters), and they sometimes get extracted as part of the data. Learn to identify these characters and eliminate them as needed before migrating to the target Org.
    7. Migration of BLOB data may pose a challenge. For example, Attachments on EmailMessages and Cases will contain PDF, Excel and Word documents, and by their nature these records take significantly longer to load into Salesforce than normal records. Schedule these loads in nightly batches rather than daytime loads.
    8. For most current installations, Informatica supports only up to v33 of the Salesforce API. This limitation can be overcome by upgrading the Informatica license or using PowerExchange for Salesforce. I faced it in my last project; if the license upgrade will take time, the workaround is to upload the data via Data Loader instead of Informatica. A related article: https://network.informatica.com/thread/81207
    9. For bulk loading of data into the same table, the smart approach is to run multiple sessions of the same mapping. We ran 5 parallel sessions of the same mapping in a single workflow to speed up the upload; this also helps get around Salesforce timeout limits.
    10. In my experience, the Data Loader tool cannot upload data over VPN networks. Uploading directly over the Internet, through the local firewall, makes the upload fast and efficient.
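The ordered migration with a Legacy_ID__c cross-reference (tips 1 and 2 above) can be sketched as follows. This is a minimal in-memory illustration, not a real Salesforce API: the "target org" is a plain dictionary, and the object names and ID formats are made up.

```python
# Minimal sketch: migrate parents before children, stamp each record with
# its legacy ID, and re-point lookup fields at the already-migrated parents.
MIGRATION_ORDER = ["User", "Account", "Contact"]  # parents first

def migrate(extracts):
    """extracts: object name -> list of source records (dicts with 'Id')."""
    target = {obj: [] for obj in MIGRATION_ORDER}  # stand-in for the new Org
    id_map = {}   # legacy source ID -> new target ID
    counter = 0
    for obj in MIGRATION_ORDER:
        for rec in extracts.get(obj, []):
            counter += 1
            new_id = f"NEW-{counter:04d}"          # illustrative ID format
            # Re-point any lookup field (e.g. OwnerId, AccountId) whose
            # value is a parent ID we have already migrated.
            payload = {k: id_map.get(v, v) if k.endswith("Id") and k != "Id" else v
                       for k, v in rec.items()}
            payload["Legacy_ID__c"] = rec["Id"]    # keep the link to the old Org
            payload["Id"] = new_id
            target[obj].append(payload)
            id_map[rec["Id"]] = new_id
    return target, id_map
```

Because Users are processed before Accounts, and Accounts before Contacts, every lookup an object carries can already be resolved through `id_map` when its turn comes.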
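A delta extraction keyed on an audit field (tip 3 above) boils down to a date-filtered query. The sketch below builds an illustrative SOQL string from the timestamp of the previous successful load; the field list is an assumption, and in practice the cutoff would come from a load-control table.

```python
from datetime import datetime, timezone

def delta_query(obj: str, last_run: datetime) -> str:
    """Build a SOQL delta query selecting records modified since last_run.

    last_run should be timezone-aware; SOQL expects UTC timestamps in
    the form YYYY-MM-DDThh:mm:ssZ.
    """
    cutoff = last_run.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return (f"SELECT Id, Name, LastModifiedDate FROM {obj} "
            f"WHERE LastModifiedDate > {cutoff}")
```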
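For the special-character problem (tip 6 above), a simple cleansing pass over extracted field values might look like this. Which characters must actually go depends on the field and the target Org, so treat the pattern below as a starting point, not a definitive rule.

```python
import re

# Non-printable control characters, excluding tab (\x09), LF (\x0a)
# and CR (\x0d), which are handled separately below.
CONTROL_CHARS = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f\x7f]")

def cleanse(value: str) -> str:
    """Strip stray control characters and flatten embedded line breaks."""
    value = CONTROL_CHARS.sub("", value)                # drop bare control chars
    value = value.replace("\r\n", " ")                  # flatten Windows newlines
    value = value.replace("\n", " ").replace("\t", " ") # flatten LF and tabs
    return re.sub(r" {2,}", " ", value).strip()         # collapse space runs
```

Running every extracted text field through such a function before staging avoids surprises at load time.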

I have outlined below a few best-practice tips for data migration, learned from my own projects. The article leans towards SAP data migration, but I have also added perspectives on migrating from non-SAP sources.

Best Practices

1) Check data quality before starting the migration. Make sure the quality is above 95% against end-user requirements for the migration to be successful.
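A quality gate like the one above can be automated with a simple pass/fail check per record. The sketch below is illustrative: the required fields and the 95% threshold are just the assumptions from this tip, and a real check would cover formats and referential integrity too.

```python
def quality_score(records, required=("Name", "Email")):
    """Fraction of records whose required fields are all populated."""
    if not records:
        return 0.0
    good = sum(1 for r in records if all(r.get(f) for f in required))
    return good / len(records)

def ready_to_migrate(records, threshold=0.95):
    """True when the data clears the agreed quality bar."""
    return quality_score(records) >= threshold
```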

2) Progressive data cleansing is a must. Ideally, cleansing should start 3 to 6 months before the actual migration, which gives end users enough time to clean the data. Informatica AddressDoctor is a good tool for cleansing addresses and building a master list of addresses.

3) A situation may arise where data from multiple sites or locations in the source landscape needs to be merged into a single target system. In this case, master data de-duplication has to be performed.
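Master data de-duplication can be sketched as a merge on a normalized natural key. In this illustration the key is the product name, which is an assumption; in practice the matching rules are far richer (fuzzy matching, multiple attributes), and dependent records must be re-pointed at the surviving ID using the returned merge map.

```python
def dedupe(records, key_field="Name"):
    """Keep one survivor per normalized key; map duplicates to survivors."""
    survivors = {}   # normalized key -> surviving record
    merge_map = {}   # duplicate source ID -> surviving source ID
    for rec in records:
        key = rec[key_field].strip().lower()   # naive normalization
        if key in survivors:
            merge_map[rec["Id"]] = survivors[key]["Id"]
        else:
            survivors[key] = rec
    return list(survivors.values()), merge_map
```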

4) A data migration project can also be a good point at which to ask the client to consider introducing a Master Data Management (MDM) module into their landscape. There are several good MDM tools, such as SAP MDM and Informatica MDM, that the client can make use of.

5) When migrating from multiple, diverse data sources such as Oracle, SQL Server and SAP ERP systems, the following points need to be kept in mind before finalizing the migration design:

a) The format, data type and length of the primary keys/control IDs (such as Employee ID for the HR module or Material ID for the MM module) need to be finalized at the very beginning. Consider the hypothetical scenario of a big multinational acquiring two or three small companies and wanting to merge their HR systems: one HR module may be SAP HR, another a custom-built application on Oracle, and the third a custom-built application on SQL Server. Here it is very important to standardize the DB schema and tables of the target HR module with the data present in all three source systems in mind.
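Key standardization across such heterogeneous sources can be sketched as a small mapping function. The source names and ID formats below (SAP with zero-padded numerics, a prefixed Oracle key, a plain-integer SQL Server key) are entirely hypothetical; the point is that every source's native key is converted to one agreed target format up front.

```python
def standardize_id(source: str, raw_id: str) -> str:
    """Map each source system's native employee key to one target format."""
    if source == "sap_hr":
        core = raw_id.lstrip("0")           # SAP pads with leading zeros
    elif source == "oracle_app":
        core = raw_id.split("-", 1)[1]      # e.g. "EMP-1042" -> "1042"
    elif source == "sqlserver_app":
        core = raw_id                       # plain integer key
    else:
        raise ValueError(f"unknown source {source!r}")
    # A source prefix keeps keys unique even when numeric parts collide.
    return f"{source[:3].upper()}-{int(core):08d}"
```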

b) The driving tables and keys in each source system need to be identified and a common platform established before the actual migration starts. The source-to-target matrix (STM) has to be clearly defined during the high-level/detailed design stage.

6) Before uploading data into the target SAP system, it is good practice for the SAP migration team to check the quality, correctness and format of the data. This helps prevent unnecessary iterations of loading the data into the target system again and again.

Data Warehousing Projects

The purpose of a data warehouse is to give leadership enough data and KPIs to make high-level business decisions.

This is achieved by the following:

1)     Generating daily/weekly/monthly reports for C-level executive decision-making.

2)     Generating KPIs and scorecards for visual representation of historical and real-time data.

Data warehouses are typically huge repositories of data gathered from multiple source systems. Implementing one requires a large investment of money, with the end value to management not fully known at the beginning of the project.

Data Migration Projects

Data migration projects move data from one or more source systems to a target system. They have a well-defined end result, so the business will usually be inclined to invest in them. A migration may be needed for one of the following business reasons:

1)     Source system consolidation: multiple source systems might be merged into one.

2)     In a SAP ERP scenario, multiple ERP instances could be merged into a single instance to cut TCO (Total Cost of Ownership).

3)     Some source systems may become redundant or deprecated, so their data has to be merged into another, or a new, source system.

4)      Data could be migrated across database vendors, for example, from SQL Server to Oracle.

5)     Migration from a lower version to a higher one, say from Oracle 9i to Oracle 11g, or from SAP ECC 4.6c to ECC 6.0.

Hi All ,

I have scheduled the test for 'PowerCenter Data Integration 9.x: Administrator Specialist' instead of 'PowerCenter Data Integration 9.x: Developer Specialist'.

 

I need to change it ASAP, as the test is in the next 2 days.

Can anybody kindly help?

 

regards,

Satyendra

Hi All,

 

 

I have successfully cleared the ICS 9.x MDM Developer Specialist certification. Thanks to the group for the info.

Hi All,

 

I successfully completed the ICS 9.x PowerCenter Developer Specialist certification.

 

I am really happy and feel proud to be a part of such an excellent ‘Informatica Community’.

 

Some information that was very useful:

 

1.  ‘Informatica Inventory Skill Set’

2. The PowerCenter Help content

3. Practice the scenarios for each transformation.

There are 70 questions, to be completed within 90 minutes.

 

The question sequence is random: the first question might be from Troubleshooting and the second from Mapping Design Basics. Out of the 70 questions, I would categorize roughly 25 as easy, 35 as tricky and 10 as tough.

 

You can reach me if you need any more clarification about my experience.

 

Thanks,

Vinay Bhukya

Chennai

vinay.lbrc@gmail.com

Ph:9176932567