Salesforce.com has been steadily capturing a larger share of the cloud CRM market. 

 

Here are some important tips and tricks for executing a Salesforce data migration project:

 

    1. The objects have to be migrated in the correct sequence; otherwise the data is likely to end up inconsistent. For example, Users have to be migrated first, then Accounts, then Contacts, and so on.
    2. The Legacy_ID__c field is the key that links records in the source (legacy) Salesforce Org to their copies in the new Org. The Legacy ID value has to be chosen carefully; mostly it will be the primary key of the source object. For example, for the Account object it will be the Account Id, and for the User object the User Id.
    3. Data can be migrated in a single full load or as a series of delta loads. If the data is migrated as delta loads, the audit fields CreatedDate, LastModifiedDate, and SystemModstamp are the best keys on which the delta load can be filtered.
    4. When migrating data from multiple Salesforce Orgs, the following checks are worth performing before loading the data:
      1. De-duplication of standard object records. For example, a large company may run two different Salesforce Orgs that sell similar, but not identical, product sets; a few hundred Products may be the same. In such a scenario, when moving data to a new Org, the redundant Product records have to be eliminated. Collapsing multiple Product Ids from different Orgs into a single Product Id in the target Org is what data de-duplication means here. The dependent objects of Product, such as Pricebook entries, have to be merged in the target Org accordingly.
      2. Cleansing of bad/old data. Transactional objects like Case may contain Closed Cases that need not be retained beyond a certain period of time; that data has to be deleted or filtered out while moving to the new Org. Another scenario is a Product that has become defunct: neither its master record nor its transaction records should be migrated. For example, if we were migrating the CRM/Salesforce Org of Maruti Udyog to a new Org, most of the records related to the Maruti 800 or its sub-parts need not be migrated, since Maruti stopped production of the Maruti 800 a long time ago.
    5. A single object can have multiple master object dependencies. Make sure all the master (parent) objects are migrated before migrating the dependent object. Otherwise, the dependent object migration is likely to fail and lead to unnecessary re-work.
    6. A common show-stopper is special characters getting extracted into the flat file/staging table that cannot, or need not, be migrated to the target Org. The Salesforce product itself generates a lot of special characters (similar to newline and tab characters), and these sometimes come out as part of the extracted data. Learn to identify these characters and eliminate them on a need basis before migrating to the target Org. The link below, which explains how to handle these special characters, will come in handy.
    7. Migration of BLOB data may pose a challenge. For example, the Attachments on EmailMessage and Case records will contain PDF files and Excel and Word documents. Migrating these is challenging because of the nature of these objects: they take significantly longer to load into Salesforce than normal records. Schedule the load of this data in nightly batches rather than daytime loads.
    8. Informatica, in most current installations, has a limitation of supporting only up to v33 of the Salesforce API. This limitation can be overcome by upgrading the Informatica license or using PowerExchange for Salesforce. I faced this limitation in my last project; the workaround, in case upgrading the license will take time, is to upload the data via Data Loader rather than Informatica. A related article that will be helpful: https://network.informatica.com/thread/81207
    9. For bulk loading of data into the same table, the smart way is to run multiple sessions of the same mapping. We ran 5 sessions in parallel for the same mapping within a single workflow to speed up the data upload. This also helps work around the Salesforce timeout limitations.
    10. In my experience, the Data Loader tool cannot upload data over VPN networks. If we instead upload data directly over the Internet, through the local firewall, the upload is fast and efficient.
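
To make some of these tips concrete, here are a few illustrative sketches. First, the load sequencing from tip 1: the safe order can be derived automatically with a topological sort over the object dependency graph. This is a minimal sketch; the `deps` map below is an assumed, simplified dependency set, not a full Salesforce schema.

```python
from collections import defaultdict, deque

def load_order(dependencies):
    """Return an object load order in which every parent (master) object
    comes before any object that looks up to it.
    `dependencies` maps an object name to the objects it depends on."""
    # Kahn's algorithm (topological sort) over the dependency graph.
    indegree = defaultdict(int)
    children = defaultdict(list)
    objects = set(dependencies)
    for obj, parents in dependencies.items():
        objects.update(parents)
        for parent in parents:
            children[parent].append(obj)
            indegree[obj] += 1
    queue = deque(sorted(o for o in objects if indegree[o] == 0))
    order = []
    while queue:
        obj = queue.popleft()
        order.append(obj)
        for child in sorted(children[obj]):
            indegree[child] -= 1
            if indegree[child] == 0:
                queue.append(child)
    if len(order) != len(objects):
        raise ValueError("circular dependency detected")
    return order

# Simplified dependency map (assumed for illustration).
deps = {
    "Account": ["User"],                       # Account owner is a User
    "Contact": ["Account", "User"],
    "Case": ["Contact", "Account", "User"],
}
print(load_order(deps))  # → ['User', 'Account', 'Contact', 'Case']
```

The same idea scales to a real migration: feed in the full lookup/master-detail graph and the sort yields a sequence where no load ever references a parent that does not exist yet.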
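Tip 2's Legacy_ID__c linkage is typically used to rewrite lookup fields: after a parent object lands in the target Org, its new Ids are mapped back to the legacy Ids so child records can be re-pointed. A minimal sketch, assuming records are plain dicts as exported to a staging area:

```python
def build_id_map(target_records, legacy_field="Legacy_ID__c"):
    """Map legacy-org Ids to their new target-org Ids, using the
    Legacy_ID__c field stamped on each migrated record."""
    return {r[legacy_field]: r["Id"] for r in target_records if r.get(legacy_field)}

def resolve_lookup(source_records, lookup_field, id_map):
    """Rewrite a lookup field (e.g. AccountId on Contact) from legacy
    Ids to new-org Ids; records with no match are set aside for review."""
    resolved, unresolved = [], []
    for rec in source_records:
        legacy_value = rec.get(lookup_field)
        if legacy_value in id_map:
            resolved.append({**rec, lookup_field: id_map[legacy_value]})
        else:
            unresolved.append(rec)
    return resolved, unresolved

# Example: re-point Contacts at the Accounts already loaded in the target Org.
accounts_in_target = [{"Id": "001NEW", "Legacy_ID__c": "001OLD"}]
id_map = build_id_map(accounts_in_target)
contacts = [{"Id": "003A", "AccountId": "001OLD"},
            {"Id": "003B", "AccountId": "001GONE"}]
resolved, unresolved = resolve_lookup(contacts, "AccountId", id_map)
```

The `unresolved` bucket is worth keeping: it surfaces exactly the ordering failures tip 5 warns about.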
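For the delta loads in tip 3, the filter is a simple timestamp comparison against the last successful run, keyed on one of the audit fields. A sketch, assuming the Salesforce export timestamp format `2019-05-01T10:15:30.000+0000`:

```python
from datetime import datetime, timezone

AUDIT_FIELDS = ("CreatedDate", "LastModifiedDate", "SystemModstamp")

def parse_sf_timestamp(ts):
    # Salesforce exports timestamps like 2019-05-01T10:15:30.000+0000
    return datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%f%z")

def delta_records(records, last_run, field="SystemModstamp"):
    """Keep only records created or modified after the previous load."""
    return [r for r in records if parse_sf_timestamp(r[field]) > last_run]

last_run = datetime(2019, 5, 1, tzinfo=timezone.utc)
rows = [
    {"Id": "a", "SystemModstamp": "2019-04-30T23:59:59.000+0000"},
    {"Id": "b", "SystemModstamp": "2019-05-02T08:00:00.000+0000"},
]
changed = delta_records(rows, last_run)  # keeps only record "b"
```

SystemModstamp is usually the safest choice of the three, since it also moves when a record is touched by system operations, not just user edits.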
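The Product de-duplication in tip 4.1 boils down to collapsing records from several Orgs onto one business key and remembering which source Ids merged into each survivor, so dependent data (Pricebooks, line items) can be re-pointed. A sketch keyed on ProductCode; the `Source_Org__c` field below is a hypothetical tracking field, not a standard one:

```python
def dedupe_products(orgs_products, key="ProductCode"):
    """Merge product lists from multiple source Orgs into one list,
    keeping the first record seen per key and recording which source
    Ids collapsed into it (needed later for Pricebook remapping)."""
    merged = {}
    collapsed = {}
    for org_name, products in orgs_products.items():
        for prod in products:
            k = prod[key]
            if k not in merged:
                # Source_Org__c is an illustrative tracking field.
                merged[k] = {**prod, "Source_Org__c": org_name}
            collapsed.setdefault(k, []).append((org_name, prod["Id"]))
    return list(merged.values()), collapsed

orgs = {
    "OrgA": [{"Id": "01tA", "ProductCode": "P-100", "Name": "Widget"}],
    "OrgB": [{"Id": "01tB", "ProductCode": "P-100", "Name": "Widget"},
             {"Id": "01tC", "ProductCode": "P-200", "Name": "Gadget"}],
}
merged, collapsed = dedupe_products(orgs)  # 2 unique products survive
```

In practice the matching key may be fuzzier than an exact ProductCode, but the collapse map is the part worth keeping in any variant.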
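The special-character cleanup in tip 6 can be sketched as a small scrubbing pass over each staged value: strip non-printable control characters outright, and optionally flatten embedded newlines/tabs that break flat-file staging formats. The exact character set to remove is an assumption to adjust per project:

```python
import re

# ASCII control characters except tab (\x09), newline (\x0a), CR (\x0d).
CONTROL_CHARS = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f\x7f]")

def clean_value(value, collapse_newlines=False):
    """Remove control characters; optionally flatten embedded
    newlines and tabs so a record stays on one flat-file row."""
    value = CONTROL_CHARS.sub("", value)
    if collapse_newlines:
        value = re.sub(r"[\r\n\t]+", " ", value)
    return value

clean_value("ACME\x00 Corp\x1a")                      # "ACME Corp"
clean_value("line1\r\nline2", collapse_newlines=True)  # "line1 line2"
```

Running this on extract, before load, avoids discovering the bad characters one rejected batch at a time.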
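Finally, the parallel-session idea in tip 9 generalizes beyond Informatica: split the records into batches and push them through several concurrent workers. A minimal sketch; `upload_batch` is a placeholder for whatever call actually performs the insert (an Informatica session, a Data Loader invocation, an API call):

```python
from concurrent.futures import ThreadPoolExecutor

def chunk(records, size):
    """Yield consecutive batches of `size` records."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

def parallel_load(records, upload_batch, sessions=5, batch_size=200):
    """Split records into batches and push them through several
    concurrent sessions, mirroring the 5-parallel-session setup.
    `upload_batch` is the caller-supplied function doing the insert."""
    batches = list(chunk(records, batch_size))
    with ThreadPoolExecutor(max_workers=sessions) as pool:
        # map() preserves batch order in the returned results.
        return list(pool.map(upload_batch, batches))

# Demo with a stub "uploader" that just reports each batch's size.
results = parallel_load(list(range(10)), upload_batch=len,
                        sessions=3, batch_size=3)  # → [3, 3, 3, 1]
```

The worker count is the tuning knob: too few wastes throughput, too many trips Salesforce concurrency and timeout limits, so the 5 used in the project above is a starting point, not a rule.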