Cloud Data Integration : 2013 : April

Guest Blog Written by Daniel Sisson


There are several best practices associated with integration of Salesforce with other applications. The following tips will help you to have a successful implementation project:

  • Consider your key data elements that are necessary for your Salesforce application. These elements might include your customer master, product master, order or transaction information, target market segments or prospect profile types.
  • Determine the source of the data elements. Make sure you consider what source is the “one truth” in your company. For example, should your customer master be your accounts receivable database, or should it be your sales or marketing database? This can be determined by level of accuracy, frequency of updates, or a combination.
  • Review the data management requirements that should be in place with the integration. This could include data replication, business intelligence, analytics, data cleansing or validation practices, and the overall process flow. For example, if someone using Salesforce discovers an error in a customer phone number, how should it be corrected?
  • Use standard integration applications. Whenever possible, avoid writing customized code for your integration. Instead look for powerful products such as Informatica Cloud to build a solid foundation that maintains data integrity while optimizing the integration process.
  • Initiate quality assurance (QA) as a critical component of your integration. Sometimes the source data systems are built around decades of legacy policies and procedures. These older processes could have left gaps in your data integrity. With a good QA process in place, issues can be identified and resolved before they become a problem in your Salesforce integration.
  • Audits and logging are crucial. Maintain a very detailed and robust logging process that includes error notification capabilities and auditing reports. This includes logging for automated jobs as well as user-executed transactions.
  • Review your security policies with all integrated systems to make sure there is consistency and compliance in all areas.


Your Salesforce implementation is a worthwhile investment. Take it to an even higher level of performance with integration to your back office systems while using industry best practices.


Daniel Sisson is Marketing and Business Development Director at Mansa Systems, an SI and ISV that specializes in implementing, integrating, and developing for Salesforce. Follow Mansa: @mansasys

Last week, the Informatica Cloud Team was at Salesforce Customer Company Tour Chicago. Although it was pouring and there were 30 mph winds, there was a good showing at the event - this just shows you how loyal Salesforce customers are! We had a lively reception with our partner, Cloud Sherpas, and CipherCloud.


Next week, the Informatica Cloud Team will be at the Salesforce Customer Company Tour London. We are at Booth #120 - swing by to learn about our Salesforce Cloud Data Management solutions. Since Chicago was such a blast, we will have an encore reception with Cloud Sherpas and CipherCloud. RSVP here.


If you haven't already registered for Salesforce Customer Company Tour London, do so here.


To learn more about Informatica Cloud @ Salesforce Customer Company Tour, visit this page.


See you at an upcoming Salesforce event!

Guest Blog Written By Terry Luschen

Originally posted here:

Informatica Cloud gives you a great platform for building integrations between platforms, databases and applications. I currently work mostly with Salesforce, and Informatica Cloud has a built-in connector to Salesforce that makes it a great choice for me. This connector can be used to extract, update or delete Salesforce data. The data format used is your typical CSV file, so if you are used to working with the Apex Data Loader, the concepts will feel familiar.


With Informatica Cloud you set up ‘Connections’ to your Salesforce environment and to the directory where your csv file will live. From there you create a ‘Data Synchronization Task’ under the ‘Applications’, ‘Data Synchronization’ menu item. If you have a group of tasks that need to run one after the other, you can group them into a ‘Task Flow’, found under the menu item ‘Applications’, ‘Task Flows’.


This is already an amazing amount of functionality out of the box. But what if you need some special functionality, like doing an FTP transfer after you have downloaded some data from Salesforce? Informatica Cloud does not have that built in, but it offers a very easy way to hook in added functionality. All you need to do is build an exe and place it in a path that can be accessed by the ‘Secure Agent’, the local application that does the work for Informatica. All configuration happens in the cloud, but the actual work is done by the ‘Secure Agent’ running on your local machine.




Now that we have our exe, we simply need to add it to the ‘Postprocessing Commands’ text box on the ‘6. Schedule’ tab, under the ‘Advanced Options’ section. An example of a post-processing command might look like this:


C:\InformaticaExes\FileProcessing\FTP_Me.exe C:\InformaticaExes\FTPCredentials.txt C:\InformaticaExes\FTP_File.csv


So in this example, FTP_Me.exe will run with two parameters:

1) The first parameter, C:\InformaticaExes\FTPCredentials.txt, contains the credentials and connection information for the FTP site. You can even encrypt this file, or the strings within it, so it is not plainly viewable on the Windows box.

2) The second parameter, C:\InformaticaExes\FTP_File.csv, is the name of the file to FTP.


Now we are all set!  After our file is downloaded, the exe will be kicked off and our file will be FTPed as needed.
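The post builds this step as a compiled exe; purely as an illustration, here is a minimal Python sketch of what such a post-processing program might do. The key=value layout of FTPCredentials.txt is an assumption — the original does not specify the file's format.

```python
import sys
from ftplib import FTP
from pathlib import Path

def parse_credentials(text):
    """Parse a simple credentials file (assumed format: one
    key=value pair per line, e.g. host=..., user=..., password=...)."""
    creds = {}
    for line in text.splitlines():
        line = line.strip()
        if line and "=" in line:
            key, _, value = line.partition("=")
            creds[key.strip()] = value.strip()
    return creds

def ftp_upload(creds_path, file_path):
    """Log in with the parsed credentials and upload the file."""
    creds = parse_credentials(Path(creds_path).read_text())
    with FTP(creds["host"]) as ftp:
        ftp.login(creds["user"], creds["password"])
        with open(file_path, "rb") as f:
            ftp.storbinary(f"STOR {Path(file_path).name}", f)

# Mirrors the two-argument post-processing command in the post:
#   script <credentials file> <csv file to upload>
if __name__ == "__main__" and len(sys.argv) == 3:
    ftp_upload(sys.argv[1], sys.argv[2])
```

Packaged as an exe (or invoked via the Python interpreter), this slots into the ‘Postprocessing Commands’ box the same way.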


This can easily be extended in many ways. Maybe before the file is FTPed you need to modify the contents of the csv file by removing the headers and deleting the ID column (the first column). Those kinds of changes are easy to do with a .NET exe, and you can add them to your existing exe or to a separate one invoked by another post-processing command. I would suggest keeping each exe to a very specific task; that will make them easier to test and manage.
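The header-and-ID-column cleanup described above is a small, pure transformation; a minimal sketch using the standard csv module might look like:

```python
import csv
import io

def strip_header_and_first_column(csv_text):
    """Drop the header row and the first (ID) column, as described
    in the post, and return the transformed CSV text."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    out = io.StringIO()
    writer = csv.writer(out)
    for row in rows[1:]:          # skip the header row
        writer.writerow(row[1:])  # drop the first (ID) column
    return out.getvalue()

sample = "Id,Name,Phone\r\n001x,Acme,555-1212\r\n"
print(strip_header_and_first_column(sample))  # -> "Acme,555-1212\r\n"
```

Keeping this as its own small step, per the advice above, makes it trivial to unit test in isolation.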


How have you extended Informatica Cloud?  I would love to hear about it!


Let’s get back to integrating our Cloud!


Terry Luschen: Systems Architect / Senior Software Engineer at Sundog. Specializes in Cloud integrations with Salesforce and other technologies. Salesforce Certifications: Advanced Developer, Sales Cloud and Service Cloud. Find me @TerryLuschen

In one of the previous posts we looked at Custom Integration Templates, which allow you to define custom data integration flows. Informatica Cloud also has a powerful REST-based API that allows developers to create Custom Integration tasks and run and monitor them from environments outside the Informatica Cloud UI – for example, a Java-based application. Using Cloud Integration Templates and the API together allows you to create applications in an environment familiar to your users while relying behind the scenes on the Informatica integration platform as a service (iPaaS) to move your data.


Sample App

We will illustrate the iPaaS concept here using a sample “White Label” application we created with Visualforce and Apex code. This application runs on the Salesforce platform and shows a Visualforce UI to the user. Behind the scenes it creates various Informatica objects to implement the cloud data integration use case. There is no interaction with the Informatica Cloud UI; Informatica Cloud is completely configured and controlled via our REST API.


In this example, the use case is to copy data from a Salesforce object, such as Account, Contact or Opportunity, to a different environment. We’ll use a flat file in this example, but it could be any object in any database or application that Informatica Cloud supports. You need to define the field mapping between the source and the target and also configure a filter condition on the data.


Using this sample app, you first log in to the Informatica Cloud integration service. Once the login succeeds you will see the Salesforce and Flat File connections in two lists. From these, select the right connection of each type. Also specify a date to be used as a filter, to select all records created after that date.




As you select a connection of each type the field mapping control below will show fields from the Salesforce Account object and also from the default target file. You can change the object from Account to Contact or Opportunity.


You can drag and drop to map the fields or create expressions, very similar to how you do it on the Informatica Cloud UI.



Then click “Run” to execute this custom data synchronization task. The custom app invokes the Informatica Cloud REST API to create a Custom Integration task for the template and run it, then monitors the task to report its status.




APIs used



First, we use the Login API to log in to Informatica Cloud, providing the user name and password in the request body.


This API call returns a Session ID and a URL that we need to use for all subsequent calls.
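A minimal sketch of that login call in Python follows. The endpoint URL and the icSessionId/serverUrl response field names reflect the v2 REST API; verify them against the API documentation for your account.

```python
import json
from urllib import request

# Assumed v2 login endpoint; confirm against your account's docs.
LOGIN_URL = "https://app.informaticaondemand.com/ma/api/v2/user/login"

def parse_login_response(body):
    """Extract the session id and base URL every later call needs."""
    data = json.loads(body)
    return data["icSessionId"], data["serverUrl"]

def login(username, password):
    """POST the credentials and return (session_id, server_url)."""
    payload = json.dumps({"@type": "login",
                          "username": username,
                          "password": password}).encode()
    req = request.Request(LOGIN_URL, data=payload,
                          headers={"Content-Type": "application/json",
                                   "Accept": "application/json"})
    with request.urlopen(req) as resp:
        return parse_login_response(resp.read().decode())
```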


GET Connections


Next, we invoke the API to get connections. From its response we get all Salesforce connections and all Flat File connections, using the “type” attribute.
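A sketch of that call and the split by type, continuing from the login sketch's conventions (the `/api/v2/connection` path and the type values "Salesforce" and "FlatFile" are assumptions based on the description above):

```python
import json
from urllib import request

def get_connections(server_url, session_id):
    """GET the account's connections; the session id from the
    login call rides along as the icSessionId header."""
    req = request.Request(server_url + "/api/v2/connection",
                          headers={"icSessionId": session_id,
                                   "Accept": "application/json"})
    with request.urlopen(req) as resp:
        return json.loads(resp.read().decode())

def split_by_type(connections):
    """Partition connections into Salesforce and flat-file lists
    using the "type" attribute, as the app's two pick lists do."""
    sfdc = [c for c in connections if c.get("type") == "Salesforce"]
    flat = [c for c in connections if c.get("type") == "FlatFile"]
    return sfdc, flat
```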


Custom Integration Task (“mttask”)


After the user configures the UI and clicks “Run”, we invoke a series of API calls. First we create a Custom Integration Task for the template, with parameter values based on the user's selections.


Then we run the task.


As it runs, we invoke the Activity Log API to check whether the job is complete; when it is, we get the run results.
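That run-and-monitor loop can be sketched as follows. The shape of the activity-log entry and the state names here are assumptions, not the documented response format; the callable stands in for an Activity Log API request.

```python
import time

def wait_for_completion(fetch_entry, poll_seconds=5, timeout=600):
    """Poll the Activity Log until the task reports a terminal state.
    fetch_entry is any callable returning the latest activity-log
    entry for the task as a dict (hypothetical field names)."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        entry = fetch_entry()
        if entry.get("state") in ("COMPLETED", "FAILED"):
            return entry  # the run results come from this entry
        time.sleep(poll_seconds)
    raise TimeoutError("task did not finish within %s seconds" % timeout)
```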


Set up


  1. An Informatica Cloud account is required. Also a Secure Agent should be configured with that account.
  2. The app uses a Cloud Integration Template that reads from a Salesforce object, writes to a flat file target and performs a set of transforms on the way (see below).
  3. This template should be uploaded to the Informatica Cloud account.
  4. At least one connection each should be created in Informatica Cloud for Salesforce and Flat File (the folder where the files are located).


Last week Informatica announced the 2013 Innovation Awards Finalists. The winners will be announced at Informatica World in June.


For Cloud Integration, the finalists are:



For Hybrid IT, the finalists are:


  • Blue Shield CA with Model Metrics
  • Canon Australia


Dan Byrne, Solutions Architect at Canon Australia had this to say about being recognized as a finalist:


“Canon Australia has implemented a hybrid IT integration solution from Informatica which provides us with a complete, near real-time view of information around each business deal, countrywide. This new found ability to integrate all sales-related data across cloud and on-premise environments is allowing Canon to respond more quickly and effectively to the needs of sales prospects and customers, helping them to achieve more.”


Congratulations to this year's finalists! Informatica World is gearing up to be another fantastic event for our customers and partners. We hope to see you there!

I recorded a brief video this week about the Hybrid IT track at Informatica World.


Informatica World 2013 runs from June 4-7 at the Aria Hotel in Las Vegas. Some of the sessions we'll be running in the new Hybrid IT Track include:


We plan to have most of the Informatica Cloud product management team at the event, speaking in sessions and diving into technical details at the newly expanded Hands-On Lab. Many of our top customers and partners are already confirmed for the event.


It's going to be the cloud integration event of the year and a great opportunity to network with industry experts and your peers.


We hope you can join us!


Guest Blog Written by Richard Seroter

Last week I had the pleasure of visiting the Salesforce Developer User Group in San Francisco, where I delivered a presentation called Using the Salesforce Integration APIs. This group meets regularly to discuss timely development topics related to Salesforce applications and the platform. In my presentation, I reviewed the comprehensive set of APIs offered by Salesforce, when to use each, and how software providers like Informatica are using these APIs to create powerful software products.


What kind of APIs does Salesforce provide to developers? First, there are three options for interacting with Salesforce from outside systems.


•    SOAP services. SOAP is the classic web services protocol for exchanging XML messages between systems. The SOAP API is a great fit for quick development or integration with existing tools that know how to invoke SOAP-based services. The SOAP API has a rich set of operations for querying, creating, and modifying data or application settings.


•    REST services. RESTful services are all about using meaningful URIs and a common interface (e.g. HTTP POST, GET, PUT, DELETE) to interact with representations of data resources. The REST API gives developers the ability to interact with standard or custom objects represented as XML or JSON objects. This is a great fit for mobile applications or modern web applications that embrace JSON and prefer the REST-style of web services.


•    Batch processing. Both the SOAP and REST APIs work best with bite-sized messages as opposed to massive data sets. However –  as I’m sure the folks at Informatica can tell you – there are plenty of reasons to transfer large blocks of data between systems! The Batch API lets you upload XML or CSV data files as “jobs” that are processed in parallel on Salesforce servers. Choose this API when performing data migration activities or irregular data synchronizations between systems.
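To make these inbound options concrete, here are minimal Python sketches of a REST query and of the jobInfo XML that opens a Batch (Bulk) API job. The API version v28.0 is a period-appropriate assumption, and the jobInfo shape follows the Bulk API documentation of that era.

```python
import json
from urllib import parse, request

API_VERSION = "v28.0"  # assumption: a 2013-era API version

def build_query_url(instance_url, soql):
    """Compose a REST API query URL for the given SOQL string."""
    return f"{instance_url}/services/data/{API_VERSION}/query?q={parse.quote(soql)}"

def run_query(instance_url, access_token, soql):
    """Run a SOQL query via the REST API and return the records."""
    req = request.Request(build_query_url(instance_url, soql),
                          headers={"Authorization": "Bearer " + access_token})
    with request.urlopen(req) as resp:
        return json.loads(resp.read().decode())["records"]

def build_job_request(operation, sobject, content_type="CSV"):
    """The jobInfo XML that opens a Batch (Bulk) API job; it is
    POSTed to /services/async/<version>/job with the session id
    in the X-SFDC-Session header."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<jobInfo xmlns="http://www.force.com/2009/06/asyncapi/dataload">'
        f"<operation>{operation}</operation>"
        f"<object>{sobject}</object>"
        f"<contentType>{content_type}</contentType>"
        "</jobInfo>"
    )
```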





In addition to the host of inbound integration options above, Salesforce gives developers outbound interfaces for connecting to remote endpoints.


•    Message streaming. One way to detect changes in a system is to constantly poll it and try to find out what’s new. The Streaming API is a push-based channel for sending data events out of Salesforce in real time. Once a developer defines a “PushTopic” that specifies the condition they are interested in, the consuming application can connect to that PushTopic and listen for changes. As soon as that condition is met in the system (e.g. “New Account Added in California”), a message is instantly sent to all subscribers to that topic. This is an intriguing option for organizations that want a real-time feed of information from Salesforce and have applications behind the firewall that can connect to the Streaming API.


•    Outbound Messaging. Much like the Streaming API, Outbound Messaging is a powerful way to do event-driven integration with Salesforce. Whenever a user-created workflow condition is met, a SOAP message is sent to a listening web service. Unlike the Streaming API, which only delivers messages once to online subscribers, the Outbound Messaging infrastructure can queue failed messages and try to deliver them later when the listener is available.
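Both outbound channels can be sketched briefly: the fields a PushTopic record needs, and a tiny listener that returns the Ack envelope an Outbound Messaging endpoint is expected to send back. The handler body beyond the Ack is a hypothetical placeholder.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

def build_pushtopic_record(name, soql, api_version=28.0):
    """The fields a PushTopic record needs; creating one (e.g. via
    the REST sobjects endpoint) activates the streaming channel."""
    return {"Name": name, "Query": soql, "ApiVersion": api_version}

# The post's example condition, expressed as a PushTopic query:
CA_ACCOUNTS = build_pushtopic_record(
    "NewCaliforniaAccounts",
    "SELECT Id, Name FROM Account WHERE BillingState = 'CA'")

# An Outbound Messaging listener answers with this Ack envelope;
# otherwise Salesforce queues the notification and retries later.
ACK = (
    '<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">'
    "<soapenv:Body>"
    '<notificationsResponse xmlns="http://soap.sforce.com/2005/09/outbound">'
    "<Ack>true</Ack>"
    "</notificationsResponse>"
    "</soapenv:Body>"
    "</soapenv:Envelope>"
)

class OutboundMessageHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)  # the SOAP notification XML
        # ... hand the notification off to your own processing here ...
        self.send_response(200)
        self.send_header("Content-Type", "text/xml")
        self.end_headers()
        self.wfile.write(ACK.encode())

# To run: HTTPServer(("", 8080), OutboundMessageHandler).serve_forever()
```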




All of these diverse integration endpoints make it possible for software companies like Informatica to create compelling products that interface with Salesforce applications. In fact, Informatica provides multiple ways to integrate with Salesforce. First, it can act as a listener for Outbound Messages. When using Salesforce as a source in an Informatica Cloud data synchronization task, you can choose to “run this task in real-time” as a response to receiving an Outbound Message. Imagine building an Informatica flow where, as soon as a record changed in Salesforce, the data was instantly transferred to an internal database for processing.


The demonstration of Informatica Cloud that I presented to the User Group took Salesforce data and regularly transferred it to another cloud CRM system, Microsoft Dynamics CRM Online. Because of the handy library of Informatica cloud connectors, it’s a snap to quickly build enterprise-class ETL operations between SaaS systems. In this demonstration, I showed how easy it was to use the Informatica Cloud Data Synchronization Task Wizard to connect to Salesforce, connect to Dynamics CRM Online, filter the data, map the fields between the two systems, and schedule the ETL. After executing the ETL and showing how simple this really was, one User Group attendee jumped on Twitter to proclaim his readiness to try out the platform!




This was a fun group to present to, as they regularly face the challenge of connecting applications to other systems used by their customers. While the APIs provided by Salesforce give developers plenty of choices for integration, using them requires a fair amount of knowledge and custom code. However, tools like Informatica Cloud simplify the process by offering an easy-to-use web interface for creating robust and reliable data integration flows. I’ve been a fan of this platform for a while and even taught a video-on-demand course that highlights its role in a cloud architect’s tool belt. It’s critical to understand the marketplace so that we can make the best possible build vs. buy decision when integrating the cloud!


Richard Seroter is a senior product manager for cloud software company Tier 3, a trainer for Pluralsight, a Microsoft MVP, editor, blogger, author, and speaker. You can find Richard on Twitter as @rseroter.

Written by Sumit Jain of Astadia, originally posted here:

Salesforce to Salesforce is a native feature for sharing data records in real time between two Salesforce environments (orgs). For example, two business partners may want to collaborate by sharing account and opportunity data between their orgs. It is very easy to share the data using the Salesforce to Salesforce feature.


In most scenarios, data is shared manually through the standard user interface. A user creates an account in one org, clicks the external sharing button, selects the appropriate org connection, and shares the record. This involves some manual effort, and when sales reps are working on a lot of records the manual sharing process can become a painful exercise, which may cause user adoption issues in the long run.


Two objects control the Salesforce to Salesforce feature on the backend:


  • PartnerNetworkConnection – Represents a Salesforce to Salesforce connection between Salesforce organizations.
  • PartnerNetworkRecordConnection – Represents a record shared between two Salesforce orgs using Salesforce to Salesforce.

Whenever the user shares a record using Salesforce to Salesforce, a record gets created in PartnerNetworkRecordConnection object.

Informatica Cloud can be used to make this integration between two Salesforce orgs seamless and automatic. The process flow is as follows:

  • User creates the records (which need to be shared) in his org.
  • As soon as the record is created, a workflow is kicked off which sends an outbound message.
  • This outbound message will initiate an Informatica Cloud task.
  • The Informatica Cloud task will insert the record details (created in step 1) into the PartnerNetworkRecordConnection object.
  • As soon as the record gets created in PartnerNetworkRecordConnection, it is shared with the Partner Org.

Thus, the records created in one org are automatically shared to the Partner Org in near real time without any manual intervention.
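The heart of the Informatica Cloud task in step 4 is simply an insert into PartnerNetworkRecordConnection. As a sketch, the rows it builds might look like the following; the field names come from the object descriptions above, and the Ids are made up.

```python
def build_share_records(connection_id, record_ids):
    """Rows for insertion into PartnerNetworkRecordConnection.
    connection_id is the Id of the PartnerNetworkConnection that
    describes the partner org; inserting a row is what actually
    shares the record with that org."""
    return [{"ConnectionId": connection_id, "LocalRecordId": rid}
            for rid in record_ids]

# e.g. share two newly created records with the partner org
# (hypothetical Ids for illustration only):
rows = build_share_records("04P000000000001",
                           ["001000000000001", "001000000000002"])
```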

Overall, the benefits of using Informatica Cloud are:

  • Data Synchronization achieved through the use of out of the box functionality.
  • Centralized integration logic – no need to write custom VF/Apex code to automate the Salesforce to Salesforce feature.
  • Record Selection criteria for sharing can be easily defined using filters.
  • Exception handling of failed records can be set up to monitor the record sharing status.
  • Scalable approach.

So if you are considering implementing the Salesforce to Salesforce feature to share records automatically, consider the Informatica Cloud approach described above.
